Oct 14 13:06:10.603700 master-2 systemd[1]: Starting Kubernetes Kubelet...
Oct 14 13:06:11.181118 master-2 kubenswrapper[4762]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Oct 14 13:06:11.181118 master-2 kubenswrapper[4762]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Oct 14 13:06:11.181118 master-2 kubenswrapper[4762]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Oct 14 13:06:11.181118 master-2 kubenswrapper[4762]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Oct 14 13:06:11.181118 master-2 kubenswrapper[4762]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Oct 14 13:06:11.181118 master-2 kubenswrapper[4762]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Oct 14 13:06:11.182618 master-2 kubenswrapper[4762]: I1014 13:06:11.182122 4762 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Oct 14 13:06:11.185740 master-2 kubenswrapper[4762]: W1014 13:06:11.185699 4762 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 14 13:06:11.185740 master-2 kubenswrapper[4762]: W1014 13:06:11.185723 4762 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 14 13:06:11.185740 master-2 kubenswrapper[4762]: W1014 13:06:11.185730 4762 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 14 13:06:11.185740 master-2 kubenswrapper[4762]: W1014 13:06:11.185735 4762 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 14 13:06:11.185740 master-2 kubenswrapper[4762]: W1014 13:06:11.185742 4762 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 14 13:06:11.185740 master-2 kubenswrapper[4762]: W1014 13:06:11.185747 4762 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 14 13:06:11.185740 master-2 kubenswrapper[4762]: W1014 13:06:11.185753 4762 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 14 13:06:11.186109 master-2 kubenswrapper[4762]: W1014 13:06:11.185761 4762 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
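Every deprecation notice above points at the same remediation: move the value off the kubelet command line and into the KubeletConfiguration file named by --config (here /etc/kubernetes/kubelet.conf, as the flag dump further down shows). Purely as a hand-written illustration, not the file the cluster actually renders for this node, the deprecated flags and their values from this log would map onto config-file fields roughly like this:

# Illustrative sketch only: the deprecated flags seen above expressed as
# KubeletConfiguration fields, using the values visible in this log. The real
# /etc/kubernetes/kubelet.conf is rendered by the cluster and contains far more.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: /var/run/crio/crio.sock              # was --container-runtime-endpoint
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec   # was --volume-plugin-dir
registerWithTaints:                                            # was --register-with-taints
  - key: node-role.kubernetes.io/master
    effect: NoSchedule
systemReserved:                                                # was --system-reserved
  cpu: 500m
  ephemeral-storage: 1Gi
  memory: 1Gi
# --minimum-container-ttl-duration has no direct field; the kubelet suggests
# evictionHard/evictionSoft instead. --pod-infra-container-image is handled on
# the container runtime (CRI) side, as the log message above itself notes.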
Oct 14 13:06:11.186109 master-2 kubenswrapper[4762]: W1014 13:06:11.185768 4762 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 14 13:06:11.186109 master-2 kubenswrapper[4762]: W1014 13:06:11.185775 4762 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 14 13:06:11.186109 master-2 kubenswrapper[4762]: W1014 13:06:11.185780 4762 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 14 13:06:11.186109 master-2 kubenswrapper[4762]: W1014 13:06:11.185786 4762 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 14 13:06:11.186109 master-2 kubenswrapper[4762]: W1014 13:06:11.185791 4762 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 14 13:06:11.186109 master-2 kubenswrapper[4762]: W1014 13:06:11.185797 4762 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 14 13:06:11.186109 master-2 kubenswrapper[4762]: W1014 13:06:11.185802 4762 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 14 13:06:11.186109 master-2 kubenswrapper[4762]: W1014 13:06:11.185808 4762 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 14 13:06:11.186109 master-2 kubenswrapper[4762]: W1014 13:06:11.185813 4762 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 14 13:06:11.186109 master-2 kubenswrapper[4762]: W1014 13:06:11.185826 4762 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 14 13:06:11.186109 master-2 kubenswrapper[4762]: W1014 13:06:11.185833 4762 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 14 13:06:11.186109 master-2 kubenswrapper[4762]: W1014 13:06:11.185838 4762 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 14 13:06:11.186109 master-2 kubenswrapper[4762]: W1014 13:06:11.185844 4762 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 14 13:06:11.186109 master-2 kubenswrapper[4762]: W1014 13:06:11.185849 4762 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 14 13:06:11.186109 master-2 kubenswrapper[4762]: W1014 13:06:11.185854 4762 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 14 13:06:11.186109 master-2 kubenswrapper[4762]: W1014 13:06:11.185860 4762 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 14 13:06:11.186109 master-2 kubenswrapper[4762]: W1014 13:06:11.185865 4762 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 14 13:06:11.186109 master-2 kubenswrapper[4762]: W1014 13:06:11.185870 4762 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 14 13:06:11.187203 master-2 kubenswrapper[4762]: W1014 13:06:11.185876 4762 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 14 13:06:11.187203 master-2 kubenswrapper[4762]: W1014 13:06:11.185881 4762 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 14 13:06:11.187203 master-2 kubenswrapper[4762]: W1014 13:06:11.185886 4762 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 14 13:06:11.187203 master-2 kubenswrapper[4762]: W1014 13:06:11.185892 4762 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 14 13:06:11.187203 master-2 kubenswrapper[4762]: W1014 13:06:11.185897 4762 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 14 13:06:11.187203 master-2 kubenswrapper[4762]: W1014 13:06:11.185904 4762 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 14 13:06:11.187203 master-2 kubenswrapper[4762]: W1014 13:06:11.185910 4762 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 14 13:06:11.187203 master-2 kubenswrapper[4762]: W1014 13:06:11.185915 4762 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 14 13:06:11.187203 master-2 kubenswrapper[4762]: W1014 13:06:11.185921 4762 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 14 13:06:11.187203 master-2 kubenswrapper[4762]: W1014 13:06:11.185927 4762 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 14 13:06:11.187203 master-2 kubenswrapper[4762]: W1014 13:06:11.185932 4762 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 14 13:06:11.187203 master-2 kubenswrapper[4762]: W1014 13:06:11.185938 4762 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 14 13:06:11.187203 master-2 kubenswrapper[4762]: W1014 13:06:11.185944 4762 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 14 13:06:11.187203 master-2 kubenswrapper[4762]: W1014 13:06:11.185950 4762 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 14 13:06:11.187203 master-2 kubenswrapper[4762]: W1014 13:06:11.185955 4762 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 14 13:06:11.187203 master-2 kubenswrapper[4762]: W1014 13:06:11.185961 4762 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 14 13:06:11.187203 master-2 kubenswrapper[4762]: W1014 13:06:11.185969 4762 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 14 13:06:11.187203 master-2 kubenswrapper[4762]: W1014 13:06:11.185976 4762 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 14 13:06:11.187203 master-2 kubenswrapper[4762]: W1014 13:06:11.185981 4762 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 14 13:06:11.188104 master-2 kubenswrapper[4762]: W1014 13:06:11.185987 4762 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 14 13:06:11.188104 master-2 kubenswrapper[4762]: W1014 13:06:11.185993 4762 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 14 13:06:11.188104 master-2 kubenswrapper[4762]: W1014 13:06:11.185998 4762 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 14 13:06:11.188104 master-2 kubenswrapper[4762]: W1014 13:06:11.186003 4762 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 14 13:06:11.188104 master-2 kubenswrapper[4762]: W1014 13:06:11.186009 4762 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Oct 14 13:06:11.188104 master-2 kubenswrapper[4762]: W1014 13:06:11.186014 4762 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 14 13:06:11.188104 master-2 kubenswrapper[4762]: W1014 13:06:11.186019 4762 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 14 13:06:11.188104 master-2 kubenswrapper[4762]: W1014 13:06:11.186024 4762 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 14 13:06:11.188104 master-2 kubenswrapper[4762]: W1014 13:06:11.186029 4762 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 14 13:06:11.188104 master-2 kubenswrapper[4762]: W1014 13:06:11.186035 4762 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 14 13:06:11.188104 master-2 kubenswrapper[4762]: W1014 13:06:11.186040 4762 feature_gate.go:330] unrecognized feature gate: Example
Oct 14 13:06:11.188104 master-2 kubenswrapper[4762]: W1014 13:06:11.186045 4762 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 14 13:06:11.188104 master-2 kubenswrapper[4762]: W1014 13:06:11.186052 4762 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 14 13:06:11.188104 master-2 kubenswrapper[4762]: W1014 13:06:11.186058 4762 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 14 13:06:11.188104 master-2 kubenswrapper[4762]: W1014 13:06:11.186063 4762 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 14 13:06:11.188104 master-2 kubenswrapper[4762]: W1014 13:06:11.186070 4762 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 14 13:06:11.188104 master-2 kubenswrapper[4762]: W1014 13:06:11.186076 4762 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 14 13:06:11.188104 master-2 kubenswrapper[4762]: W1014 13:06:11.186081 4762 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 14 13:06:11.188104 master-2 kubenswrapper[4762]: W1014 13:06:11.186086 4762 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 14 13:06:11.188104 master-2 kubenswrapper[4762]: W1014 13:06:11.186091 4762 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 14 13:06:11.189068 master-2 kubenswrapper[4762]: W1014 13:06:11.186097 4762 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 14 13:06:11.189068 master-2 kubenswrapper[4762]: W1014 13:06:11.186101 4762 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 14 13:06:11.189068 master-2 kubenswrapper[4762]: W1014 13:06:11.186108 4762 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 14 13:06:11.189068 master-2 kubenswrapper[4762]: W1014 13:06:11.186113 4762 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 14 13:06:11.189068 master-2 kubenswrapper[4762]: W1014 13:06:11.186119 4762 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 14 13:06:11.189068 master-2 kubenswrapper[4762]: W1014 13:06:11.186125 4762 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 14 13:06:11.189068 master-2 kubenswrapper[4762]: W1014 13:06:11.186131 4762 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 14 13:06:11.189068 master-2 kubenswrapper[4762]: I1014 13:06:11.187107 4762 flags.go:64] FLAG: --address="0.0.0.0"
Oct 14 13:06:11.189068 master-2 kubenswrapper[4762]: I1014 13:06:11.187122 4762 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Oct 14 13:06:11.189068 master-2 kubenswrapper[4762]: I1014 13:06:11.187134 4762 flags.go:64] FLAG: --anonymous-auth="true"
Oct 14 13:06:11.189068 master-2 kubenswrapper[4762]: I1014 13:06:11.187143 4762 flags.go:64] FLAG: --application-metrics-count-limit="100"
Oct 14 13:06:11.189068 master-2 kubenswrapper[4762]: I1014 13:06:11.187170 4762 flags.go:64] FLAG: --authentication-token-webhook="false"
Oct 14 13:06:11.189068 master-2 kubenswrapper[4762]: I1014 13:06:11.187177 4762 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Oct 14 13:06:11.189068 master-2 kubenswrapper[4762]: I1014 13:06:11.187186 4762 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Oct 14 13:06:11.189068 master-2 kubenswrapper[4762]: I1014 13:06:11.187194 4762 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Oct 14 13:06:11.189068 master-2 kubenswrapper[4762]: I1014 13:06:11.187200 4762 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Oct 14 13:06:11.189068 master-2 kubenswrapper[4762]: I1014 13:06:11.187207 4762 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Oct 14 13:06:11.189068 master-2 kubenswrapper[4762]: I1014 13:06:11.187215 4762 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Oct 14 13:06:11.189068 master-2 kubenswrapper[4762]: I1014 13:06:11.187222 4762 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Oct 14 13:06:11.189068 master-2 kubenswrapper[4762]: I1014 13:06:11.187229 4762 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Oct 14 13:06:11.189068 master-2 kubenswrapper[4762]: I1014 13:06:11.187235 4762 flags.go:64] FLAG: --cgroup-root=""
Oct 14 13:06:11.189068 master-2 kubenswrapper[4762]: I1014 13:06:11.187242 4762 flags.go:64] FLAG: --cgroups-per-qos="true"
Oct 14 13:06:11.189068 master-2 kubenswrapper[4762]: I1014 13:06:11.187248 4762 flags.go:64] FLAG: --client-ca-file=""
Oct 14 13:06:11.190401 master-2 kubenswrapper[4762]: I1014 13:06:11.187255 4762 flags.go:64] FLAG: --cloud-config=""
Oct 14 13:06:11.190401 master-2 kubenswrapper[4762]: I1014 13:06:11.187261 4762 flags.go:64] FLAG: --cloud-provider=""
Oct 14 13:06:11.190401 master-2 kubenswrapper[4762]: I1014 13:06:11.187267 4762 flags.go:64] FLAG: --cluster-dns="[]"
Oct 14 13:06:11.190401 master-2 kubenswrapper[4762]: I1014 13:06:11.187275 4762 flags.go:64] FLAG: --cluster-domain=""
Oct 14 13:06:11.190401 master-2 kubenswrapper[4762]: I1014 13:06:11.187281 4762 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Oct 14 13:06:11.190401 master-2 kubenswrapper[4762]: I1014 13:06:11.187288 4762 flags.go:64] FLAG: --config-dir=""
Oct 14 13:06:11.190401 master-2 kubenswrapper[4762]: I1014 13:06:11.187294 4762 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Oct 14 13:06:11.190401 master-2 kubenswrapper[4762]: I1014 13:06:11.187301 4762 flags.go:64] FLAG: --container-log-max-files="5"
Oct 14 13:06:11.190401 master-2 kubenswrapper[4762]: I1014 13:06:11.187309 4762 flags.go:64] FLAG: --container-log-max-size="10Mi"
Oct 14 13:06:11.190401 master-2 kubenswrapper[4762]: I1014 13:06:11.187315 4762 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Oct 14 13:06:11.190401 master-2 kubenswrapper[4762]: I1014 13:06:11.187322 4762 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Oct 14 13:06:11.190401 master-2 kubenswrapper[4762]: I1014 13:06:11.187329 4762 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Oct 14 13:06:11.190401 master-2 kubenswrapper[4762]: I1014 13:06:11.187335 4762 flags.go:64] FLAG: --contention-profiling="false"
Oct 14 13:06:11.190401 master-2 kubenswrapper[4762]: I1014 13:06:11.187341 4762 flags.go:64] FLAG: --cpu-cfs-quota="true"
Oct 14 13:06:11.190401 master-2 kubenswrapper[4762]: I1014 13:06:11.187350 4762 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Oct 14 13:06:11.190401 master-2 kubenswrapper[4762]: I1014 13:06:11.187358 4762 flags.go:64] FLAG: --cpu-manager-policy="none"
Oct 14 13:06:11.190401 master-2 kubenswrapper[4762]: I1014 13:06:11.187364 4762 flags.go:64] FLAG: --cpu-manager-policy-options=""
Oct 14 13:06:11.190401 master-2 kubenswrapper[4762]: I1014 13:06:11.187372 4762 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Oct 14 13:06:11.190401 master-2 kubenswrapper[4762]: I1014 13:06:11.187379 4762 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Oct 14 13:06:11.190401 master-2 kubenswrapper[4762]: I1014 13:06:11.187385 4762 flags.go:64] FLAG: --enable-debugging-handlers="true"
Oct 14 13:06:11.190401 master-2 kubenswrapper[4762]: I1014 13:06:11.187391 4762 flags.go:64] FLAG: --enable-load-reader="false"
Oct 14 13:06:11.190401 master-2 kubenswrapper[4762]: I1014 13:06:11.187397 4762 flags.go:64] FLAG: --enable-server="true"
Oct 14 13:06:11.190401 master-2 kubenswrapper[4762]: I1014 13:06:11.187403 4762 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Oct 14 13:06:11.190401 master-2 kubenswrapper[4762]: I1014 13:06:11.187412 4762 flags.go:64] FLAG: --event-burst="100"
Oct 14 13:06:11.190401 master-2 kubenswrapper[4762]: I1014 13:06:11.187419 4762 flags.go:64] FLAG: --event-qps="50"
Oct 14 13:06:11.191861 master-2 kubenswrapper[4762]: I1014 13:06:11.187425 4762 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Oct 14 13:06:11.191861 master-2 kubenswrapper[4762]: I1014 13:06:11.187432 4762 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Oct 14 13:06:11.191861 master-2 kubenswrapper[4762]: I1014 13:06:11.187438 4762 flags.go:64] FLAG: --eviction-hard=""
Oct 14 13:06:11.191861 master-2 kubenswrapper[4762]: I1014 13:06:11.187445 4762 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Oct 14 13:06:11.191861 master-2 kubenswrapper[4762]: I1014 13:06:11.187452 4762 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Oct 14 13:06:11.191861 master-2 kubenswrapper[4762]: I1014 13:06:11.187458 4762 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Oct 14 13:06:11.191861 master-2 kubenswrapper[4762]: I1014 13:06:11.187464 4762 flags.go:64] FLAG: --eviction-soft=""
Oct 14 13:06:11.191861 master-2 kubenswrapper[4762]: I1014 13:06:11.187470 4762 flags.go:64] FLAG: --eviction-soft-grace-period=""
Oct 14 13:06:11.191861 master-2 kubenswrapper[4762]: I1014 13:06:11.187476 4762 flags.go:64] FLAG: --exit-on-lock-contention="false"
Oct 14 13:06:11.191861 master-2 kubenswrapper[4762]: I1014 13:06:11.187482 4762 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Oct 14 13:06:11.191861 master-2 kubenswrapper[4762]: I1014 13:06:11.187489 4762 flags.go:64] FLAG: --experimental-mounter-path=""
Oct 14 13:06:11.191861 master-2 kubenswrapper[4762]: I1014 13:06:11.187495 4762 flags.go:64] FLAG: --fail-cgroupv1="false"
Oct 14 13:06:11.191861 master-2 kubenswrapper[4762]: I1014 13:06:11.187501 4762 flags.go:64] FLAG: --fail-swap-on="true"
Oct 14 13:06:11.191861 master-2 kubenswrapper[4762]: I1014 13:06:11.187507 4762 flags.go:64] FLAG: --feature-gates=""
Oct 14 13:06:11.191861 master-2 kubenswrapper[4762]: I1014 13:06:11.187515 4762 flags.go:64] FLAG: --file-check-frequency="20s"
Oct 14 13:06:11.191861 master-2 kubenswrapper[4762]: I1014 13:06:11.187521 4762 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Oct 14 13:06:11.191861 master-2 kubenswrapper[4762]: I1014 13:06:11.187528 4762 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Oct 14 13:06:11.191861 master-2 kubenswrapper[4762]: I1014 13:06:11.187535 4762 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Oct 14 13:06:11.191861 master-2 kubenswrapper[4762]: I1014 13:06:11.187541 4762 flags.go:64] FLAG: --healthz-port="10248"
Oct 14 13:06:11.191861 master-2 kubenswrapper[4762]: I1014 13:06:11.187547 4762 flags.go:64] FLAG: --help="false"
Oct 14 13:06:11.191861 master-2 kubenswrapper[4762]: I1014 13:06:11.187554 4762 flags.go:64] FLAG: --hostname-override=""
Oct 14 13:06:11.191861 master-2 kubenswrapper[4762]: I1014 13:06:11.187561 4762 flags.go:64] FLAG: --housekeeping-interval="10s"
Oct 14 13:06:11.191861 master-2 kubenswrapper[4762]: I1014 13:06:11.187569 4762 flags.go:64] FLAG: --http-check-frequency="20s"
Oct 14 13:06:11.191861 master-2 kubenswrapper[4762]: I1014 13:06:11.187575 4762 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Oct 14 13:06:11.191861 master-2 kubenswrapper[4762]: I1014 13:06:11.187581 4762 flags.go:64] FLAG: --image-credential-provider-config=""
Oct 14 13:06:11.193496 master-2 kubenswrapper[4762]: I1014 13:06:11.187588 4762 flags.go:64] FLAG: --image-gc-high-threshold="85"
Oct 14 13:06:11.193496 master-2 kubenswrapper[4762]: I1014 13:06:11.187595 4762 flags.go:64] FLAG: --image-gc-low-threshold="80"
Oct 14 13:06:11.193496 master-2 kubenswrapper[4762]: I1014 13:06:11.187602 4762 flags.go:64] FLAG: --image-service-endpoint=""
Oct 14 13:06:11.193496 master-2 kubenswrapper[4762]: I1014 13:06:11.187608 4762 flags.go:64] FLAG: --kernel-memcg-notification="false"
Oct 14 13:06:11.193496 master-2 kubenswrapper[4762]: I1014 13:06:11.187614 4762 flags.go:64] FLAG: --kube-api-burst="100"
Oct 14 13:06:11.193496 master-2 kubenswrapper[4762]: I1014 13:06:11.187620 4762 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Oct 14 13:06:11.193496 master-2 kubenswrapper[4762]: I1014 13:06:11.187628 4762 flags.go:64] FLAG: --kube-api-qps="50"
Oct 14 13:06:11.193496 master-2 kubenswrapper[4762]: I1014 13:06:11.187634 4762 flags.go:64] FLAG: --kube-reserved=""
Oct 14 13:06:11.193496 master-2 kubenswrapper[4762]: I1014 13:06:11.187641 4762 flags.go:64] FLAG: --kube-reserved-cgroup=""
Oct 14 13:06:11.193496 master-2 kubenswrapper[4762]: I1014 13:06:11.187646 4762 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Oct 14 13:06:11.193496 master-2 kubenswrapper[4762]: I1014 13:06:11.187653 4762 flags.go:64] FLAG: --kubelet-cgroups=""
Oct 14 13:06:11.193496 master-2 kubenswrapper[4762]: I1014 13:06:11.187659 4762 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Oct 14 13:06:11.193496 master-2 kubenswrapper[4762]: I1014 13:06:11.187665 4762 flags.go:64] FLAG: --lock-file=""
Oct 14 13:06:11.193496 master-2 kubenswrapper[4762]: I1014 13:06:11.187671 4762 flags.go:64] FLAG: --log-cadvisor-usage="false"
Oct 14 13:06:11.193496 master-2 kubenswrapper[4762]: I1014 13:06:11.187677 4762 flags.go:64] FLAG: --log-flush-frequency="5s"
Oct 14 13:06:11.193496 master-2 kubenswrapper[4762]: I1014 13:06:11.187684 4762 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Oct 14 13:06:11.193496 master-2 kubenswrapper[4762]: I1014 13:06:11.187693 4762 flags.go:64] FLAG: --log-json-split-stream="false"
Oct 14 13:06:11.193496 master-2 kubenswrapper[4762]: I1014 13:06:11.187699 4762 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Oct 14 13:06:11.193496 master-2 kubenswrapper[4762]: I1014 13:06:11.187706 4762 flags.go:64] FLAG: --log-text-split-stream="false"
Oct 14 13:06:11.193496 master-2 kubenswrapper[4762]: I1014 13:06:11.187712 4762 flags.go:64] FLAG: --logging-format="text"
Oct 14 13:06:11.193496 master-2 kubenswrapper[4762]: I1014 13:06:11.187718 4762 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Oct 14 13:06:11.193496 master-2 kubenswrapper[4762]: I1014 13:06:11.187725 4762 flags.go:64] FLAG: --make-iptables-util-chains="true"
Oct 14 13:06:11.193496 master-2 kubenswrapper[4762]: I1014 13:06:11.187731 4762 flags.go:64] FLAG: --manifest-url=""
Oct 14 13:06:11.193496 master-2 kubenswrapper[4762]: I1014 13:06:11.187738 4762 flags.go:64] FLAG: --manifest-url-header=""
Oct 14 13:06:11.193496 master-2 kubenswrapper[4762]: I1014 13:06:11.187746 4762 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Oct 14 13:06:11.195131 master-2 kubenswrapper[4762]: I1014 13:06:11.187752 4762 flags.go:64] FLAG: --max-open-files="1000000"
Oct 14 13:06:11.195131 master-2 kubenswrapper[4762]: I1014 13:06:11.187760 4762 flags.go:64] FLAG: --max-pods="110"
Oct 14 13:06:11.195131 master-2 kubenswrapper[4762]: I1014 13:06:11.187767 4762 flags.go:64] FLAG: --maximum-dead-containers="-1"
Oct 14 13:06:11.195131 master-2 kubenswrapper[4762]: I1014 13:06:11.187773 4762 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Oct 14 13:06:11.195131 master-2 kubenswrapper[4762]: I1014 13:06:11.187780 4762 flags.go:64] FLAG: --memory-manager-policy="None"
Oct 14 13:06:11.195131 master-2 kubenswrapper[4762]: I1014 13:06:11.187786 4762 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Oct 14 13:06:11.195131 master-2 kubenswrapper[4762]: I1014 13:06:11.187792 4762 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Oct 14 13:06:11.195131 master-2 kubenswrapper[4762]: I1014 13:06:11.187799 4762 flags.go:64] FLAG: --node-ip="192.168.34.12"
Oct 14 13:06:11.195131 master-2 kubenswrapper[4762]: I1014 13:06:11.187805 4762 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Oct 14 13:06:11.195131 master-2 kubenswrapper[4762]: I1014 13:06:11.187820 4762 flags.go:64] FLAG: --node-status-max-images="50"
Oct 14 13:06:11.195131 master-2 kubenswrapper[4762]: I1014 13:06:11.187826 4762 flags.go:64] FLAG: --node-status-update-frequency="10s"
Oct 14 13:06:11.195131 master-2 kubenswrapper[4762]: I1014 13:06:11.187832 4762 flags.go:64] FLAG: --oom-score-adj="-999"
Oct 14 13:06:11.195131 master-2 kubenswrapper[4762]: I1014 13:06:11.187840 4762 flags.go:64] FLAG: --pod-cidr=""
Oct 14 13:06:11.195131 master-2 kubenswrapper[4762]: I1014 13:06:11.187846 4762 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2d66b9dbe1d071d7372c477a78835fb65b48ea82db00d23e9086af5cfcb194ad"
Oct 14 13:06:11.195131 master-2 kubenswrapper[4762]: I1014 13:06:11.187856 4762 flags.go:64] FLAG: --pod-manifest-path=""
Oct 14 13:06:11.195131 master-2 kubenswrapper[4762]: I1014 13:06:11.187862 4762 flags.go:64] FLAG: --pod-max-pids="-1"
Oct 14 13:06:11.195131 master-2 kubenswrapper[4762]: I1014 13:06:11.187869 4762 flags.go:64] FLAG: --pods-per-core="0"
Oct 14 13:06:11.195131 master-2 kubenswrapper[4762]: I1014 13:06:11.187875 4762 flags.go:64] FLAG: --port="10250"
Oct 14 13:06:11.195131 master-2 kubenswrapper[4762]: I1014 13:06:11.187882 4762 flags.go:64] FLAG: --protect-kernel-defaults="false"
Oct 14 13:06:11.195131 master-2 kubenswrapper[4762]: I1014 13:06:11.187889 4762 flags.go:64] FLAG: --provider-id=""
Oct 14 13:06:11.195131 master-2 kubenswrapper[4762]: I1014 13:06:11.187895 4762 flags.go:64] FLAG: --qos-reserved=""
Oct 14 13:06:11.195131 master-2 kubenswrapper[4762]: I1014 13:06:11.187901 4762 flags.go:64] FLAG: --read-only-port="10255"
Oct 14 13:06:11.195131 master-2 kubenswrapper[4762]: I1014 13:06:11.187907 4762 flags.go:64] FLAG: --register-node="true"
Oct 14 13:06:11.195131 master-2 kubenswrapper[4762]: I1014 13:06:11.187914 4762 flags.go:64] FLAG: --register-schedulable="true"
Oct 14 13:06:11.196768 master-2 kubenswrapper[4762]: I1014 13:06:11.187920 4762 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Oct 14 13:06:11.196768 master-2 kubenswrapper[4762]: I1014 13:06:11.187931 4762 flags.go:64] FLAG: --registry-burst="10"
Oct 14 13:06:11.196768 master-2 kubenswrapper[4762]: I1014 13:06:11.187937 4762 flags.go:64] FLAG: --registry-qps="5"
Oct 14 13:06:11.196768 master-2 kubenswrapper[4762]: I1014 13:06:11.187944 4762 flags.go:64] FLAG: --reserved-cpus=""
Oct 14 13:06:11.196768 master-2 kubenswrapper[4762]: I1014 13:06:11.187950 4762 flags.go:64] FLAG: --reserved-memory=""
Oct 14 13:06:11.196768 master-2 kubenswrapper[4762]: I1014 13:06:11.187958 4762 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Oct 14 13:06:11.196768 master-2 kubenswrapper[4762]: I1014 13:06:11.187965 4762 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Oct 14 13:06:11.196768 master-2 kubenswrapper[4762]: I1014 13:06:11.187971 4762 flags.go:64] FLAG: --rotate-certificates="false"
Oct 14 13:06:11.196768 master-2 kubenswrapper[4762]: I1014 13:06:11.187978 4762 flags.go:64] FLAG: --rotate-server-certificates="false"
Oct 14 13:06:11.196768 master-2 kubenswrapper[4762]: I1014 13:06:11.187984 4762 flags.go:64] FLAG: --runonce="false"
Oct 14 13:06:11.196768 master-2 kubenswrapper[4762]: I1014 13:06:11.187991 4762 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Oct 14 13:06:11.196768 master-2 kubenswrapper[4762]: I1014 13:06:11.187997 4762 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Oct 14 13:06:11.196768 master-2 kubenswrapper[4762]: I1014 13:06:11.188004 4762 flags.go:64] FLAG: --seccomp-default="false"
Oct 14 13:06:11.196768 master-2 kubenswrapper[4762]: I1014 13:06:11.188010 4762 flags.go:64] FLAG: --serialize-image-pulls="true"
Oct 14 13:06:11.196768 master-2 kubenswrapper[4762]: I1014 13:06:11.188016 4762 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Oct 14 13:06:11.196768 master-2 kubenswrapper[4762]: I1014 13:06:11.188023 4762 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Oct 14 13:06:11.196768 master-2 kubenswrapper[4762]: I1014 13:06:11.188029 4762 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Oct 14 13:06:11.196768 master-2 kubenswrapper[4762]: I1014 13:06:11.188036 4762 flags.go:64] FLAG: --storage-driver-password="root"
Oct 14 13:06:11.196768 master-2 kubenswrapper[4762]: I1014 13:06:11.188043 4762 flags.go:64] FLAG: --storage-driver-secure="false"
Oct 14 13:06:11.196768 master-2 kubenswrapper[4762]: I1014 13:06:11.188049 4762 flags.go:64] FLAG: --storage-driver-table="stats"
Oct 14 13:06:11.196768 master-2 kubenswrapper[4762]: I1014 13:06:11.188055 4762 flags.go:64] FLAG: --storage-driver-user="root"
Oct 14 13:06:11.196768 master-2 kubenswrapper[4762]: I1014 13:06:11.188062 4762 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Oct 14 13:06:11.196768 master-2 kubenswrapper[4762]: I1014 13:06:11.188069 4762 flags.go:64] FLAG: --sync-frequency="1m0s"
Oct 14 13:06:11.196768 master-2 kubenswrapper[4762]: I1014 13:06:11.188075 4762 flags.go:64] FLAG: --system-cgroups=""
Oct 14 13:06:11.196768 master-2 kubenswrapper[4762]: I1014 13:06:11.188082 4762 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Oct 14 13:06:11.198476 master-2 kubenswrapper[4762]: I1014 13:06:11.188092 4762 flags.go:64] FLAG: --system-reserved-cgroup=""
Oct 14 13:06:11.198476 master-2 kubenswrapper[4762]: I1014 13:06:11.188098 4762 flags.go:64] FLAG: --tls-cert-file=""
Oct 14 13:06:11.198476 master-2 kubenswrapper[4762]: I1014 13:06:11.188104 4762 flags.go:64] FLAG: --tls-cipher-suites="[]"
Oct 14 13:06:11.198476 master-2 kubenswrapper[4762]: I1014 13:06:11.188112 4762 flags.go:64] FLAG: --tls-min-version=""
Oct 14 13:06:11.198476 master-2 kubenswrapper[4762]: I1014 13:06:11.188118 4762 flags.go:64] FLAG: --tls-private-key-file=""
Oct 14 13:06:11.198476 master-2 kubenswrapper[4762]: I1014 13:06:11.188125 4762 flags.go:64] FLAG: --topology-manager-policy="none"
Oct 14 13:06:11.198476 master-2 kubenswrapper[4762]: I1014 13:06:11.188131 4762 flags.go:64] FLAG: --topology-manager-policy-options=""
Oct 14 13:06:11.198476 master-2 kubenswrapper[4762]: I1014 13:06:11.188138 4762 flags.go:64] FLAG: --topology-manager-scope="container"
Oct 14 13:06:11.198476 master-2 kubenswrapper[4762]: I1014 13:06:11.188144 4762 flags.go:64] FLAG: --v="2"
Oct 14 13:06:11.198476 master-2 kubenswrapper[4762]: I1014 13:06:11.188168 4762 flags.go:64] FLAG: --version="false"
Oct 14 13:06:11.198476 master-2 kubenswrapper[4762]: I1014 13:06:11.188176 4762 flags.go:64] FLAG: --vmodule=""
Oct 14 13:06:11.198476 master-2 kubenswrapper[4762]: I1014 13:06:11.188188 4762 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Oct 14 13:06:11.198476 master-2 kubenswrapper[4762]: I1014 13:06:11.188195 4762 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Oct 14 13:06:11.198476 master-2 kubenswrapper[4762]: W1014 13:06:11.188361 4762 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 14 13:06:11.198476 master-2 kubenswrapper[4762]: W1014 13:06:11.188368 4762 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 14 13:06:11.198476 master-2 kubenswrapper[4762]: W1014 13:06:11.188375 4762 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 14 13:06:11.198476 master-2 kubenswrapper[4762]: W1014 13:06:11.188381 4762 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 14 13:06:11.198476 master-2 kubenswrapper[4762]: W1014 13:06:11.188387 4762 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 14 13:06:11.198476 master-2 kubenswrapper[4762]: W1014 13:06:11.188393 4762 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 14 13:06:11.198476 master-2 kubenswrapper[4762]: W1014 13:06:11.188398 4762 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Oct 14 13:06:11.198476 master-2 kubenswrapper[4762]: W1014 13:06:11.188406 4762 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 14 13:06:11.198476 master-2 kubenswrapper[4762]: W1014 13:06:11.188414 4762 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 14 13:06:11.200099 master-2 kubenswrapper[4762]: W1014 13:06:11.188420 4762 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 14 13:06:11.200099 master-2 kubenswrapper[4762]: W1014 13:06:11.188425 4762 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 14 13:06:11.200099 master-2 kubenswrapper[4762]: W1014 13:06:11.188432 4762 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 14 13:06:11.200099 master-2 kubenswrapper[4762]: W1014 13:06:11.188437 4762 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 14 13:06:11.200099 master-2 kubenswrapper[4762]: W1014 13:06:11.188443 4762 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 14 13:06:11.200099 master-2 kubenswrapper[4762]: W1014 13:06:11.188448 4762 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 14 13:06:11.200099 master-2 kubenswrapper[4762]: W1014 13:06:11.188458 4762 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 14 13:06:11.200099 master-2 kubenswrapper[4762]: W1014 13:06:11.188463 4762 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 14 13:06:11.200099 master-2 kubenswrapper[4762]: W1014 13:06:11.188468 4762 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 14 13:06:11.200099 master-2 kubenswrapper[4762]: W1014 13:06:11.188473 4762 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 14 13:06:11.200099 master-2 kubenswrapper[4762]: W1014 13:06:11.188479 4762 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 14 13:06:11.200099 master-2 kubenswrapper[4762]: W1014 13:06:11.188484 4762 feature_gate.go:330] unrecognized feature gate: Example
Oct 14 13:06:11.200099 master-2 kubenswrapper[4762]: W1014 13:06:11.188489 4762 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 14 13:06:11.200099 master-2 kubenswrapper[4762]: W1014 13:06:11.188495 4762 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 14 13:06:11.200099 master-2 kubenswrapper[4762]: W1014 13:06:11.188500 4762 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 14 13:06:11.200099 master-2 kubenswrapper[4762]: W1014 13:06:11.188506 4762 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 14 13:06:11.200099 master-2 kubenswrapper[4762]: W1014 13:06:11.188511 4762 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 14 13:06:11.200099 master-2 kubenswrapper[4762]: W1014 13:06:11.188518 4762 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 14 13:06:11.200099 master-2 kubenswrapper[4762]: W1014 13:06:11.188524 4762 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 14 13:06:11.200099 master-2 kubenswrapper[4762]: W1014 13:06:11.188530 4762 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 14 13:06:11.201151 master-2 kubenswrapper[4762]: W1014 13:06:11.188536 4762 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 14 13:06:11.201151 master-2 kubenswrapper[4762]: W1014 13:06:11.188541 4762 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 14 13:06:11.201151 master-2 kubenswrapper[4762]: W1014 13:06:11.188547 4762 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 14 13:06:11.201151 master-2 kubenswrapper[4762]: W1014 13:06:11.188552 4762 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 14 13:06:11.201151 master-2 kubenswrapper[4762]: W1014 13:06:11.188557 4762 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 14 13:06:11.201151 master-2 kubenswrapper[4762]: W1014 13:06:11.188562 4762 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 14 13:06:11.201151 master-2 kubenswrapper[4762]: W1014 13:06:11.188568 4762 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 14 13:06:11.201151 master-2 kubenswrapper[4762]: W1014 13:06:11.188573 4762 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 14 13:06:11.201151 master-2 kubenswrapper[4762]: W1014 13:06:11.188578 4762 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 14 13:06:11.201151 master-2 kubenswrapper[4762]: W1014 13:06:11.188583 4762 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 14 13:06:11.201151 master-2 kubenswrapper[4762]: W1014 13:06:11.188589 4762 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 14 13:06:11.201151 master-2 kubenswrapper[4762]: W1014 13:06:11.188594 4762 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 14 13:06:11.201151 master-2 kubenswrapper[4762]: W1014 13:06:11.188600 4762 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 14 13:06:11.201151 master-2 kubenswrapper[4762]: W1014 13:06:11.188605 4762 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 14 13:06:11.201151 master-2 kubenswrapper[4762]: W1014 13:06:11.188610 4762 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 14 13:06:11.201151 master-2 kubenswrapper[4762]: W1014 13:06:11.188615 4762 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 14 13:06:11.201151 master-2 kubenswrapper[4762]: W1014 13:06:11.188621 4762 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 14 13:06:11.201151 master-2 kubenswrapper[4762]: W1014 13:06:11.188628 4762 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 14 13:06:11.201151 master-2 kubenswrapper[4762]: W1014 13:06:11.188636 4762 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 14 13:06:11.202112 master-2 kubenswrapper[4762]: W1014 13:06:11.188641 4762 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 14 13:06:11.202112 master-2 kubenswrapper[4762]: W1014 13:06:11.188646 4762 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 14 13:06:11.202112 master-2 kubenswrapper[4762]: W1014 13:06:11.188651 4762 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 14 13:06:11.202112 master-2 kubenswrapper[4762]: W1014 13:06:11.188657 4762 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 14 13:06:11.202112 master-2 kubenswrapper[4762]: W1014 13:06:11.188663 4762 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 14 13:06:11.202112 master-2 kubenswrapper[4762]: W1014 13:06:11.188670 4762 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 14 13:06:11.202112 master-2 kubenswrapper[4762]: W1014 13:06:11.188676 4762 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 14 13:06:11.202112 master-2 kubenswrapper[4762]: W1014 13:06:11.188683 4762 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 14 13:06:11.202112 master-2 kubenswrapper[4762]: W1014 13:06:11.188688 4762 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 14 13:06:11.202112 master-2 kubenswrapper[4762]: W1014 13:06:11.188694 4762 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 14 13:06:11.202112 master-2 kubenswrapper[4762]: W1014 13:06:11.188700 4762 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 14 13:06:11.202112 master-2 kubenswrapper[4762]: W1014 13:06:11.188705 4762 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 14 13:06:11.202112 master-2 kubenswrapper[4762]: W1014 13:06:11.188711 4762 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 14 13:06:11.202112 master-2 kubenswrapper[4762]: W1014 13:06:11.188716 4762 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 14 13:06:11.202112 master-2 kubenswrapper[4762]: W1014 13:06:11.188722 4762 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 14 13:06:11.202112 master-2 kubenswrapper[4762]: W1014 13:06:11.188727 4762 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 14 13:06:11.202112 master-2 kubenswrapper[4762]: W1014 13:06:11.188732 4762 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 14 13:06:11.202112 master-2 kubenswrapper[4762]: W1014 13:06:11.188737 4762 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 14 13:06:11.202112 master-2 kubenswrapper[4762]: W1014 13:06:11.188743 4762 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 14 13:06:11.202112 master-2 kubenswrapper[4762]: W1014 13:06:11.188748 4762 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 14 13:06:11.203148 master-2 kubenswrapper[4762]: W1014 13:06:11.188753 4762 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 14 13:06:11.203148 master-2 kubenswrapper[4762]: W1014 13:06:11.188758 4762 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 14 13:06:11.203148 master-2 kubenswrapper[4762]: W1014 13:06:11.188763 4762 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 14 13:06:11.203148 master-2 kubenswrapper[4762]: W1014 13:06:11.188769 4762 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 14 13:06:11.203148 master-2 kubenswrapper[4762]: I1014 13:06:11.188785 4762 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 14 13:06:11.204897 master-2 kubenswrapper[4762]: I1014 13:06:11.204831 4762 server.go:491] "Kubelet version" kubeletVersion="v1.31.13"
Oct 14 13:06:11.204897 master-2 kubenswrapper[4762]: I1014 13:06:11.204879 4762 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Oct 14 13:06:11.205038 master-2 kubenswrapper[4762]: W1014 13:06:11.204959 4762 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 14 13:06:11.205038 master-2 kubenswrapper[4762]: W1014 13:06:11.204969 4762 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 14 13:06:11.205038 master-2 kubenswrapper[4762]: W1014 13:06:11.204974 4762 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 14 13:06:11.205038 master-2 kubenswrapper[4762]: W1014 13:06:11.204978 4762 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 14 13:06:11.205038 master-2 kubenswrapper[4762]: W1014 13:06:11.204982 4762 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 14 13:06:11.205038 master-2 kubenswrapper[4762]: W1014 13:06:11.204986 4762 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 14 13:06:11.205038 master-2 kubenswrapper[4762]: W1014 13:06:11.204990 4762 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 14 13:06:11.205038 master-2 kubenswrapper[4762]: W1014 13:06:11.204995 4762 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 14 13:06:11.205038 master-2 kubenswrapper[4762]: W1014 13:06:11.205000 4762 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 14 13:06:11.205038 master-2 kubenswrapper[4762]: W1014 13:06:11.205004 4762 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 14 13:06:11.205038 master-2 kubenswrapper[4762]: W1014 13:06:11.205008 4762 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 14 13:06:11.205038 master-2 kubenswrapper[4762]: W1014 13:06:11.205012 4762 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
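The feature_gate.go:386 summary above shows the gate map the kubelet (v1.31.13) actually ends up with: the OpenShift-specific names are discarded with "unrecognized feature gate" warnings, and only upstream Kubernetes gates such as CloudDualStackNodeIPs, DisableKubeletCloudCredentialProviders, KMSv1 and ValidatingAdmissionPolicy remain. Because --feature-gates is empty in the flag dump, these values presumably arrive through the config file named by --config. Purely as a hand-written illustration (the real file is rendered by the cluster, and OpenShift feature gates are normally managed through the cluster-scoped FeatureGate resource rather than by editing kubelet.conf), the recognized gates would be expressed in a KubeletConfiguration stanza like this:

# Illustrative sketch of a featureGates stanza matching the summary above;
# not the actual rendered /etc/kubernetes/kubelet.conf.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
featureGates:
  CloudDualStackNodeIPs: true
  DisableKubeletCloudCredentialProviders: true
  DynamicResourceAllocation: false
  EventedPLEG: false
  KMSv1: true
  MaxUnavailableStatefulSet: false
  NodeSwap: false
  ProcMountType: false
  RouteExternalCertificate: false
  ServiceAccountTokenNodeBinding: false
  TranslateStreamCloseWebsocketRequests: false
  UserNamespacesPodSecurityStandards: false
  UserNamespacesSupport: false
  ValidatingAdmissionPolicy: true
  VolumeAttributesClass: false

The file on the node evidently also carries the OpenShift-only gate names that triggered the warnings above; as the log shows, the kubelet warns about each of those and otherwise ignores them.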
Oct 14 13:06:11.205038 master-2 kubenswrapper[4762]: W1014 13:06:11.205017 4762 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 14 13:06:11.205038 master-2 kubenswrapper[4762]: W1014 13:06:11.205022 4762 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 14 13:06:11.205038 master-2 kubenswrapper[4762]: W1014 13:06:11.205029 4762 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 14 13:06:11.205038 master-2 kubenswrapper[4762]: W1014 13:06:11.205033 4762 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 14 13:06:11.205038 master-2 kubenswrapper[4762]: W1014 13:06:11.205037 4762 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 14 13:06:11.205038 master-2 kubenswrapper[4762]: W1014 13:06:11.205041 4762 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 14 13:06:11.205038 master-2 kubenswrapper[4762]: W1014 13:06:11.205047 4762 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 14 13:06:11.206030 master-2 kubenswrapper[4762]: W1014 13:06:11.205051 4762 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 14 13:06:11.206030 master-2 kubenswrapper[4762]: W1014 13:06:11.205055 4762 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 14 13:06:11.206030 master-2 kubenswrapper[4762]: W1014 13:06:11.205059 4762 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 14 13:06:11.206030 master-2 kubenswrapper[4762]: W1014 13:06:11.205064 4762 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 14 13:06:11.206030 master-2 kubenswrapper[4762]: W1014 13:06:11.205069 4762 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 14 13:06:11.206030 master-2 kubenswrapper[4762]: W1014 13:06:11.205073 4762 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 14 13:06:11.206030 master-2 kubenswrapper[4762]: W1014 13:06:11.205078 4762 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 14 13:06:11.206030 master-2 kubenswrapper[4762]: W1014 13:06:11.205083 4762 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 14 13:06:11.206030 master-2 kubenswrapper[4762]: W1014 13:06:11.205087 4762 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 14 13:06:11.206030 master-2 kubenswrapper[4762]: W1014 13:06:11.205090 4762 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 14 13:06:11.206030 master-2 kubenswrapper[4762]: W1014 13:06:11.205094 4762 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 14 13:06:11.206030 master-2 kubenswrapper[4762]: W1014 13:06:11.205099 4762 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 14 13:06:11.206030 master-2 kubenswrapper[4762]: W1014 13:06:11.205103 4762 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 14 13:06:11.206030 master-2 kubenswrapper[4762]: W1014 13:06:11.205107 4762 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 14 13:06:11.206030 master-2 kubenswrapper[4762]: W1014 13:06:11.205111 4762 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 14 13:06:11.206030 master-2 kubenswrapper[4762]: W1014 13:06:11.205115 4762 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 14 13:06:11.206030 master-2 kubenswrapper[4762]: W1014 13:06:11.205119 4762 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 14 13:06:11.206030 master-2 kubenswrapper[4762]: W1014 13:06:11.205122 4762 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 14 13:06:11.206030 master-2 kubenswrapper[4762]: W1014 13:06:11.205126 4762 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 14 13:06:11.206030 master-2 kubenswrapper[4762]: W1014 13:06:11.205129 4762 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 14 13:06:11.207324 master-2 kubenswrapper[4762]: W1014 13:06:11.205133 4762 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 14 13:06:11.207324 master-2 kubenswrapper[4762]: W1014 13:06:11.205137 4762 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 14 13:06:11.207324 master-2 kubenswrapper[4762]: W1014 13:06:11.205141 4762 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 14 13:06:11.207324 master-2 kubenswrapper[4762]: W1014 13:06:11.205144 4762 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 14 13:06:11.207324 master-2 kubenswrapper[4762]: W1014 13:06:11.205148 4762 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 14 13:06:11.207324 master-2 kubenswrapper[4762]: W1014 13:06:11.205163 4762 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 14 13:06:11.207324 master-2 kubenswrapper[4762]: W1014 13:06:11.205167 4762 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 14 13:06:11.207324 master-2 kubenswrapper[4762]: W1014 13:06:11.205172 4762 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 14 13:06:11.207324 master-2 kubenswrapper[4762]: W1014 13:06:11.205176 4762 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 14 13:06:11.207324 master-2 kubenswrapper[4762]: W1014 13:06:11.205179 4762 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 14 13:06:11.207324 master-2 kubenswrapper[4762]: W1014 13:06:11.205183 4762 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 14 13:06:11.207324 master-2 kubenswrapper[4762]: W1014 13:06:11.205187 4762 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 14 13:06:11.207324 master-2 kubenswrapper[4762]: W1014 13:06:11.205193 4762 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 14 13:06:11.207324 master-2 kubenswrapper[4762]: W1014 13:06:11.205198 4762 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 14 13:06:11.207324 master-2 kubenswrapper[4762]: W1014 13:06:11.205202 4762 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 14 13:06:11.207324 master-2 kubenswrapper[4762]: W1014 13:06:11.205206 4762 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 14 13:06:11.207324 master-2 kubenswrapper[4762]: W1014 13:06:11.205210 4762 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 14 13:06:11.207324 master-2 kubenswrapper[4762]: W1014 13:06:11.205213 4762 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 14 13:06:11.207324 master-2 kubenswrapper[4762]: W1014 13:06:11.205217 4762 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 14 13:06:11.207324 master-2 kubenswrapper[4762]: W1014 13:06:11.205220 4762 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 14 13:06:11.208608 master-2 kubenswrapper[4762]: W1014 13:06:11.205224 4762 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 14 13:06:11.208608 master-2 kubenswrapper[4762]: W1014 13:06:11.205227 4762 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 14 13:06:11.208608 master-2 kubenswrapper[4762]: W1014 13:06:11.205231 4762 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 14 13:06:11.208608 master-2 kubenswrapper[4762]: W1014 13:06:11.205234 4762 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 14 13:06:11.208608 master-2 kubenswrapper[4762]: W1014 13:06:11.205238 4762 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 14 13:06:11.208608 master-2 kubenswrapper[4762]: W1014 13:06:11.205241 4762 feature_gate.go:330] unrecognized feature gate: Example
Oct 14 13:06:11.208608 master-2 kubenswrapper[4762]: W1014 13:06:11.205245 4762 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 14 13:06:11.208608 master-2 kubenswrapper[4762]: W1014 13:06:11.205248 4762 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 14 13:06:11.208608 master-2 kubenswrapper[4762]: W1014 13:06:11.205251 4762 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 14 13:06:11.208608 master-2 kubenswrapper[4762]: W1014 13:06:11.205255 4762 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 14 13:06:11.208608 master-2 kubenswrapper[4762]: W1014 13:06:11.205258 4762 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Oct 14 13:06:11.208608 master-2 kubenswrapper[4762]: W1014 13:06:11.205262 4762 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 14 13:06:11.208608 master-2 kubenswrapper[4762]: W1014 13:06:11.205265 4762 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 14 13:06:11.208608 master-2 kubenswrapper[4762]: I1014 13:06:11.205272 4762 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 14 13:06:11.208608 master-2 kubenswrapper[4762]: W1014 13:06:11.205384 4762 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 14 13:06:11.208608 master-2 kubenswrapper[4762]: W1014 13:06:11.205391 4762 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 14 13:06:11.209437 master-2 kubenswrapper[4762]: W1014 13:06:11.205396 4762 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 14 13:06:11.209437 master-2 kubenswrapper[4762]: W1014 13:06:11.205399 4762 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 14 13:06:11.209437 master-2 kubenswrapper[4762]: W1014 13:06:11.205403 4762 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 14 13:06:11.209437 master-2 kubenswrapper[4762]: W1014 13:06:11.205406 4762 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 14 13:06:11.209437 master-2 kubenswrapper[4762]: W1014 13:06:11.205410 4762 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 14 13:06:11.209437 master-2 kubenswrapper[4762]: W1014 13:06:11.205413 4762 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 14 13:06:11.209437 master-2 kubenswrapper[4762]: W1014 13:06:11.205417 4762 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 14 13:06:11.209437 master-2 kubenswrapper[4762]: W1014 13:06:11.205420 4762 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 14 13:06:11.209437 master-2 kubenswrapper[4762]: W1014 13:06:11.205424 4762 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 14 13:06:11.209437 master-2 kubenswrapper[4762]: W1014 13:06:11.205428 4762 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Oct 14 13:06:11.209437 master-2 kubenswrapper[4762]: W1014 13:06:11.205432 4762 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 14 13:06:11.209437 master-2 kubenswrapper[4762]: W1014 13:06:11.205436 4762 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 14 13:06:11.209437 master-2 kubenswrapper[4762]: W1014 13:06:11.205440 4762 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 14 13:06:11.209437 master-2 kubenswrapper[4762]: W1014 13:06:11.205443 4762 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 14 13:06:11.209437 master-2 kubenswrapper[4762]: W1014 13:06:11.205447 4762 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 14 13:06:11.209437 master-2 kubenswrapper[4762]: W1014 13:06:11.205452 4762 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 14 13:06:11.209437 master-2 kubenswrapper[4762]: W1014 13:06:11.205457 4762 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 14 13:06:11.209437 master-2 kubenswrapper[4762]: W1014 13:06:11.205461 4762 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 14 13:06:11.209437 master-2 kubenswrapper[4762]: W1014 13:06:11.205465 4762 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 14 13:06:11.210800 master-2 kubenswrapper[4762]: W1014 13:06:11.205469 4762 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 14 13:06:11.210800 master-2 kubenswrapper[4762]: W1014 13:06:11.205473 4762 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 14 13:06:11.210800 master-2 kubenswrapper[4762]: W1014 13:06:11.205478 4762 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 14 13:06:11.210800 master-2 kubenswrapper[4762]: W1014 13:06:11.205482 4762 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 14 13:06:11.210800 master-2 kubenswrapper[4762]: W1014 13:06:11.205486 4762 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 14 13:06:11.210800 master-2 kubenswrapper[4762]: W1014 13:06:11.205490 4762 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 14 13:06:11.210800 master-2 kubenswrapper[4762]: W1014 13:06:11.205493 4762 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 14 13:06:11.210800 master-2 kubenswrapper[4762]: W1014 13:06:11.205497 4762 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 14 13:06:11.210800 master-2 kubenswrapper[4762]: W1014 13:06:11.205500 4762 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 14 13:06:11.210800 master-2 kubenswrapper[4762]: W1014 13:06:11.205505 4762 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 14 13:06:11.210800 master-2 kubenswrapper[4762]: W1014 13:06:11.205509 4762 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 14 13:06:11.210800 master-2 kubenswrapper[4762]: W1014 13:06:11.205513 4762 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 14 13:06:11.210800 master-2 kubenswrapper[4762]: W1014 13:06:11.205516 4762 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 14 13:06:11.210800 master-2 kubenswrapper[4762]: W1014 13:06:11.205520 4762 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 14 13:06:11.210800 master-2 kubenswrapper[4762]: W1014 13:06:11.205524 4762 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 14 13:06:11.210800 master-2 kubenswrapper[4762]: W1014 13:06:11.205527 4762 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 14 13:06:11.210800 master-2 kubenswrapper[4762]: W1014 13:06:11.205530 4762 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 14 13:06:11.210800 master-2 kubenswrapper[4762]: W1014 13:06:11.205534 4762 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 14 13:06:11.210800 master-2 kubenswrapper[4762]: W1014 13:06:11.205537 4762 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 14 13:06:11.212092 master-2 kubenswrapper[4762]: W1014 13:06:11.205541 4762 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 14 13:06:11.212092 master-2 kubenswrapper[4762]: W1014 13:06:11.205544 4762 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 14 13:06:11.212092 master-2 kubenswrapper[4762]: W1014 13:06:11.205548 4762 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 14 13:06:11.212092 master-2 kubenswrapper[4762]: W1014 13:06:11.205552 4762 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 14 13:06:11.212092 master-2 kubenswrapper[4762]: W1014 13:06:11.205556 4762 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 14 13:06:11.212092 master-2 kubenswrapper[4762]: W1014 13:06:11.205559 4762 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 14 13:06:11.212092 master-2 kubenswrapper[4762]: W1014 13:06:11.205563 4762 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 14 13:06:11.212092 master-2 kubenswrapper[4762]: W1014 13:06:11.205566 4762 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 14 13:06:11.212092 master-2 kubenswrapper[4762]: W1014 13:06:11.205570 4762 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 14 13:06:11.212092 master-2 kubenswrapper[4762]: W1014 13:06:11.205575 4762 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 14 13:06:11.212092 master-2 kubenswrapper[4762]: W1014 13:06:11.205580 4762 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 14 13:06:11.212092 master-2 kubenswrapper[4762]: W1014 13:06:11.205584 4762 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 14 13:06:11.212092 master-2 kubenswrapper[4762]: W1014 13:06:11.205587 4762 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 14 13:06:11.212092 master-2 kubenswrapper[4762]: W1014 13:06:11.205591 4762 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 14 13:06:11.212092 master-2 kubenswrapper[4762]: W1014 13:06:11.205595 4762 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 14 13:06:11.212092 master-2 kubenswrapper[4762]: W1014 13:06:11.205598 4762 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 14 13:06:11.212092 master-2 kubenswrapper[4762]: W1014 13:06:11.205601 4762 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 14 13:06:11.212092 master-2 kubenswrapper[4762]: W1014 13:06:11.205605 4762 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 14 13:06:11.212092 master-2 kubenswrapper[4762]: W1014 13:06:11.205608 4762 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 14 13:06:11.212092 master-2 kubenswrapper[4762]: W1014 13:06:11.205612 4762 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 14 13:06:11.213356 master-2 kubenswrapper[4762]: W1014 13:06:11.205615 4762 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 14 13:06:11.213356 master-2 kubenswrapper[4762]: W1014 13:06:11.205619 4762 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 14 13:06:11.213356 master-2 kubenswrapper[4762]: W1014 13:06:11.205622 4762 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 14 13:06:11.213356 master-2 kubenswrapper[4762]: W1014 13:06:11.205626 4762 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 14 13:06:11.213356 master-2 kubenswrapper[4762]: W1014 13:06:11.205629 4762 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 14 13:06:11.213356 master-2 kubenswrapper[4762]: W1014 13:06:11.205634 4762 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 14 13:06:11.213356 master-2 kubenswrapper[4762]: W1014 13:06:11.205638 4762 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 14 13:06:11.213356 master-2 kubenswrapper[4762]: W1014 13:06:11.205641 4762 feature_gate.go:330] unrecognized feature gate: Example Oct 14 13:06:11.213356 master-2 kubenswrapper[4762]: W1014 13:06:11.205645 4762 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 14 13:06:11.213356 master-2 kubenswrapper[4762]: W1014 13:06:11.205649 4762 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 14 13:06:11.213356 master-2 kubenswrapper[4762]: W1014 13:06:11.205652 4762 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 14 13:06:11.213356 master-2 kubenswrapper[4762]: W1014 13:06:11.205656 4762 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 14 13:06:11.213356 master-2 kubenswrapper[4762]: I1014 13:06:11.205661 4762 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false 
NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 14 13:06:11.213356 master-2 kubenswrapper[4762]: I1014 13:06:11.205825 4762 server.go:940] "Client rotation is on, will bootstrap in background"
Oct 14 13:06:11.213356 master-2 kubenswrapper[4762]: I1014 13:06:11.209381 4762 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Oct 14 13:06:11.213356 master-2 kubenswrapper[4762]: I1014 13:06:11.211327 4762 server.go:997] "Starting client certificate rotation"
Oct 14 13:06:11.214468 master-2 kubenswrapper[4762]: I1014 13:06:11.211345 4762 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Oct 14 13:06:11.214468 master-2 kubenswrapper[4762]: I1014 13:06:11.211670 4762 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Oct 14 13:06:11.239141 master-2 kubenswrapper[4762]: I1014 13:06:11.239061 4762 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Oct 14 13:06:11.248899 master-2 kubenswrapper[4762]: I1014 13:06:11.248766 4762 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Oct 14 13:06:11.271186 master-2 kubenswrapper[4762]: I1014 13:06:11.271089 4762 log.go:25] "Validated CRI v1 runtime API"
Oct 14 13:06:11.273621 master-2 kubenswrapper[4762]: I1014 13:06:11.273557 4762 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Oct 14 13:06:11.279358 master-2 kubenswrapper[4762]: I1014 13:06:11.279182 4762 log.go:25] "Validated CRI v1 image API"
Oct 14 13:06:11.280878 master-2 kubenswrapper[4762]: I1014 13:06:11.280859 4762 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Oct 14 13:06:11.292523 master-2 kubenswrapper[4762]: I1014 13:06:11.292465 4762 fs.go:135] Filesystem UUIDs: map[5196c3ac-2731-46b5-86e0-b62adcecb7ff:/dev/vda3 7B77-95E7:/dev/vda2 910678ff-f77e-4a7d-8d53-86f2ac47a823:/dev/vda4]
Oct 14 13:06:11.292605 master-2 kubenswrapper[4762]: I1014 13:06:11.292518 4762 fs.go:136] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0}]
Oct 14 13:06:11.350931 master-2 kubenswrapper[4762]: I1014 13:06:11.350486 4762 manager.go:217] Machine: {Timestamp:2025-10-14 13:06:11.325060057 +0000 UTC m=+0.569219296 CPUVendorID:AuthenticAMD NumCores:16 NumPhysicalCores:1 NumSockets:16 CpuFrequency:2800000 MemoryCapacity:50514153472 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:fab6fb24f75f48138ae89582a4ef234b SystemUUID:fab6fb24-f75f-4813-8ae8-9582a4ef234b BootID:b87ad051-e486-4755-b84d-19fcde80c8c4 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:25257074688 Type:vfs Inodes:6166278 HasInodes:true}
{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:10102833152 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:25257078784 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none} 252:16:{Name:vdb Major:252 Minor:16 Size:21474836480 Scheduler:none} 252:32:{Name:vdc Major:252 Minor:32 Size:21474836480 Scheduler:none} 252:48:{Name:vdd Major:252 Minor:48 Size:21474836480 Scheduler:none} 252:64:{Name:vde Major:252 Minor:64 Size:21474836480 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:2f:6f:3a Speed:0 Mtu:9000} {Name:eth0 MacAddress:fa:16:3e:2f:6f:3a Speed:-1 Mtu:9000} {Name:eth1 MacAddress:fa:16:3e:b8:da:c3 Speed:-1 Mtu:9000} {Name:eth2 MacAddress:fa:16:3e:ac:81:74 Speed:-1 Mtu:9000} {Name:ovs-system MacAddress:12:95:41:84:51:61 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:50514153472 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[12] Caches:[{Id:12 Size:32768 Type:Data Level:1} {Id:12 Size:32768 Type:Instruction Level:1} {Id:12 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:12 Size:16777216 Type:Unified Level:3}] SocketID:12 BookID: DrawerID:} {Id:0 Threads:[13] Caches:[{Id:13 Size:32768 Type:Data Level:1} {Id:13 Size:32768 Type:Instruction Level:1} {Id:13 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:13 Size:16777216 Type:Unified Level:3}] SocketID:13 BookID: DrawerID:} {Id:0 Threads:[14] Caches:[{Id:14 Size:32768 Type:Data Level:1} {Id:14 Size:32768 Type:Instruction Level:1} {Id:14 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:14 Size:16777216 Type:Unified Level:3}] SocketID:14 BookID: DrawerID:} {Id:0 Threads:[15] Caches:[{Id:15 Size:32768 Type:Data Level:1} {Id:15 Size:32768 Type:Instruction Level:1} {Id:15 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:15 Size:16777216 Type:Unified Level:3}] SocketID:15 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 
BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Oct 14 13:06:11.350931 master-2 kubenswrapper[4762]: I1014 13:06:11.350874 4762 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Oct 14 13:06:11.351225 master-2 kubenswrapper[4762]: I1014 13:06:11.351076 4762 manager.go:233] Version: {KernelVersion:5.14.0-427.91.1.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202509241235-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Oct 14 13:06:11.352252 master-2 kubenswrapper[4762]: I1014 13:06:11.352148 4762 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Oct 14 13:06:11.352680 master-2 kubenswrapper[4762]: I1014 13:06:11.352613 4762 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Oct 14 13:06:11.353177 master-2 kubenswrapper[4762]: I1014 13:06:11.352701 4762 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"master-2","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Oct 14 13:06:11.353269 master-2 kubenswrapper[4762]: I1014 13:06:11.353193 4762 topology_manager.go:138] "Creating topology manager with none policy" Oct 14 13:06:11.353269 master-2 kubenswrapper[4762]: I1014 13:06:11.353222 4762 container_manager_linux.go:303] "Creating device plugin manager" Oct 14 13:06:11.353359 master-2 kubenswrapper[4762]: I1014 13:06:11.353257 4762 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 14 13:06:11.353359 master-2 kubenswrapper[4762]: I1014 13:06:11.353329 4762 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 14 13:06:11.355677 master-2 kubenswrapper[4762]: I1014 13:06:11.355638 4762 state_mem.go:36] "Initialized new in-memory state store" Oct 14 13:06:11.357042 master-2 kubenswrapper[4762]: I1014 13:06:11.357003 4762 server.go:1245] "Using root directory" path="/var/lib/kubelet" Oct 14 13:06:11.362175 master-2 kubenswrapper[4762]: I1014 13:06:11.362133 4762 kubelet.go:418] "Attempting to sync node with API server" Oct 14 13:06:11.362232 master-2 kubenswrapper[4762]: I1014 13:06:11.362179 4762 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Oct 14 13:06:11.362269 master-2 kubenswrapper[4762]: I1014 13:06:11.362258 4762 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Oct 14 13:06:11.362324 master-2 kubenswrapper[4762]: I1014 13:06:11.362275 4762 kubelet.go:324] "Adding apiserver pod source" Oct 14 13:06:11.362324 master-2 kubenswrapper[4762]: I1014 13:06:11.362305 4762 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Oct 14 13:06:11.374096 master-2 kubenswrapper[4762]: I1014 13:06:11.374032 4762 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.12-3.rhaos4.18.gitdc59c78.el9" apiVersion="v1" Oct 14 13:06:11.378647 master-2 
kubenswrapper[4762]: I1014 13:06:11.378613 4762 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Oct 14 13:06:11.378856 master-2 kubenswrapper[4762]: I1014 13:06:11.378830 4762 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Oct 14 13:06:11.378856 master-2 kubenswrapper[4762]: I1014 13:06:11.378856 4762 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Oct 14 13:06:11.378995 master-2 kubenswrapper[4762]: I1014 13:06:11.378868 4762 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Oct 14 13:06:11.378995 master-2 kubenswrapper[4762]: I1014 13:06:11.378877 4762 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Oct 14 13:06:11.378995 master-2 kubenswrapper[4762]: I1014 13:06:11.378890 4762 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Oct 14 13:06:11.378995 master-2 kubenswrapper[4762]: I1014 13:06:11.378897 4762 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Oct 14 13:06:11.378995 master-2 kubenswrapper[4762]: I1014 13:06:11.378904 4762 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Oct 14 13:06:11.378995 master-2 kubenswrapper[4762]: I1014 13:06:11.378915 4762 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Oct 14 13:06:11.378995 master-2 kubenswrapper[4762]: I1014 13:06:11.378924 4762 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Oct 14 13:06:11.378995 master-2 kubenswrapper[4762]: I1014 13:06:11.378931 4762 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Oct 14 13:06:11.378995 master-2 kubenswrapper[4762]: I1014 13:06:11.378941 4762 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Oct 14 13:06:11.379618 master-2 kubenswrapper[4762]: I1014 13:06:11.379476 4762 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Oct 14 13:06:11.382259 master-2 kubenswrapper[4762]: I1014 13:06:11.382196 4762 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Oct 14 13:06:11.382601 master-2 kubenswrapper[4762]: W1014 13:06:11.382546 4762 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Oct 14 13:06:11.382759 master-2 kubenswrapper[4762]: W1014 13:06:11.382552 4762 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "master-2" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Oct 14 13:06:11.382759 master-2 kubenswrapper[4762]: I1014 13:06:11.382749 4762 server.go:1280] "Started kubelet" Oct 14 13:06:11.382872 master-2 kubenswrapper[4762]: E1014 13:06:11.382773 4762 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"master-2\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Oct 14 13:06:11.382872 master-2 kubenswrapper[4762]: E1014 13:06:11.382686 4762 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster 
scope" logger="UnhandledError" Oct 14 13:06:11.383815 master-2 kubenswrapper[4762]: I1014 13:06:11.383698 4762 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Oct 14 13:06:11.383924 master-2 kubenswrapper[4762]: I1014 13:06:11.383831 4762 server_v1.go:47] "podresources" method="list" useActivePods=true Oct 14 13:06:11.384567 master-2 systemd[1]: Started Kubernetes Kubelet. Oct 14 13:06:11.385120 master-2 kubenswrapper[4762]: I1014 13:06:11.385082 4762 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Oct 14 13:06:11.385255 master-2 kubenswrapper[4762]: I1014 13:06:11.385057 4762 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Oct 14 13:06:11.387079 master-2 kubenswrapper[4762]: I1014 13:06:11.387030 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Oct 14 13:06:11.387148 master-2 kubenswrapper[4762]: I1014 13:06:11.387099 4762 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Oct 14 13:06:11.387872 master-2 kubenswrapper[4762]: E1014 13:06:11.387725 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:11.388064 master-2 kubenswrapper[4762]: I1014 13:06:11.387928 4762 volume_manager.go:287] "The desired_state_of_world populator starts" Oct 14 13:06:11.388064 master-2 kubenswrapper[4762]: I1014 13:06:11.387947 4762 volume_manager.go:289] "Starting Kubelet Volume Manager" Oct 14 13:06:11.388064 master-2 kubenswrapper[4762]: I1014 13:06:11.387931 4762 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Oct 14 13:06:11.388385 master-2 kubenswrapper[4762]: I1014 13:06:11.388090 4762 reconstruct.go:97] "Volume reconstruction finished" Oct 14 13:06:11.388385 master-2 kubenswrapper[4762]: I1014 13:06:11.388100 4762 reconciler.go:26] "Reconciler: start to sync state" Oct 14 13:06:11.390655 master-2 kubenswrapper[4762]: I1014 13:06:11.390613 4762 server.go:449] "Adding debug handlers to kubelet server" Oct 14 13:06:11.390807 master-2 kubenswrapper[4762]: I1014 13:06:11.390759 4762 factory.go:55] Registering systemd factory Oct 14 13:06:11.390807 master-2 kubenswrapper[4762]: I1014 13:06:11.390800 4762 factory.go:221] Registration of the systemd container factory successfully Oct 14 13:06:11.391887 master-2 kubenswrapper[4762]: I1014 13:06:11.391830 4762 factory.go:153] Registering CRI-O factory Oct 14 13:06:11.391887 master-2 kubenswrapper[4762]: I1014 13:06:11.391881 4762 factory.go:221] Registration of the crio container factory successfully Oct 14 13:06:11.392077 master-2 kubenswrapper[4762]: I1014 13:06:11.392034 4762 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Oct 14 13:06:11.392143 master-2 kubenswrapper[4762]: I1014 13:06:11.392099 4762 factory.go:103] Registering Raw factory Oct 14 13:06:11.392143 master-2 kubenswrapper[4762]: I1014 13:06:11.392139 4762 manager.go:1196] Started watching for new ooms in manager Oct 14 13:06:11.394011 master-2 kubenswrapper[4762]: I1014 13:06:11.393956 4762 manager.go:319] Starting recovery of all containers Oct 14 13:06:11.395474 master-2 kubenswrapper[4762]: W1014 13:06:11.395435 4762 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: 
csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Oct 14 13:06:11.395558 master-2 kubenswrapper[4762]: E1014 13:06:11.395477 4762 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Oct 14 13:06:11.395558 master-2 kubenswrapper[4762]: I1014 13:06:11.395513 4762 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-2" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Oct 14 13:06:11.395997 master-2 kubenswrapper[4762]: E1014 13:06:11.395873 4762 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-2\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Oct 14 13:06:11.396668 master-2 kubenswrapper[4762]: E1014 13:06:11.395533 4762 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-2.186e5d60081b7a3c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-2,UID:master-2,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-2,},FirstTimestamp:2025-10-14 13:06:11.3827047 +0000 UTC m=+0.626863849,LastTimestamp:2025-10-14 13:06:11.3827047 +0000 UTC m=+0.626863849,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-2,}" Oct 14 13:06:11.398268 master-2 kubenswrapper[4762]: E1014 13:06:11.398220 4762 kubelet.go:1495] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Oct 14 13:06:11.427939 master-2 kubenswrapper[4762]: I1014 13:06:11.427897 4762 manager.go:324] Recovery completed Oct 14 13:06:11.443411 master-2 kubenswrapper[4762]: I1014 13:06:11.443330 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 14 13:06:11.445359 master-2 kubenswrapper[4762]: I1014 13:06:11.445290 4762 kubelet_node_status.go:724] "Recording event message for node" node="master-2" event="NodeHasSufficientMemory" Oct 14 13:06:11.445444 master-2 kubenswrapper[4762]: I1014 13:06:11.445370 4762 kubelet_node_status.go:724] "Recording event message for node" node="master-2" event="NodeHasNoDiskPressure" Oct 14 13:06:11.445444 master-2 kubenswrapper[4762]: I1014 13:06:11.445382 4762 kubelet_node_status.go:724] "Recording event message for node" node="master-2" event="NodeHasSufficientPID" Oct 14 13:06:11.446528 master-2 kubenswrapper[4762]: I1014 13:06:11.446488 4762 cpu_manager.go:225] "Starting CPU manager" policy="none" Oct 14 13:06:11.446528 master-2 kubenswrapper[4762]: I1014 13:06:11.446525 4762 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Oct 14 13:06:11.446677 master-2 kubenswrapper[4762]: I1014 13:06:11.446568 4762 state_mem.go:36] "Initialized new in-memory state store" Oct 14 13:06:11.448469 master-2 kubenswrapper[4762]: E1014 13:06:11.448371 4762 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-2.186e5d600bd76b40 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-2,UID:master-2,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-2 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-2,},FirstTimestamp:2025-10-14 13:06:11.44535328 +0000 UTC m=+0.689512469,LastTimestamp:2025-10-14 13:06:11.44535328 +0000 UTC m=+0.689512469,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-2,}" Oct 14 13:06:11.449653 master-2 kubenswrapper[4762]: I1014 13:06:11.449617 4762 policy_none.go:49] "None policy: Start" Oct 14 13:06:11.450916 master-2 kubenswrapper[4762]: I1014 13:06:11.450882 4762 memory_manager.go:170] "Starting memorymanager" policy="None" Oct 14 13:06:11.451036 master-2 kubenswrapper[4762]: I1014 13:06:11.451001 4762 state_mem.go:35] "Initializing new in-memory state store" Oct 14 13:06:11.456981 master-2 kubenswrapper[4762]: E1014 13:06:11.456912 4762 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-2.186e5d600bd7cb6d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-2,UID:master-2,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-2 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-2,},FirstTimestamp:2025-10-14 13:06:11.445377901 +0000 UTC m=+0.689537080,LastTimestamp:2025-10-14 13:06:11.445377901 +0000 UTC m=+0.689537080,Count:1,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-2,}" Oct 14 13:06:11.464851 master-2 kubenswrapper[4762]: E1014 13:06:11.464717 4762 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-2.186e5d600bd7fcba default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-2,UID:master-2,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-2 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-2,},FirstTimestamp:2025-10-14 13:06:11.445390522 +0000 UTC m=+0.689549701,LastTimestamp:2025-10-14 13:06:11.445390522 +0000 UTC m=+0.689549701,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-2,}" Oct 14 13:06:11.488393 master-2 kubenswrapper[4762]: E1014 13:06:11.488319 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:11.519063 master-2 kubenswrapper[4762]: I1014 13:06:11.519009 4762 manager.go:334] "Starting Device Plugin manager" Oct 14 13:06:11.542875 master-2 kubenswrapper[4762]: I1014 13:06:11.519091 4762 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 14 13:06:11.542875 master-2 kubenswrapper[4762]: I1014 13:06:11.519104 4762 server.go:79] "Starting device plugin registration server" Oct 14 13:06:11.542875 master-2 kubenswrapper[4762]: I1014 13:06:11.519755 4762 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 14 13:06:11.542875 master-2 kubenswrapper[4762]: I1014 13:06:11.519768 4762 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 14 13:06:11.542875 master-2 kubenswrapper[4762]: I1014 13:06:11.519959 4762 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Oct 14 13:06:11.542875 master-2 kubenswrapper[4762]: I1014 13:06:11.520069 4762 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Oct 14 13:06:11.542875 master-2 kubenswrapper[4762]: I1014 13:06:11.520079 4762 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 14 13:06:11.542875 master-2 kubenswrapper[4762]: E1014 13:06:11.521760 4762 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-2\" not found" Oct 14 13:06:11.542875 master-2 kubenswrapper[4762]: E1014 13:06:11.533470 4762 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-2.186e5d60107e3502 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-2,UID:master-2,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:master-2,},FirstTimestamp:2025-10-14 13:06:11.52339277 +0000 UTC m=+0.767551969,LastTimestamp:2025-10-14 13:06:11.52339277 +0000 UTC m=+0.767551969,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-2,}" Oct 14 13:06:11.544249 master-2 kubenswrapper[4762]: I1014 13:06:11.544205 4762 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Oct 14 13:06:11.547064 master-2 kubenswrapper[4762]: I1014 13:06:11.547027 4762 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Oct 14 13:06:11.547145 master-2 kubenswrapper[4762]: I1014 13:06:11.547072 4762 status_manager.go:217] "Starting to sync pod status with apiserver" Oct 14 13:06:11.547145 master-2 kubenswrapper[4762]: I1014 13:06:11.547109 4762 kubelet.go:2335] "Starting kubelet main sync loop" Oct 14 13:06:11.547568 master-2 kubenswrapper[4762]: E1014 13:06:11.547267 4762 kubelet.go:2359] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Oct 14 13:06:11.559792 master-2 kubenswrapper[4762]: W1014 13:06:11.559707 4762 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Oct 14 13:06:11.559792 master-2 kubenswrapper[4762]: E1014 13:06:11.559768 4762 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Oct 14 13:06:11.607675 master-2 kubenswrapper[4762]: E1014 13:06:11.607591 4762 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-2\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="400ms" Oct 14 13:06:11.620799 master-2 kubenswrapper[4762]: I1014 13:06:11.620719 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 14 13:06:11.622104 master-2 kubenswrapper[4762]: I1014 13:06:11.622028 4762 kubelet_node_status.go:724] "Recording event message for node" node="master-2" event="NodeHasSufficientMemory" Oct 14 13:06:11.622104 master-2 kubenswrapper[4762]: I1014 13:06:11.622082 4762 kubelet_node_status.go:724] "Recording event message for node" node="master-2" event="NodeHasNoDiskPressure" Oct 14 13:06:11.622104 master-2 kubenswrapper[4762]: I1014 13:06:11.622094 4762 kubelet_node_status.go:724] "Recording event message for node" node="master-2" event="NodeHasSufficientPID" Oct 14 13:06:11.622104 master-2 kubenswrapper[4762]: I1014 13:06:11.622136 4762 kubelet_node_status.go:76] "Attempting to register node" node="master-2" Oct 14 13:06:11.632755 master-2 kubenswrapper[4762]: E1014 13:06:11.632667 4762 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-2" Oct 14 13:06:11.633081 master-2 kubenswrapper[4762]: E1014 13:06:11.632876 4762 event.go:359] "Server rejected event (will not retry!)" err="events \"master-2.186e5d600bd76b40\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-2.186e5d600bd76b40 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-2,UID:master-2,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-2 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-2,},FirstTimestamp:2025-10-14 13:06:11.44535328 +0000 UTC m=+0.689512469,LastTimestamp:2025-10-14 13:06:11.622063995 +0000 UTC m=+0.866223174,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-2,}" Oct 14 13:06:11.643542 master-2 kubenswrapper[4762]: E1014 13:06:11.643340 4762 event.go:359] "Server rejected event (will not retry!)" err="events \"master-2.186e5d600bd7cb6d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-2.186e5d600bd7cb6d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-2,UID:master-2,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-2 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-2,},FirstTimestamp:2025-10-14 13:06:11.445377901 +0000 UTC m=+0.689537080,LastTimestamp:2025-10-14 13:06:11.622090026 +0000 UTC m=+0.866249195,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-2,}" Oct 14 13:06:11.648524 master-2 kubenswrapper[4762]: I1014 13:06:11.648453 4762 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-master-2"] Oct 14 13:06:11.648656 master-2 kubenswrapper[4762]: I1014 13:06:11.648575 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 14 13:06:11.649801 master-2 kubenswrapper[4762]: I1014 13:06:11.649748 4762 kubelet_node_status.go:724] "Recording event message for node" node="master-2" event="NodeHasSufficientMemory" Oct 14 13:06:11.649801 master-2 kubenswrapper[4762]: I1014 13:06:11.649807 4762 kubelet_node_status.go:724] "Recording event message for node" node="master-2" event="NodeHasNoDiskPressure" Oct 14 13:06:11.649997 master-2 kubenswrapper[4762]: I1014 13:06:11.649820 4762 kubelet_node_status.go:724] "Recording event message for node" node="master-2" event="NodeHasSufficientPID" Oct 14 13:06:11.650105 master-2 kubenswrapper[4762]: I1014 13:06:11.650077 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-2" Oct 14 13:06:11.650200 master-2 kubenswrapper[4762]: I1014 13:06:11.650111 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 14 13:06:11.650932 master-2 kubenswrapper[4762]: I1014 13:06:11.650896 4762 kubelet_node_status.go:724] "Recording event message for node" node="master-2" event="NodeHasSufficientMemory" Oct 14 13:06:11.651036 master-2 kubenswrapper[4762]: I1014 13:06:11.650948 4762 kubelet_node_status.go:724] "Recording event message for node" node="master-2" event="NodeHasNoDiskPressure" Oct 14 13:06:11.651036 master-2 kubenswrapper[4762]: I1014 13:06:11.651001 4762 kubelet_node_status.go:724] "Recording event message for node" node="master-2" event="NodeHasSufficientPID" Oct 14 13:06:11.653588 master-2 kubenswrapper[4762]: E1014 13:06:11.653445 4762 event.go:359] "Server rejected event (will not retry!)" err="events \"master-2.186e5d600bd7fcba\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-2.186e5d600bd7fcba default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-2,UID:master-2,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-2 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-2,},FirstTimestamp:2025-10-14 13:06:11.445390522 +0000 UTC m=+0.689549701,LastTimestamp:2025-10-14 13:06:11.622102066 +0000 UTC m=+0.866261235,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-2,}" Oct 14 13:06:11.664480 master-2 kubenswrapper[4762]: E1014 13:06:11.663134 4762 event.go:359] "Server rejected event (will not retry!)" err="events \"master-2.186e5d600bd76b40\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-2.186e5d600bd76b40 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-2,UID:master-2,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-2 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-2,},FirstTimestamp:2025-10-14 13:06:11.44535328 +0000 UTC m=+0.689512469,LastTimestamp:2025-10-14 13:06:11.649771088 +0000 UTC m=+0.893930247,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-2,}" Oct 14 13:06:11.676384 master-2 kubenswrapper[4762]: E1014 13:06:11.676147 4762 event.go:359] "Server rejected event (will not retry!)" err="events \"master-2.186e5d600bd7cb6d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-2.186e5d600bd7cb6d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-2,UID:master-2,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-2 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-2,},FirstTimestamp:2025-10-14 13:06:11.445377901 +0000 UTC m=+0.689537080,LastTimestamp:2025-10-14 13:06:11.649815849 
+0000 UTC m=+0.893975008,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-2,}" Oct 14 13:06:11.687067 master-2 kubenswrapper[4762]: E1014 13:06:11.686860 4762 event.go:359] "Server rejected event (will not retry!)" err="events \"master-2.186e5d600bd7fcba\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-2.186e5d600bd7fcba default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-2,UID:master-2,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-2 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-2,},FirstTimestamp:2025-10-14 13:06:11.445390522 +0000 UTC m=+0.689549701,LastTimestamp:2025-10-14 13:06:11.64982736 +0000 UTC m=+0.893986519,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-2,}" Oct 14 13:06:11.697044 master-2 kubenswrapper[4762]: E1014 13:06:11.696760 4762 event.go:359] "Server rejected event (will not retry!)" err="events \"master-2.186e5d600bd76b40\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-2.186e5d600bd76b40 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-2,UID:master-2,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-2 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-2,},FirstTimestamp:2025-10-14 13:06:11.44535328 +0000 UTC m=+0.689512469,LastTimestamp:2025-10-14 13:06:11.650928658 +0000 UTC m=+0.895087857,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-2,}" Oct 14 13:06:11.706004 master-2 kubenswrapper[4762]: E1014 13:06:11.705865 4762 event.go:359] "Server rejected event (will not retry!)" err="events \"master-2.186e5d600bd7cb6d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-2.186e5d600bd7cb6d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-2,UID:master-2,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-2 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-2,},FirstTimestamp:2025-10-14 13:06:11.445377901 +0000 UTC m=+0.689537080,LastTimestamp:2025-10-14 13:06:11.650970609 +0000 UTC m=+0.895129808,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-2,}" Oct 14 13:06:11.715101 master-2 kubenswrapper[4762]: E1014 13:06:11.714985 4762 event.go:359] "Server rejected event (will not retry!)" err="events \"master-2.186e5d600bd7fcba\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-2.186e5d600bd7fcba default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-2,UID:master-2,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-2 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-2,},FirstTimestamp:2025-10-14 13:06:11.445390522 +0000 UTC m=+0.689549701,LastTimestamp:2025-10-14 13:06:11.651024681 +0000 UTC m=+0.895183880,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-2,}" Oct 14 13:06:11.789097 master-2 kubenswrapper[4762]: I1014 13:06:11.788932 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/f022eff2d978fee6b366ac18a80aa53c-etc-kube\") pod \"kube-rbac-proxy-crio-master-2\" (UID: \"f022eff2d978fee6b366ac18a80aa53c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-2" Oct 14 13:06:11.789097 master-2 kubenswrapper[4762]: I1014 13:06:11.789006 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f022eff2d978fee6b366ac18a80aa53c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-2\" (UID: \"f022eff2d978fee6b366ac18a80aa53c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-2" Oct 14 13:06:11.833364 master-2 kubenswrapper[4762]: I1014 13:06:11.833239 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 14 13:06:11.834929 master-2 kubenswrapper[4762]: I1014 13:06:11.834873 4762 kubelet_node_status.go:724] "Recording event message for node" node="master-2" event="NodeHasSufficientMemory" Oct 14 13:06:11.835073 master-2 kubenswrapper[4762]: I1014 13:06:11.834951 4762 kubelet_node_status.go:724] "Recording event message for node" node="master-2" event="NodeHasNoDiskPressure" Oct 14 13:06:11.835073 master-2 kubenswrapper[4762]: I1014 13:06:11.834976 4762 kubelet_node_status.go:724] "Recording event message for node" node="master-2" event="NodeHasSufficientPID" Oct 14 13:06:11.835073 master-2 kubenswrapper[4762]: I1014 13:06:11.835043 4762 kubelet_node_status.go:76] "Attempting to register node" node="master-2" Oct 14 13:06:11.845765 master-2 kubenswrapper[4762]: E1014 13:06:11.845706 4762 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-2" Oct 14 13:06:11.846366 master-2 kubenswrapper[4762]: E1014 13:06:11.846119 4762 event.go:359] "Server rejected event (will not retry!)" err="events \"master-2.186e5d600bd76b40\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-2.186e5d600bd76b40 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-2,UID:master-2,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-2 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-2,},FirstTimestamp:2025-10-14 13:06:11.44535328 +0000 UTC m=+0.689512469,LastTimestamp:2025-10-14 13:06:11.834928597 +0000 UTC m=+1.079087796,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-2,}" Oct 14 13:06:11.857056 master-2 kubenswrapper[4762]: E1014 13:06:11.856866 4762 event.go:359] "Server rejected event (will not retry!)" err="events \"master-2.186e5d600bd7cb6d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-2.186e5d600bd7cb6d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-2,UID:master-2,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-2 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-2,},FirstTimestamp:2025-10-14 13:06:11.445377901 +0000 UTC m=+0.689537080,LastTimestamp:2025-10-14 13:06:11.834965118 +0000 UTC m=+1.079124317,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-2,}" Oct 14 13:06:11.866642 master-2 kubenswrapper[4762]: E1014 13:06:11.866495 4762 event.go:359] "Server rejected event (will not retry!)" err="events \"master-2.186e5d600bd7fcba\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-2.186e5d600bd7fcba default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-2,UID:master-2,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-2 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-2,},FirstTimestamp:2025-10-14 13:06:11.445390522 +0000 UTC m=+0.689549701,LastTimestamp:2025-10-14 13:06:11.834986569 +0000 UTC m=+1.079145768,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-2,}" Oct 14 13:06:11.889332 master-2 kubenswrapper[4762]: I1014 13:06:11.889258 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/f022eff2d978fee6b366ac18a80aa53c-etc-kube\") pod \"kube-rbac-proxy-crio-master-2\" (UID: \"f022eff2d978fee6b366ac18a80aa53c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-2" Oct 14 13:06:11.889497 master-2 kubenswrapper[4762]: I1014 13:06:11.889330 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f022eff2d978fee6b366ac18a80aa53c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-2\" (UID: \"f022eff2d978fee6b366ac18a80aa53c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-2" Oct 14 13:06:11.889497 master-2 kubenswrapper[4762]: I1014 13:06:11.889464 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/f022eff2d978fee6b366ac18a80aa53c-etc-kube\") pod \"kube-rbac-proxy-crio-master-2\" (UID: \"f022eff2d978fee6b366ac18a80aa53c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-2" Oct 14 13:06:11.889619 master-2 kubenswrapper[4762]: I1014 13:06:11.889509 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f022eff2d978fee6b366ac18a80aa53c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-2\" (UID: 
\"f022eff2d978fee6b366ac18a80aa53c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-2" Oct 14 13:06:11.985763 master-2 kubenswrapper[4762]: I1014 13:06:11.985579 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-2" Oct 14 13:06:12.019962 master-2 kubenswrapper[4762]: E1014 13:06:12.019901 4762 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-2\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="800ms" Oct 14 13:06:12.246022 master-2 kubenswrapper[4762]: I1014 13:06:12.245805 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 14 13:06:12.249240 master-2 kubenswrapper[4762]: I1014 13:06:12.249115 4762 kubelet_node_status.go:724] "Recording event message for node" node="master-2" event="NodeHasSufficientMemory" Oct 14 13:06:12.249240 master-2 kubenswrapper[4762]: I1014 13:06:12.249220 4762 kubelet_node_status.go:724] "Recording event message for node" node="master-2" event="NodeHasNoDiskPressure" Oct 14 13:06:12.249240 master-2 kubenswrapper[4762]: I1014 13:06:12.249242 4762 kubelet_node_status.go:724] "Recording event message for node" node="master-2" event="NodeHasSufficientPID" Oct 14 13:06:12.249699 master-2 kubenswrapper[4762]: I1014 13:06:12.249291 4762 kubelet_node_status.go:76] "Attempting to register node" node="master-2" Oct 14 13:06:12.261850 master-2 kubenswrapper[4762]: W1014 13:06:12.261771 4762 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "master-2" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Oct 14 13:06:12.261850 master-2 kubenswrapper[4762]: E1014 13:06:12.261829 4762 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"master-2\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Oct 14 13:06:12.262329 master-2 kubenswrapper[4762]: E1014 13:06:12.261856 4762 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-2" Oct 14 13:06:12.262329 master-2 kubenswrapper[4762]: E1014 13:06:12.261891 4762 event.go:359] "Server rejected event (will not retry!)" err="events \"master-2.186e5d600bd76b40\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-2.186e5d600bd76b40 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-2,UID:master-2,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-2 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-2,},FirstTimestamp:2025-10-14 13:06:11.44535328 +0000 UTC m=+0.689512469,LastTimestamp:2025-10-14 13:06:12.249194863 +0000 UTC m=+1.493354062,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-2,}" Oct 14 13:06:12.271528 master-2 kubenswrapper[4762]: E1014 
13:06:12.271359 4762 event.go:359] "Server rejected event (will not retry!)" err="events \"master-2.186e5d600bd7cb6d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-2.186e5d600bd7cb6d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-2,UID:master-2,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-2 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-2,},FirstTimestamp:2025-10-14 13:06:11.445377901 +0000 UTC m=+0.689537080,LastTimestamp:2025-10-14 13:06:12.249232274 +0000 UTC m=+1.493391463,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-2,}" Oct 14 13:06:12.280429 master-2 kubenswrapper[4762]: E1014 13:06:12.280297 4762 event.go:359] "Server rejected event (will not retry!)" err="events \"master-2.186e5d600bd7fcba\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-2.186e5d600bd7fcba default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-2,UID:master-2,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-2 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-2,},FirstTimestamp:2025-10-14 13:06:11.445390522 +0000 UTC m=+0.689549701,LastTimestamp:2025-10-14 13:06:12.249252265 +0000 UTC m=+1.493411454,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-2,}" Oct 14 13:06:12.407097 master-2 kubenswrapper[4762]: I1014 13:06:12.406927 4762 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-2" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Oct 14 13:06:12.456609 master-2 kubenswrapper[4762]: W1014 13:06:12.456513 4762 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Oct 14 13:06:12.456609 master-2 kubenswrapper[4762]: E1014 13:06:12.456588 4762 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Oct 14 13:06:12.598439 master-2 kubenswrapper[4762]: W1014 13:06:12.598256 4762 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Oct 14 13:06:12.598439 master-2 kubenswrapper[4762]: E1014 13:06:12.598326 4762 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource 
\"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Oct 14 13:06:12.816857 master-2 kubenswrapper[4762]: W1014 13:06:12.816417 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf022eff2d978fee6b366ac18a80aa53c.slice/crio-c87b310e66429cfb670045a30079658186b694b2934643ce2fd6f00ef141c37c WatchSource:0}: Error finding container c87b310e66429cfb670045a30079658186b694b2934643ce2fd6f00ef141c37c: Status 404 returned error can't find the container with id c87b310e66429cfb670045a30079658186b694b2934643ce2fd6f00ef141c37c Oct 14 13:06:12.823975 master-2 kubenswrapper[4762]: I1014 13:06:12.823703 4762 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 14 13:06:12.830523 master-2 kubenswrapper[4762]: E1014 13:06:12.830459 4762 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-2\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="1.6s" Oct 14 13:06:12.830668 master-2 kubenswrapper[4762]: E1014 13:06:12.830546 4762 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-2.186e5d605dfdf8c8 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-2,UID:f022eff2d978fee6b366ac18a80aa53c,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f22b65e5c744a32d3955dd7c36d809e3114a8aa501b44c00330dfda886c21169\",Source:EventSource{Component:kubelet,Host:master-2,},FirstTimestamp:2025-10-14 13:06:12.823611592 +0000 UTC m=+2.067770791,LastTimestamp:2025-10-14 13:06:12.823611592 +0000 UTC m=+2.067770791,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-2,}" Oct 14 13:06:12.880400 master-2 kubenswrapper[4762]: W1014 13:06:12.880133 4762 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Oct 14 13:06:12.880400 master-2 kubenswrapper[4762]: E1014 13:06:12.880248 4762 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Oct 14 13:06:13.062942 master-2 kubenswrapper[4762]: I1014 13:06:13.062511 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 14 13:06:13.064010 master-2 kubenswrapper[4762]: I1014 13:06:13.063949 4762 kubelet_node_status.go:724] "Recording event message for node" node="master-2" event="NodeHasSufficientMemory" Oct 14 13:06:13.064010 master-2 kubenswrapper[4762]: I1014 13:06:13.064000 4762 kubelet_node_status.go:724] "Recording event message for node" node="master-2" 
event="NodeHasNoDiskPressure" Oct 14 13:06:13.064239 master-2 kubenswrapper[4762]: I1014 13:06:13.064018 4762 kubelet_node_status.go:724] "Recording event message for node" node="master-2" event="NodeHasSufficientPID" Oct 14 13:06:13.064239 master-2 kubenswrapper[4762]: I1014 13:06:13.064064 4762 kubelet_node_status.go:76] "Attempting to register node" node="master-2" Oct 14 13:06:13.075371 master-2 kubenswrapper[4762]: E1014 13:06:13.075219 4762 event.go:359] "Server rejected event (will not retry!)" err="events \"master-2.186e5d600bd76b40\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-2.186e5d600bd76b40 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-2,UID:master-2,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-2 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-2,},FirstTimestamp:2025-10-14 13:06:11.44535328 +0000 UTC m=+0.689512469,LastTimestamp:2025-10-14 13:06:13.06398194 +0000 UTC m=+2.308141129,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-2,}" Oct 14 13:06:13.075503 master-2 kubenswrapper[4762]: E1014 13:06:13.075412 4762 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-2" Oct 14 13:06:13.084919 master-2 kubenswrapper[4762]: E1014 13:06:13.084773 4762 event.go:359] "Server rejected event (will not retry!)" err="events \"master-2.186e5d600bd7cb6d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-2.186e5d600bd7cb6d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-2,UID:master-2,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-2 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-2,},FirstTimestamp:2025-10-14 13:06:11.445377901 +0000 UTC m=+0.689537080,LastTimestamp:2025-10-14 13:06:13.064011691 +0000 UTC m=+2.308170880,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-2,}" Oct 14 13:06:13.095362 master-2 kubenswrapper[4762]: E1014 13:06:13.095239 4762 event.go:359] "Server rejected event (will not retry!)" err="events \"master-2.186e5d600bd7fcba\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-2.186e5d600bd7fcba default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-2,UID:master-2,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-2 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-2,},FirstTimestamp:2025-10-14 13:06:11.445390522 +0000 UTC m=+0.689549701,LastTimestamp:2025-10-14 13:06:13.064028661 +0000 UTC m=+2.308187850,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-2,}" Oct 
14 13:06:13.407089 master-2 kubenswrapper[4762]: I1014 13:06:13.407023 4762 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-2" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Oct 14 13:06:13.554602 master-2 kubenswrapper[4762]: I1014 13:06:13.554439 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-2" event={"ID":"f022eff2d978fee6b366ac18a80aa53c","Type":"ContainerStarted","Data":"c87b310e66429cfb670045a30079658186b694b2934643ce2fd6f00ef141c37c"} Oct 14 13:06:14.406318 master-2 kubenswrapper[4762]: I1014 13:06:14.406269 4762 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-2" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Oct 14 13:06:14.441192 master-2 kubenswrapper[4762]: W1014 13:06:14.441145 4762 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Oct 14 13:06:14.441487 master-2 kubenswrapper[4762]: E1014 13:06:14.441210 4762 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Oct 14 13:06:14.441487 master-2 kubenswrapper[4762]: E1014 13:06:14.441141 4762 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-2\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="3.2s" Oct 14 13:06:14.450394 master-2 kubenswrapper[4762]: E1014 13:06:14.450299 4762 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-2.186e5d60be5ee16f openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-2,UID:f022eff2d978fee6b366ac18a80aa53c,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f22b65e5c744a32d3955dd7c36d809e3114a8aa501b44c00330dfda886c21169\" in 1.616s (1.616s including waiting). 
Image size: 458126368 bytes.,Source:EventSource{Component:kubelet,Host:master-2,},FirstTimestamp:2025-10-14 13:06:14.440575343 +0000 UTC m=+3.684734532,LastTimestamp:2025-10-14 13:06:14.440575343 +0000 UTC m=+3.684734532,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-2,}" Oct 14 13:06:14.676321 master-2 kubenswrapper[4762]: I1014 13:06:14.676239 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 14 13:06:14.678521 master-2 kubenswrapper[4762]: I1014 13:06:14.678454 4762 kubelet_node_status.go:724] "Recording event message for node" node="master-2" event="NodeHasSufficientMemory" Oct 14 13:06:14.678521 master-2 kubenswrapper[4762]: I1014 13:06:14.678512 4762 kubelet_node_status.go:724] "Recording event message for node" node="master-2" event="NodeHasNoDiskPressure" Oct 14 13:06:14.678521 master-2 kubenswrapper[4762]: I1014 13:06:14.678530 4762 kubelet_node_status.go:724] "Recording event message for node" node="master-2" event="NodeHasSufficientPID" Oct 14 13:06:14.678926 master-2 kubenswrapper[4762]: I1014 13:06:14.678586 4762 kubelet_node_status.go:76] "Attempting to register node" node="master-2" Oct 14 13:06:14.687813 master-2 kubenswrapper[4762]: E1014 13:06:14.687743 4762 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-2" Oct 14 13:06:14.688538 master-2 kubenswrapper[4762]: E1014 13:06:14.688415 4762 event.go:359] "Server rejected event (will not retry!)" err="events \"master-2.186e5d600bd76b40\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-2.186e5d600bd76b40 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-2,UID:master-2,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-2 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-2,},FirstTimestamp:2025-10-14 13:06:11.44535328 +0000 UTC m=+0.689512469,LastTimestamp:2025-10-14 13:06:14.678491946 +0000 UTC m=+3.922651145,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-2,}" Oct 14 13:06:14.698657 master-2 kubenswrapper[4762]: E1014 13:06:14.698505 4762 event.go:359] "Server rejected event (will not retry!)" err="events \"master-2.186e5d600bd7cb6d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-2.186e5d600bd7cb6d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-2,UID:master-2,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-2 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-2,},FirstTimestamp:2025-10-14 13:06:11.445377901 +0000 UTC m=+0.689537080,LastTimestamp:2025-10-14 13:06:14.678523737 +0000 UTC m=+3.922682936,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-2,}" Oct 14 13:06:14.708283 master-2 kubenswrapper[4762]: E1014 
13:06:14.708020 4762 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-2.186e5d60cd50ac05 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-2,UID:f022eff2d978fee6b366ac18a80aa53c,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container: setup,Source:EventSource{Component:kubelet,Host:master-2,},FirstTimestamp:2025-10-14 13:06:14.691302405 +0000 UTC m=+3.935461594,LastTimestamp:2025-10-14 13:06:14.691302405 +0000 UTC m=+3.935461594,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-2,}" Oct 14 13:06:14.721046 master-2 kubenswrapper[4762]: E1014 13:06:14.720866 4762 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-2.186e5d60ce73cb7c openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-2,UID:f022eff2d978fee6b366ac18a80aa53c,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:master-2,},FirstTimestamp:2025-10-14 13:06:14.710381436 +0000 UTC m=+3.954540625,LastTimestamp:2025-10-14 13:06:14.710381436 +0000 UTC m=+3.954540625,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-2,}" Oct 14 13:06:14.987918 master-2 kubenswrapper[4762]: W1014 13:06:14.987726 4762 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Oct 14 13:06:14.987918 master-2 kubenswrapper[4762]: E1014 13:06:14.987808 4762 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Oct 14 13:06:15.323736 master-2 kubenswrapper[4762]: W1014 13:06:15.323544 4762 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "master-2" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Oct 14 13:06:15.323736 master-2 kubenswrapper[4762]: E1014 13:06:15.323605 4762 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"master-2\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Oct 14 13:06:15.406983 master-2 kubenswrapper[4762]: I1014 13:06:15.406899 4762 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-2" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Oct 14 13:06:15.406983 master-2 kubenswrapper[4762]: W1014 13:06:15.406925 4762 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Oct 14 13:06:15.407337 master-2 kubenswrapper[4762]: E1014 13:06:15.407012 4762 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Oct 14 13:06:15.560868 master-2 kubenswrapper[4762]: I1014 13:06:15.560765 4762 generic.go:334] "Generic (PLEG): container finished" podID="f022eff2d978fee6b366ac18a80aa53c" containerID="8c7b84cc6b70fe3b8aac7d625cab179d812c21968a9b55f5305f78245385bbf6" exitCode=0 Oct 14 13:06:15.560868 master-2 kubenswrapper[4762]: I1014 13:06:15.560814 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-2" event={"ID":"f022eff2d978fee6b366ac18a80aa53c","Type":"ContainerDied","Data":"8c7b84cc6b70fe3b8aac7d625cab179d812c21968a9b55f5305f78245385bbf6"} Oct 14 13:06:15.561790 master-2 kubenswrapper[4762]: I1014 13:06:15.560902 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 14 13:06:15.561863 master-2 kubenswrapper[4762]: I1014 13:06:15.561818 4762 kubelet_node_status.go:724] "Recording event message for node" node="master-2" event="NodeHasSufficientMemory" Oct 14 13:06:15.561932 master-2 kubenswrapper[4762]: I1014 13:06:15.561872 4762 kubelet_node_status.go:724] "Recording event message for node" node="master-2" event="NodeHasNoDiskPressure" Oct 14 13:06:15.561932 master-2 kubenswrapper[4762]: I1014 13:06:15.561916 4762 kubelet_node_status.go:724] "Recording event message for node" node="master-2" event="NodeHasSufficientPID" Oct 14 13:06:15.575774 master-2 kubenswrapper[4762]: E1014 13:06:15.575529 4762 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-2.186e5d6101db480b openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-2,UID:f022eff2d978fee6b366ac18a80aa53c,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f22b65e5c744a32d3955dd7c36d809e3114a8aa501b44c00330dfda886c21169\" already present on machine,Source:EventSource{Component:kubelet,Host:master-2,},FirstTimestamp:2025-10-14 13:06:15.572801547 +0000 UTC m=+4.816960706,LastTimestamp:2025-10-14 13:06:15.572801547 +0000 UTC m=+4.816960706,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-2,}" Oct 14 13:06:16.157191 master-2 kubenswrapper[4762]: E1014 13:06:16.157030 4762 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-2.186e5d6124147169 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-2,UID:f022eff2d978fee6b366ac18a80aa53c,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-2,},FirstTimestamp:2025-10-14 13:06:16.146973033 +0000 UTC m=+5.391132242,LastTimestamp:2025-10-14 13:06:16.146973033 +0000 UTC m=+5.391132242,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-2,}" Oct 14 13:06:16.295596 master-2 kubenswrapper[4762]: E1014 13:06:16.295440 4762 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-2.186e5d612c400b1a openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-2,UID:f022eff2d978fee6b366ac18a80aa53c,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-2,},FirstTimestamp:2025-10-14 13:06:16.284048154 +0000 UTC m=+5.528207313,LastTimestamp:2025-10-14 13:06:16.284048154 +0000 UTC m=+5.528207313,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-2,}" Oct 14 13:06:16.407287 master-2 kubenswrapper[4762]: I1014 13:06:16.407127 4762 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-2" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Oct 14 13:06:16.565010 master-2 kubenswrapper[4762]: I1014 13:06:16.564901 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-2_f022eff2d978fee6b366ac18a80aa53c/kube-rbac-proxy-crio/0.log" Oct 14 13:06:16.566050 master-2 kubenswrapper[4762]: I1014 13:06:16.565757 4762 generic.go:334] "Generic (PLEG): container finished" podID="f022eff2d978fee6b366ac18a80aa53c" containerID="a2ac8f9fd28eed3c0911f0b0294041f22ca5e90dff9fa0766f75ab8ad408ece1" exitCode=1 Oct 14 13:06:16.566050 master-2 kubenswrapper[4762]: I1014 13:06:16.565793 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-2" event={"ID":"f022eff2d978fee6b366ac18a80aa53c","Type":"ContainerDied","Data":"a2ac8f9fd28eed3c0911f0b0294041f22ca5e90dff9fa0766f75ab8ad408ece1"} Oct 14 13:06:16.566050 master-2 kubenswrapper[4762]: I1014 13:06:16.565901 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 14 13:06:16.566919 master-2 kubenswrapper[4762]: I1014 
13:06:16.566873 4762 kubelet_node_status.go:724] "Recording event message for node" node="master-2" event="NodeHasSufficientMemory" Oct 14 13:06:16.566919 master-2 kubenswrapper[4762]: I1014 13:06:16.566907 4762 kubelet_node_status.go:724] "Recording event message for node" node="master-2" event="NodeHasNoDiskPressure" Oct 14 13:06:16.566919 master-2 kubenswrapper[4762]: I1014 13:06:16.566918 4762 kubelet_node_status.go:724] "Recording event message for node" node="master-2" event="NodeHasSufficientPID" Oct 14 13:06:16.578186 master-2 kubenswrapper[4762]: I1014 13:06:16.578071 4762 scope.go:117] "RemoveContainer" containerID="a2ac8f9fd28eed3c0911f0b0294041f22ca5e90dff9fa0766f75ab8ad408ece1" Oct 14 13:06:16.593581 master-2 kubenswrapper[4762]: E1014 13:06:16.593408 4762 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-2.186e5d6101db480b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-2.186e5d6101db480b openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-2,UID:f022eff2d978fee6b366ac18a80aa53c,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f22b65e5c744a32d3955dd7c36d809e3114a8aa501b44c00330dfda886c21169\" already present on machine,Source:EventSource{Component:kubelet,Host:master-2,},FirstTimestamp:2025-10-14 13:06:15.572801547 +0000 UTC m=+4.816960706,LastTimestamp:2025-10-14 13:06:16.58332315 +0000 UTC m=+5.827482349,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-2,}" Oct 14 13:06:16.841367 master-2 kubenswrapper[4762]: E1014 13:06:16.841122 4762 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-2.186e5d6124147169\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-2.186e5d6124147169 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-2,UID:f022eff2d978fee6b366ac18a80aa53c,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-2,},FirstTimestamp:2025-10-14 13:06:16.146973033 +0000 UTC m=+5.391132242,LastTimestamp:2025-10-14 13:06:16.831104314 +0000 UTC m=+6.075263513,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-2,}" Oct 14 13:06:16.856219 master-2 kubenswrapper[4762]: E1014 13:06:16.856033 4762 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-2.186e5d612c400b1a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-2.186e5d612c400b1a 
openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-2,UID:f022eff2d978fee6b366ac18a80aa53c,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-2,},FirstTimestamp:2025-10-14 13:06:16.284048154 +0000 UTC m=+5.528207313,LastTimestamp:2025-10-14 13:06:16.845029695 +0000 UTC m=+6.089188854,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-2,}" Oct 14 13:06:17.406455 master-2 kubenswrapper[4762]: I1014 13:06:17.406368 4762 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-2" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Oct 14 13:06:17.569055 master-2 kubenswrapper[4762]: I1014 13:06:17.568968 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-2_f022eff2d978fee6b366ac18a80aa53c/kube-rbac-proxy-crio/1.log" Oct 14 13:06:17.569779 master-2 kubenswrapper[4762]: I1014 13:06:17.569389 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-2_f022eff2d978fee6b366ac18a80aa53c/kube-rbac-proxy-crio/0.log" Oct 14 13:06:17.569779 master-2 kubenswrapper[4762]: I1014 13:06:17.569715 4762 generic.go:334] "Generic (PLEG): container finished" podID="f022eff2d978fee6b366ac18a80aa53c" containerID="d57e73517a8640d4c26b28b54a4cf8ecf415a672a1b2341868ce34d0d2510062" exitCode=1 Oct 14 13:06:17.569779 master-2 kubenswrapper[4762]: I1014 13:06:17.569746 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-2" event={"ID":"f022eff2d978fee6b366ac18a80aa53c","Type":"ContainerDied","Data":"d57e73517a8640d4c26b28b54a4cf8ecf415a672a1b2341868ce34d0d2510062"} Oct 14 13:06:17.569779 master-2 kubenswrapper[4762]: I1014 13:06:17.569780 4762 scope.go:117] "RemoveContainer" containerID="a2ac8f9fd28eed3c0911f0b0294041f22ca5e90dff9fa0766f75ab8ad408ece1" Oct 14 13:06:17.570065 master-2 kubenswrapper[4762]: I1014 13:06:17.569903 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 14 13:06:17.571261 master-2 kubenswrapper[4762]: I1014 13:06:17.571209 4762 kubelet_node_status.go:724] "Recording event message for node" node="master-2" event="NodeHasSufficientMemory" Oct 14 13:06:17.571261 master-2 kubenswrapper[4762]: I1014 13:06:17.571249 4762 kubelet_node_status.go:724] "Recording event message for node" node="master-2" event="NodeHasNoDiskPressure" Oct 14 13:06:17.571261 master-2 kubenswrapper[4762]: I1014 13:06:17.571264 4762 kubelet_node_status.go:724] "Recording event message for node" node="master-2" event="NodeHasSufficientPID" Oct 14 13:06:17.588856 master-2 kubenswrapper[4762]: I1014 13:06:17.588815 4762 scope.go:117] "RemoveContainer" containerID="d57e73517a8640d4c26b28b54a4cf8ecf415a672a1b2341868ce34d0d2510062" Oct 14 13:06:17.589055 master-2 kubenswrapper[4762]: E1014 13:06:17.588998 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 10s restarting 
failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-2_openshift-machine-config-operator(f022eff2d978fee6b366ac18a80aa53c)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-2" podUID="f022eff2d978fee6b366ac18a80aa53c" Oct 14 13:06:17.599725 master-2 kubenswrapper[4762]: E1014 13:06:17.599076 4762 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-2.186e5d617a075928 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-2,UID:f022eff2d978fee6b366ac18a80aa53c,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-master-2_openshift-machine-config-operator(f022eff2d978fee6b366ac18a80aa53c),Source:EventSource{Component:kubelet,Host:master-2,},FirstTimestamp:2025-10-14 13:06:17.588955432 +0000 UTC m=+6.833114601,LastTimestamp:2025-10-14 13:06:17.588955432 +0000 UTC m=+6.833114601,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-2,}" Oct 14 13:06:17.651325 master-2 kubenswrapper[4762]: E1014 13:06:17.651225 4762 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-2\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="6.4s" Oct 14 13:06:17.888338 master-2 kubenswrapper[4762]: I1014 13:06:17.888220 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 14 13:06:17.891704 master-2 kubenswrapper[4762]: I1014 13:06:17.891634 4762 kubelet_node_status.go:724] "Recording event message for node" node="master-2" event="NodeHasSufficientMemory" Oct 14 13:06:17.891704 master-2 kubenswrapper[4762]: I1014 13:06:17.891695 4762 kubelet_node_status.go:724] "Recording event message for node" node="master-2" event="NodeHasNoDiskPressure" Oct 14 13:06:17.891924 master-2 kubenswrapper[4762]: I1014 13:06:17.891716 4762 kubelet_node_status.go:724] "Recording event message for node" node="master-2" event="NodeHasSufficientPID" Oct 14 13:06:17.891924 master-2 kubenswrapper[4762]: I1014 13:06:17.891778 4762 kubelet_node_status.go:76] "Attempting to register node" node="master-2" Oct 14 13:06:17.902643 master-2 kubenswrapper[4762]: E1014 13:06:17.902587 4762 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-2" Oct 14 13:06:18.406891 master-2 kubenswrapper[4762]: I1014 13:06:18.406793 4762 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-2" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Oct 14 13:06:18.575526 master-2 kubenswrapper[4762]: I1014 13:06:18.575411 4762 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-2_f022eff2d978fee6b366ac18a80aa53c/kube-rbac-proxy-crio/1.log" Oct 14 13:06:18.576462 master-2 kubenswrapper[4762]: I1014 13:06:18.576242 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 14 13:06:18.577527 master-2 kubenswrapper[4762]: I1014 13:06:18.577481 4762 kubelet_node_status.go:724] "Recording event message for node" node="master-2" event="NodeHasSufficientMemory" Oct 14 13:06:18.577636 master-2 kubenswrapper[4762]: I1014 13:06:18.577543 4762 kubelet_node_status.go:724] "Recording event message for node" node="master-2" event="NodeHasNoDiskPressure" Oct 14 13:06:18.577636 master-2 kubenswrapper[4762]: I1014 13:06:18.577562 4762 kubelet_node_status.go:724] "Recording event message for node" node="master-2" event="NodeHasSufficientPID" Oct 14 13:06:18.578085 master-2 kubenswrapper[4762]: I1014 13:06:18.578031 4762 scope.go:117] "RemoveContainer" containerID="d57e73517a8640d4c26b28b54a4cf8ecf415a672a1b2341868ce34d0d2510062" Oct 14 13:06:18.578360 master-2 kubenswrapper[4762]: E1014 13:06:18.578309 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-2_openshift-machine-config-operator(f022eff2d978fee6b366ac18a80aa53c)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-2" podUID="f022eff2d978fee6b366ac18a80aa53c" Oct 14 13:06:18.589422 master-2 kubenswrapper[4762]: E1014 13:06:18.589251 4762 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-2.186e5d617a075928\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-2.186e5d617a075928 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-2,UID:f022eff2d978fee6b366ac18a80aa53c,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-master-2_openshift-machine-config-operator(f022eff2d978fee6b366ac18a80aa53c),Source:EventSource{Component:kubelet,Host:master-2,},FirstTimestamp:2025-10-14 13:06:17.588955432 +0000 UTC m=+6.833114601,LastTimestamp:2025-10-14 13:06:18.57824667 +0000 UTC m=+7.822405869,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-2,}" Oct 14 13:06:19.407790 master-2 kubenswrapper[4762]: I1014 13:06:19.406762 4762 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-2" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Oct 14 13:06:19.601674 master-2 kubenswrapper[4762]: W1014 13:06:19.601549 4762 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Oct 14 13:06:19.601674 master-2 kubenswrapper[4762]: E1014 
13:06:19.601656 4762 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Oct 14 13:06:19.652061 master-2 kubenswrapper[4762]: W1014 13:06:19.651940 4762 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Oct 14 13:06:19.652061 master-2 kubenswrapper[4762]: E1014 13:06:19.652025 4762 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Oct 14 13:06:19.779686 master-2 kubenswrapper[4762]: W1014 13:06:19.779574 4762 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "master-2" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Oct 14 13:06:19.779686 master-2 kubenswrapper[4762]: E1014 13:06:19.779647 4762 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"master-2\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Oct 14 13:06:19.992469 master-2 kubenswrapper[4762]: I1014 13:06:19.992266 4762 csr.go:261] certificate signing request csr-xpdgs is approved, waiting to be issued Oct 14 13:06:20.003769 master-2 kubenswrapper[4762]: I1014 13:06:20.003689 4762 csr.go:257] certificate signing request csr-xpdgs is issued Oct 14 13:06:20.210619 master-2 kubenswrapper[4762]: I1014 13:06:20.210513 4762 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Oct 14 13:06:20.418503 master-2 kubenswrapper[4762]: I1014 13:06:20.418425 4762 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-2" not found Oct 14 13:06:20.435745 master-2 kubenswrapper[4762]: I1014 13:06:20.435664 4762 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-2" not found Oct 14 13:06:20.492795 master-2 kubenswrapper[4762]: I1014 13:06:20.492684 4762 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-2" not found Oct 14 13:06:20.757622 master-2 kubenswrapper[4762]: I1014 13:06:20.757400 4762 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-2" not found Oct 14 13:06:20.757622 master-2 kubenswrapper[4762]: E1014 13:06:20.757458 4762 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-2" not found Oct 14 13:06:20.781457 master-2 kubenswrapper[4762]: I1014 13:06:20.781344 4762 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-2" not found Oct 14 13:06:20.798666 master-2 kubenswrapper[4762]: I1014 13:06:20.798558 4762 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-2" not found Oct 14 13:06:20.856370 master-2 kubenswrapper[4762]: I1014 13:06:20.856274 4762 nodeinfomanager.go:401] Failed to publish 
CSINode: nodes "master-2" not found Oct 14 13:06:21.006200 master-2 kubenswrapper[4762]: I1014 13:06:21.005946 4762 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2025-10-15 13:01:17 +0000 UTC, rotation deadline is 2025-10-15 09:02:43.960019586 +0000 UTC Oct 14 13:06:21.006200 master-2 kubenswrapper[4762]: I1014 13:06:21.006088 4762 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 19h56m22.953938022s for next certificate rotation Oct 14 13:06:21.129580 master-2 kubenswrapper[4762]: I1014 13:06:21.129437 4762 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-2" not found Oct 14 13:06:21.129580 master-2 kubenswrapper[4762]: E1014 13:06:21.129505 4762 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-2" not found Oct 14 13:06:21.231840 master-2 kubenswrapper[4762]: I1014 13:06:21.231750 4762 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-2" not found Oct 14 13:06:21.248623 master-2 kubenswrapper[4762]: I1014 13:06:21.248522 4762 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Oct 14 13:06:21.249119 master-2 kubenswrapper[4762]: I1014 13:06:21.249060 4762 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-2" not found Oct 14 13:06:21.308381 master-2 kubenswrapper[4762]: I1014 13:06:21.308271 4762 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-2" not found Oct 14 13:06:21.522333 master-2 kubenswrapper[4762]: E1014 13:06:21.522200 4762 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-2\" not found" Oct 14 13:06:21.575458 master-2 kubenswrapper[4762]: I1014 13:06:21.575359 4762 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-2" not found Oct 14 13:06:21.575458 master-2 kubenswrapper[4762]: E1014 13:06:21.575418 4762 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-2" not found Oct 14 13:06:22.168274 master-2 kubenswrapper[4762]: I1014 13:06:22.168218 4762 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-2" not found Oct 14 13:06:22.185084 master-2 kubenswrapper[4762]: I1014 13:06:22.185024 4762 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-2" not found Oct 14 13:06:22.243833 master-2 kubenswrapper[4762]: I1014 13:06:22.243746 4762 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-2" not found Oct 14 13:06:22.507544 master-2 kubenswrapper[4762]: I1014 13:06:22.507423 4762 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-2" not found Oct 14 13:06:22.507544 master-2 kubenswrapper[4762]: E1014 13:06:22.507457 4762 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-2" not found Oct 14 13:06:24.060879 master-2 kubenswrapper[4762]: E1014 13:06:24.060789 4762 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"master-2\" not found" node="master-2" Oct 14 13:06:24.303884 master-2 kubenswrapper[4762]: I1014 13:06:24.303565 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 14 13:06:24.305568 master-2 kubenswrapper[4762]: I1014 13:06:24.305524 4762 kubelet_node_status.go:724] 
"Recording event message for node" node="master-2" event="NodeHasSufficientMemory" Oct 14 13:06:24.305643 master-2 kubenswrapper[4762]: I1014 13:06:24.305572 4762 kubelet_node_status.go:724] "Recording event message for node" node="master-2" event="NodeHasNoDiskPressure" Oct 14 13:06:24.305643 master-2 kubenswrapper[4762]: I1014 13:06:24.305589 4762 kubelet_node_status.go:724] "Recording event message for node" node="master-2" event="NodeHasSufficientPID" Oct 14 13:06:24.305643 master-2 kubenswrapper[4762]: I1014 13:06:24.305632 4762 kubelet_node_status.go:76] "Attempting to register node" node="master-2" Oct 14 13:06:24.314396 master-2 kubenswrapper[4762]: I1014 13:06:24.314301 4762 kubelet_node_status.go:79] "Successfully registered node" node="master-2" Oct 14 13:06:24.314396 master-2 kubenswrapper[4762]: E1014 13:06:24.314336 4762 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-2\": node \"master-2\" not found" Oct 14 13:06:24.328560 master-2 kubenswrapper[4762]: E1014 13:06:24.328507 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:24.395892 master-2 kubenswrapper[4762]: I1014 13:06:24.395813 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Oct 14 13:06:24.409305 master-2 kubenswrapper[4762]: I1014 13:06:24.409231 4762 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Oct 14 13:06:24.429507 master-2 kubenswrapper[4762]: E1014 13:06:24.429377 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:24.530027 master-2 kubenswrapper[4762]: E1014 13:06:24.529896 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:24.630477 master-2 kubenswrapper[4762]: E1014 13:06:24.630244 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:24.731117 master-2 kubenswrapper[4762]: E1014 13:06:24.731048 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:24.832251 master-2 kubenswrapper[4762]: E1014 13:06:24.832149 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:24.933235 master-2 kubenswrapper[4762]: E1014 13:06:24.933117 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:25.034192 master-2 kubenswrapper[4762]: E1014 13:06:25.034090 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:25.135392 master-2 kubenswrapper[4762]: E1014 13:06:25.135264 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:25.237372 master-2 kubenswrapper[4762]: E1014 13:06:25.237121 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:25.337982 master-2 kubenswrapper[4762]: E1014 13:06:25.337869 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:25.439208 master-2 kubenswrapper[4762]: E1014 13:06:25.439051 4762 kubelet_node_status.go:503] "Error getting the current node from 
lister" err="node \"master-2\" not found" Oct 14 13:06:25.539982 master-2 kubenswrapper[4762]: E1014 13:06:25.539817 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:25.640821 master-2 kubenswrapper[4762]: E1014 13:06:25.640694 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:25.741190 master-2 kubenswrapper[4762]: E1014 13:06:25.741019 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:25.841962 master-2 kubenswrapper[4762]: E1014 13:06:25.841779 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:25.942596 master-2 kubenswrapper[4762]: E1014 13:06:25.942250 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:26.043535 master-2 kubenswrapper[4762]: E1014 13:06:26.043430 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:26.144535 master-2 kubenswrapper[4762]: E1014 13:06:26.144383 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:26.244933 master-2 kubenswrapper[4762]: E1014 13:06:26.244839 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:26.345903 master-2 kubenswrapper[4762]: E1014 13:06:26.345774 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:26.358573 master-2 kubenswrapper[4762]: I1014 13:06:26.358404 4762 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Oct 14 13:06:26.446511 master-2 kubenswrapper[4762]: E1014 13:06:26.446429 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:26.547550 master-2 kubenswrapper[4762]: E1014 13:06:26.547425 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:26.648424 master-2 kubenswrapper[4762]: E1014 13:06:26.648345 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:26.749377 master-2 kubenswrapper[4762]: E1014 13:06:26.749206 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:26.849493 master-2 kubenswrapper[4762]: E1014 13:06:26.849395 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:26.871646 master-2 kubenswrapper[4762]: I1014 13:06:26.871551 4762 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Oct 14 13:06:26.949922 master-2 kubenswrapper[4762]: E1014 13:06:26.949779 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:27.050732 master-2 kubenswrapper[4762]: E1014 13:06:27.050545 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:27.151882 master-2 kubenswrapper[4762]: E1014 13:06:27.151753 4762 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"master-2\" not found" Oct 14 13:06:27.252803 master-2 kubenswrapper[4762]: E1014 13:06:27.252686 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:27.353034 master-2 kubenswrapper[4762]: E1014 13:06:27.352850 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:27.454764 master-2 kubenswrapper[4762]: E1014 13:06:27.454405 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:27.555740 master-2 kubenswrapper[4762]: E1014 13:06:27.555398 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:27.656846 master-2 kubenswrapper[4762]: E1014 13:06:27.656639 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:27.757208 master-2 kubenswrapper[4762]: E1014 13:06:27.757056 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:27.857791 master-2 kubenswrapper[4762]: E1014 13:06:27.857588 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:27.958957 master-2 kubenswrapper[4762]: E1014 13:06:27.958828 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:28.059731 master-2 kubenswrapper[4762]: E1014 13:06:28.059599 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:28.160781 master-2 kubenswrapper[4762]: E1014 13:06:28.160641 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:28.261334 master-2 kubenswrapper[4762]: E1014 13:06:28.261047 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:28.362313 master-2 kubenswrapper[4762]: E1014 13:06:28.362125 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:28.463131 master-2 kubenswrapper[4762]: E1014 13:06:28.462970 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:28.563623 master-2 kubenswrapper[4762]: E1014 13:06:28.563433 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:28.664615 master-2 kubenswrapper[4762]: E1014 13:06:28.664494 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:28.765195 master-2 kubenswrapper[4762]: E1014 13:06:28.765049 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:28.866251 master-2 kubenswrapper[4762]: E1014 13:06:28.865989 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:28.967322 master-2 kubenswrapper[4762]: E1014 13:06:28.967205 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:29.068193 master-2 kubenswrapper[4762]: E1014 13:06:29.068034 4762 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"master-2\" not found" Oct 14 13:06:29.169011 master-2 kubenswrapper[4762]: E1014 13:06:29.168870 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:29.269443 master-2 kubenswrapper[4762]: E1014 13:06:29.269338 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:29.370521 master-2 kubenswrapper[4762]: E1014 13:06:29.370424 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:29.471283 master-2 kubenswrapper[4762]: E1014 13:06:29.471051 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:29.571313 master-2 kubenswrapper[4762]: E1014 13:06:29.571236 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:29.672221 master-2 kubenswrapper[4762]: E1014 13:06:29.672107 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:29.773303 master-2 kubenswrapper[4762]: E1014 13:06:29.773114 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:29.874223 master-2 kubenswrapper[4762]: E1014 13:06:29.874086 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:29.974911 master-2 kubenswrapper[4762]: E1014 13:06:29.974769 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:30.075835 master-2 kubenswrapper[4762]: E1014 13:06:30.075661 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:30.176889 master-2 kubenswrapper[4762]: E1014 13:06:30.176767 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:30.277311 master-2 kubenswrapper[4762]: E1014 13:06:30.277144 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:30.377508 master-2 kubenswrapper[4762]: E1014 13:06:30.377306 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:30.477790 master-2 kubenswrapper[4762]: E1014 13:06:30.477633 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:30.548350 master-2 kubenswrapper[4762]: I1014 13:06:30.548240 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 14 13:06:30.550191 master-2 kubenswrapper[4762]: I1014 13:06:30.550044 4762 kubelet_node_status.go:724] "Recording event message for node" node="master-2" event="NodeHasSufficientMemory" Oct 14 13:06:30.550191 master-2 kubenswrapper[4762]: I1014 13:06:30.550128 4762 kubelet_node_status.go:724] "Recording event message for node" node="master-2" event="NodeHasNoDiskPressure" Oct 14 13:06:30.550191 master-2 kubenswrapper[4762]: I1014 13:06:30.550176 4762 kubelet_node_status.go:724] "Recording event message for node" node="master-2" event="NodeHasSufficientPID" Oct 14 13:06:30.551074 master-2 kubenswrapper[4762]: I1014 13:06:30.551010 4762 scope.go:117] "RemoveContainer" 
containerID="d57e73517a8640d4c26b28b54a4cf8ecf415a672a1b2341868ce34d0d2510062" Oct 14 13:06:30.578614 master-2 kubenswrapper[4762]: E1014 13:06:30.578494 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:30.679802 master-2 kubenswrapper[4762]: E1014 13:06:30.679716 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:30.780185 master-2 kubenswrapper[4762]: E1014 13:06:30.780065 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:30.880348 master-2 kubenswrapper[4762]: E1014 13:06:30.880273 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:30.981358 master-2 kubenswrapper[4762]: E1014 13:06:30.981119 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:31.082015 master-2 kubenswrapper[4762]: E1014 13:06:31.081857 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:31.182879 master-2 kubenswrapper[4762]: E1014 13:06:31.182759 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:31.283458 master-2 kubenswrapper[4762]: E1014 13:06:31.283268 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:31.384321 master-2 kubenswrapper[4762]: E1014 13:06:31.384212 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:31.485354 master-2 kubenswrapper[4762]: E1014 13:06:31.485258 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:31.522893 master-2 kubenswrapper[4762]: E1014 13:06:31.522803 4762 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-2\" not found" Oct 14 13:06:31.586391 master-2 kubenswrapper[4762]: E1014 13:06:31.586201 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:31.611646 master-2 kubenswrapper[4762]: I1014 13:06:31.611551 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-2_f022eff2d978fee6b366ac18a80aa53c/kube-rbac-proxy-crio/2.log" Oct 14 13:06:31.612336 master-2 kubenswrapper[4762]: I1014 13:06:31.612287 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-2_f022eff2d978fee6b366ac18a80aa53c/kube-rbac-proxy-crio/1.log" Oct 14 13:06:31.612949 master-2 kubenswrapper[4762]: I1014 13:06:31.612878 4762 generic.go:334] "Generic (PLEG): container finished" podID="f022eff2d978fee6b366ac18a80aa53c" containerID="367f78b1ba05a1a5cac5a0f737b5e254b59c152b3ac8ca5d5714f3d50abacd10" exitCode=1 Oct 14 13:06:31.613040 master-2 kubenswrapper[4762]: I1014 13:06:31.612944 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-2" event={"ID":"f022eff2d978fee6b366ac18a80aa53c","Type":"ContainerDied","Data":"367f78b1ba05a1a5cac5a0f737b5e254b59c152b3ac8ca5d5714f3d50abacd10"} Oct 14 13:06:31.613040 master-2 kubenswrapper[4762]: I1014 
13:06:31.613018 4762 scope.go:117] "RemoveContainer" containerID="d57e73517a8640d4c26b28b54a4cf8ecf415a672a1b2341868ce34d0d2510062" Oct 14 13:06:31.613193 master-2 kubenswrapper[4762]: I1014 13:06:31.613121 4762 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 14 13:06:31.614289 master-2 kubenswrapper[4762]: I1014 13:06:31.614241 4762 kubelet_node_status.go:724] "Recording event message for node" node="master-2" event="NodeHasSufficientMemory" Oct 14 13:06:31.614289 master-2 kubenswrapper[4762]: I1014 13:06:31.614287 4762 kubelet_node_status.go:724] "Recording event message for node" node="master-2" event="NodeHasNoDiskPressure" Oct 14 13:06:31.614450 master-2 kubenswrapper[4762]: I1014 13:06:31.614311 4762 kubelet_node_status.go:724] "Recording event message for node" node="master-2" event="NodeHasSufficientPID" Oct 14 13:06:31.634299 master-2 kubenswrapper[4762]: I1014 13:06:31.634201 4762 scope.go:117] "RemoveContainer" containerID="367f78b1ba05a1a5cac5a0f737b5e254b59c152b3ac8ca5d5714f3d50abacd10" Oct 14 13:06:31.634614 master-2 kubenswrapper[4762]: E1014 13:06:31.634487 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-2_openshift-machine-config-operator(f022eff2d978fee6b366ac18a80aa53c)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-2" podUID="f022eff2d978fee6b366ac18a80aa53c" Oct 14 13:06:31.686668 master-2 kubenswrapper[4762]: E1014 13:06:31.686572 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:31.787052 master-2 kubenswrapper[4762]: E1014 13:06:31.786934 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:31.888340 master-2 kubenswrapper[4762]: E1014 13:06:31.888175 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:31.989347 master-2 kubenswrapper[4762]: E1014 13:06:31.989239 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:32.090185 master-2 kubenswrapper[4762]: E1014 13:06:32.090101 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:32.191208 master-2 kubenswrapper[4762]: E1014 13:06:32.191075 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:32.291858 master-2 kubenswrapper[4762]: E1014 13:06:32.291714 4762 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-2\" not found" Oct 14 13:06:32.382988 master-2 kubenswrapper[4762]: I1014 13:06:32.382871 4762 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Oct 14 13:06:32.397754 master-2 kubenswrapper[4762]: I1014 13:06:32.397662 4762 apiserver.go:52] "Watching apiserver" Oct 14 13:06:32.402177 master-2 kubenswrapper[4762]: I1014 13:06:32.402095 4762 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Oct 14 13:06:32.402501 master-2 kubenswrapper[4762]: I1014 13:06:32.402430 4762 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t","openshift-cluster-version/cluster-version-operator-55bd67947c-872k9"] Oct 14 13:06:32.402962 master-2 kubenswrapper[4762]: I1014 13:06:32.402898 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-55bd67947c-872k9" Oct 14 13:06:32.403087 master-2 kubenswrapper[4762]: I1014 13:06:32.402992 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t" Oct 14 13:06:32.405590 master-2 kubenswrapper[4762]: I1014 13:06:32.405537 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 14 13:06:32.406751 master-2 kubenswrapper[4762]: I1014 13:06:32.406630 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 14 13:06:32.407284 master-2 kubenswrapper[4762]: I1014 13:06:32.407203 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" Oct 14 13:06:32.407419 master-2 kubenswrapper[4762]: I1014 13:06:32.407345 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" Oct 14 13:06:32.408340 master-2 kubenswrapper[4762]: I1014 13:06:32.408286 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" Oct 14 13:06:32.408905 master-2 kubenswrapper[4762]: I1014 13:06:32.408859 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" Oct 14 13:06:32.408996 master-2 kubenswrapper[4762]: I1014 13:06:32.408913 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" Oct 14 13:06:32.408996 master-2 kubenswrapper[4762]: I1014 13:06:32.408922 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 14 13:06:32.489116 master-2 kubenswrapper[4762]: I1014 13:06:32.489013 4762 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Oct 14 13:06:32.518696 master-2 kubenswrapper[4762]: I1014 13:06:32.518600 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/78911cd1-bf7b-4ba1-8993-c10b848879bd-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t\" (UID: \"78911cd1-bf7b-4ba1-8993-c10b848879bd\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t" Oct 14 13:06:32.518696 master-2 kubenswrapper[4762]: I1014 13:06:32.518664 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/78911cd1-bf7b-4ba1-8993-c10b848879bd-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t\" (UID: \"78911cd1-bf7b-4ba1-8993-c10b848879bd\") " 
pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t" Oct 14 13:06:32.518941 master-2 kubenswrapper[4762]: I1014 13:06:32.518704 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d0bf2b14-2719-4b1b-a661-fbf4d27c05dc-service-ca\") pod \"cluster-version-operator-55bd67947c-872k9\" (UID: \"d0bf2b14-2719-4b1b-a661-fbf4d27c05dc\") " pod="openshift-cluster-version/cluster-version-operator-55bd67947c-872k9" Oct 14 13:06:32.518941 master-2 kubenswrapper[4762]: I1014 13:06:32.518734 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0bf2b14-2719-4b1b-a661-fbf4d27c05dc-kube-api-access\") pod \"cluster-version-operator-55bd67947c-872k9\" (UID: \"d0bf2b14-2719-4b1b-a661-fbf4d27c05dc\") " pod="openshift-cluster-version/cluster-version-operator-55bd67947c-872k9" Oct 14 13:06:32.518941 master-2 kubenswrapper[4762]: I1014 13:06:32.518767 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0bf2b14-2719-4b1b-a661-fbf4d27c05dc-serving-cert\") pod \"cluster-version-operator-55bd67947c-872k9\" (UID: \"d0bf2b14-2719-4b1b-a661-fbf4d27c05dc\") " pod="openshift-cluster-version/cluster-version-operator-55bd67947c-872k9" Oct 14 13:06:32.518941 master-2 kubenswrapper[4762]: I1014 13:06:32.518797 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/78911cd1-bf7b-4ba1-8993-c10b848879bd-images\") pod \"cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t\" (UID: \"78911cd1-bf7b-4ba1-8993-c10b848879bd\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t" Oct 14 13:06:32.518941 master-2 kubenswrapper[4762]: I1014 13:06:32.518879 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/78911cd1-bf7b-4ba1-8993-c10b848879bd-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t\" (UID: \"78911cd1-bf7b-4ba1-8993-c10b848879bd\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t" Oct 14 13:06:32.518941 master-2 kubenswrapper[4762]: I1014 13:06:32.518933 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqpx9\" (UniqueName: \"kubernetes.io/projected/78911cd1-bf7b-4ba1-8993-c10b848879bd-kube-api-access-tqpx9\") pod \"cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t\" (UID: \"78911cd1-bf7b-4ba1-8993-c10b848879bd\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t" Oct 14 13:06:32.519335 master-2 kubenswrapper[4762]: I1014 13:06:32.518984 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/d0bf2b14-2719-4b1b-a661-fbf4d27c05dc-etc-ssl-certs\") pod \"cluster-version-operator-55bd67947c-872k9\" (UID: \"d0bf2b14-2719-4b1b-a661-fbf4d27c05dc\") " pod="openshift-cluster-version/cluster-version-operator-55bd67947c-872k9" Oct 14 13:06:32.519335 master-2 kubenswrapper[4762]: I1014 
13:06:32.519020 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/d0bf2b14-2719-4b1b-a661-fbf4d27c05dc-etc-cvo-updatepayloads\") pod \"cluster-version-operator-55bd67947c-872k9\" (UID: \"d0bf2b14-2719-4b1b-a661-fbf4d27c05dc\") " pod="openshift-cluster-version/cluster-version-operator-55bd67947c-872k9" Oct 14 13:06:32.616852 master-2 kubenswrapper[4762]: I1014 13:06:32.616574 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-2_f022eff2d978fee6b366ac18a80aa53c/kube-rbac-proxy-crio/2.log" Oct 14 13:06:32.620006 master-2 kubenswrapper[4762]: I1014 13:06:32.619927 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0bf2b14-2719-4b1b-a661-fbf4d27c05dc-kube-api-access\") pod \"cluster-version-operator-55bd67947c-872k9\" (UID: \"d0bf2b14-2719-4b1b-a661-fbf4d27c05dc\") " pod="openshift-cluster-version/cluster-version-operator-55bd67947c-872k9" Oct 14 13:06:32.620128 master-2 kubenswrapper[4762]: I1014 13:06:32.620030 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/78911cd1-bf7b-4ba1-8993-c10b848879bd-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t\" (UID: \"78911cd1-bf7b-4ba1-8993-c10b848879bd\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t" Oct 14 13:06:32.620128 master-2 kubenswrapper[4762]: I1014 13:06:32.620087 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/78911cd1-bf7b-4ba1-8993-c10b848879bd-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t\" (UID: \"78911cd1-bf7b-4ba1-8993-c10b848879bd\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t" Oct 14 13:06:32.620294 master-2 kubenswrapper[4762]: I1014 13:06:32.620136 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d0bf2b14-2719-4b1b-a661-fbf4d27c05dc-service-ca\") pod \"cluster-version-operator-55bd67947c-872k9\" (UID: \"d0bf2b14-2719-4b1b-a661-fbf4d27c05dc\") " pod="openshift-cluster-version/cluster-version-operator-55bd67947c-872k9" Oct 14 13:06:32.620294 master-2 kubenswrapper[4762]: I1014 13:06:32.620210 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/d0bf2b14-2719-4b1b-a661-fbf4d27c05dc-etc-cvo-updatepayloads\") pod \"cluster-version-operator-55bd67947c-872k9\" (UID: \"d0bf2b14-2719-4b1b-a661-fbf4d27c05dc\") " pod="openshift-cluster-version/cluster-version-operator-55bd67947c-872k9" Oct 14 13:06:32.620294 master-2 kubenswrapper[4762]: I1014 13:06:32.620257 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0bf2b14-2719-4b1b-a661-fbf4d27c05dc-serving-cert\") pod \"cluster-version-operator-55bd67947c-872k9\" (UID: \"d0bf2b14-2719-4b1b-a661-fbf4d27c05dc\") " pod="openshift-cluster-version/cluster-version-operator-55bd67947c-872k9" Oct 14 13:06:32.620460 master-2 kubenswrapper[4762]: I1014 
13:06:32.620299 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/78911cd1-bf7b-4ba1-8993-c10b848879bd-images\") pod \"cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t\" (UID: \"78911cd1-bf7b-4ba1-8993-c10b848879bd\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t" Oct 14 13:06:32.620460 master-2 kubenswrapper[4762]: I1014 13:06:32.620315 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/78911cd1-bf7b-4ba1-8993-c10b848879bd-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t\" (UID: \"78911cd1-bf7b-4ba1-8993-c10b848879bd\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t" Oct 14 13:06:32.620460 master-2 kubenswrapper[4762]: I1014 13:06:32.620332 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/78911cd1-bf7b-4ba1-8993-c10b848879bd-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t\" (UID: \"78911cd1-bf7b-4ba1-8993-c10b848879bd\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t" Oct 14 13:06:32.620460 master-2 kubenswrapper[4762]: I1014 13:06:32.620424 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqpx9\" (UniqueName: \"kubernetes.io/projected/78911cd1-bf7b-4ba1-8993-c10b848879bd-kube-api-access-tqpx9\") pod \"cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t\" (UID: \"78911cd1-bf7b-4ba1-8993-c10b848879bd\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t" Oct 14 13:06:32.620671 master-2 kubenswrapper[4762]: I1014 13:06:32.620465 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/d0bf2b14-2719-4b1b-a661-fbf4d27c05dc-etc-ssl-certs\") pod \"cluster-version-operator-55bd67947c-872k9\" (UID: \"d0bf2b14-2719-4b1b-a661-fbf4d27c05dc\") " pod="openshift-cluster-version/cluster-version-operator-55bd67947c-872k9" Oct 14 13:06:32.620671 master-2 kubenswrapper[4762]: I1014 13:06:32.620531 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/d0bf2b14-2719-4b1b-a661-fbf4d27c05dc-etc-ssl-certs\") pod \"cluster-version-operator-55bd67947c-872k9\" (UID: \"d0bf2b14-2719-4b1b-a661-fbf4d27c05dc\") " pod="openshift-cluster-version/cluster-version-operator-55bd67947c-872k9" Oct 14 13:06:32.620671 master-2 kubenswrapper[4762]: I1014 13:06:32.620580 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/d0bf2b14-2719-4b1b-a661-fbf4d27c05dc-etc-cvo-updatepayloads\") pod \"cluster-version-operator-55bd67947c-872k9\" (UID: \"d0bf2b14-2719-4b1b-a661-fbf4d27c05dc\") " pod="openshift-cluster-version/cluster-version-operator-55bd67947c-872k9" Oct 14 13:06:32.621116 master-2 kubenswrapper[4762]: E1014 13:06:32.621062 4762 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Oct 14 13:06:32.621481 master-2 kubenswrapper[4762]: I1014 13:06:32.621412 
4762 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 14 13:06:32.621619 master-2 kubenswrapper[4762]: E1014 13:06:32.621460 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0bf2b14-2719-4b1b-a661-fbf4d27c05dc-serving-cert podName:d0bf2b14-2719-4b1b-a661-fbf4d27c05dc nodeName:}" failed. No retries permitted until 2025-10-14 13:06:33.121416196 +0000 UTC m=+22.365575385 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/d0bf2b14-2719-4b1b-a661-fbf4d27c05dc-serving-cert") pod "cluster-version-operator-55bd67947c-872k9" (UID: "d0bf2b14-2719-4b1b-a661-fbf4d27c05dc") : secret "cluster-version-operator-serving-cert" not found Oct 14 13:06:32.621707 master-2 kubenswrapper[4762]: I1014 13:06:32.621662 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/78911cd1-bf7b-4ba1-8993-c10b848879bd-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t\" (UID: \"78911cd1-bf7b-4ba1-8993-c10b848879bd\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t" Oct 14 13:06:32.622361 master-2 kubenswrapper[4762]: I1014 13:06:32.622311 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d0bf2b14-2719-4b1b-a661-fbf4d27c05dc-service-ca\") pod \"cluster-version-operator-55bd67947c-872k9\" (UID: \"d0bf2b14-2719-4b1b-a661-fbf4d27c05dc\") " pod="openshift-cluster-version/cluster-version-operator-55bd67947c-872k9" Oct 14 13:06:32.622441 master-2 kubenswrapper[4762]: I1014 13:06:32.622378 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/78911cd1-bf7b-4ba1-8993-c10b848879bd-images\") pod \"cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t\" (UID: \"78911cd1-bf7b-4ba1-8993-c10b848879bd\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t" Oct 14 13:06:32.629966 master-2 kubenswrapper[4762]: I1014 13:06:32.629570 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/78911cd1-bf7b-4ba1-8993-c10b848879bd-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t\" (UID: \"78911cd1-bf7b-4ba1-8993-c10b848879bd\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t" Oct 14 13:06:32.652143 master-2 kubenswrapper[4762]: I1014 13:06:32.652023 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0bf2b14-2719-4b1b-a661-fbf4d27c05dc-kube-api-access\") pod \"cluster-version-operator-55bd67947c-872k9\" (UID: \"d0bf2b14-2719-4b1b-a661-fbf4d27c05dc\") " pod="openshift-cluster-version/cluster-version-operator-55bd67947c-872k9" Oct 14 13:06:32.653337 master-2 kubenswrapper[4762]: I1014 13:06:32.653282 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqpx9\" (UniqueName: \"kubernetes.io/projected/78911cd1-bf7b-4ba1-8993-c10b848879bd-kube-api-access-tqpx9\") pod 
\"cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t\" (UID: \"78911cd1-bf7b-4ba1-8993-c10b848879bd\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t" Oct 14 13:06:32.751975 master-2 kubenswrapper[4762]: I1014 13:06:32.751812 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t" Oct 14 13:06:32.764607 master-2 kubenswrapper[4762]: W1014 13:06:32.764550 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78911cd1_bf7b_4ba1_8993_c10b848879bd.slice/crio-5afe5cd17b909e9902ab70317bf8dae6956def9187352e41e5391349ba88e715 WatchSource:0}: Error finding container 5afe5cd17b909e9902ab70317bf8dae6956def9187352e41e5391349ba88e715: Status 404 returned error can't find the container with id 5afe5cd17b909e9902ab70317bf8dae6956def9187352e41e5391349ba88e715 Oct 14 13:06:33.123964 master-2 kubenswrapper[4762]: I1014 13:06:33.123826 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0bf2b14-2719-4b1b-a661-fbf4d27c05dc-serving-cert\") pod \"cluster-version-operator-55bd67947c-872k9\" (UID: \"d0bf2b14-2719-4b1b-a661-fbf4d27c05dc\") " pod="openshift-cluster-version/cluster-version-operator-55bd67947c-872k9" Oct 14 13:06:33.123964 master-2 kubenswrapper[4762]: E1014 13:06:33.123947 4762 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Oct 14 13:06:33.124147 master-2 kubenswrapper[4762]: E1014 13:06:33.124021 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0bf2b14-2719-4b1b-a661-fbf4d27c05dc-serving-cert podName:d0bf2b14-2719-4b1b-a661-fbf4d27c05dc nodeName:}" failed. No retries permitted until 2025-10-14 13:06:34.124002438 +0000 UTC m=+23.368161597 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/d0bf2b14-2719-4b1b-a661-fbf4d27c05dc-serving-cert") pod "cluster-version-operator-55bd67947c-872k9" (UID: "d0bf2b14-2719-4b1b-a661-fbf4d27c05dc") : secret "cluster-version-operator-serving-cert" not found Oct 14 13:06:33.621332 master-2 kubenswrapper[4762]: I1014 13:06:33.621260 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t" event={"ID":"78911cd1-bf7b-4ba1-8993-c10b848879bd","Type":"ContainerStarted","Data":"5afe5cd17b909e9902ab70317bf8dae6956def9187352e41e5391349ba88e715"} Oct 14 13:06:34.130443 master-2 kubenswrapper[4762]: I1014 13:06:34.130300 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0bf2b14-2719-4b1b-a661-fbf4d27c05dc-serving-cert\") pod \"cluster-version-operator-55bd67947c-872k9\" (UID: \"d0bf2b14-2719-4b1b-a661-fbf4d27c05dc\") " pod="openshift-cluster-version/cluster-version-operator-55bd67947c-872k9" Oct 14 13:06:34.131516 master-2 kubenswrapper[4762]: E1014 13:06:34.130580 4762 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Oct 14 13:06:34.131516 master-2 kubenswrapper[4762]: E1014 13:06:34.130776 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0bf2b14-2719-4b1b-a661-fbf4d27c05dc-serving-cert podName:d0bf2b14-2719-4b1b-a661-fbf4d27c05dc nodeName:}" failed. No retries permitted until 2025-10-14 13:06:36.130733692 +0000 UTC m=+25.374892901 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/d0bf2b14-2719-4b1b-a661-fbf4d27c05dc-serving-cert") pod "cluster-version-operator-55bd67947c-872k9" (UID: "d0bf2b14-2719-4b1b-a661-fbf4d27c05dc") : secret "cluster-version-operator-serving-cert" not found Oct 14 13:06:36.143946 master-2 kubenswrapper[4762]: I1014 13:06:36.143873 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0bf2b14-2719-4b1b-a661-fbf4d27c05dc-serving-cert\") pod \"cluster-version-operator-55bd67947c-872k9\" (UID: \"d0bf2b14-2719-4b1b-a661-fbf4d27c05dc\") " pod="openshift-cluster-version/cluster-version-operator-55bd67947c-872k9" Oct 14 13:06:36.145003 master-2 kubenswrapper[4762]: E1014 13:06:36.144035 4762 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Oct 14 13:06:36.145003 master-2 kubenswrapper[4762]: E1014 13:06:36.144105 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0bf2b14-2719-4b1b-a661-fbf4d27c05dc-serving-cert podName:d0bf2b14-2719-4b1b-a661-fbf4d27c05dc nodeName:}" failed. No retries permitted until 2025-10-14 13:06:40.144082656 +0000 UTC m=+29.388241845 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/d0bf2b14-2719-4b1b-a661-fbf4d27c05dc-serving-cert") pod "cluster-version-operator-55bd67947c-872k9" (UID: "d0bf2b14-2719-4b1b-a661-fbf4d27c05dc") : secret "cluster-version-operator-serving-cert" not found Oct 14 13:06:36.630945 master-2 kubenswrapper[4762]: I1014 13:06:36.630836 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t_78911cd1-bf7b-4ba1-8993-c10b848879bd/kube-rbac-proxy/0.log" Oct 14 13:06:36.632413 master-2 kubenswrapper[4762]: I1014 13:06:36.632356 4762 generic.go:334] "Generic (PLEG): container finished" podID="78911cd1-bf7b-4ba1-8993-c10b848879bd" containerID="f762eb4a000e4d70830767109a178b8ed9537ed63f62a170c2c9884479fa272d" exitCode=1 Oct 14 13:06:36.632538 master-2 kubenswrapper[4762]: I1014 13:06:36.632412 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t" event={"ID":"78911cd1-bf7b-4ba1-8993-c10b848879bd","Type":"ContainerDied","Data":"f762eb4a000e4d70830767109a178b8ed9537ed63f62a170c2c9884479fa272d"} Oct 14 13:06:36.632538 master-2 kubenswrapper[4762]: I1014 13:06:36.632450 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t" event={"ID":"78911cd1-bf7b-4ba1-8993-c10b848879bd","Type":"ContainerStarted","Data":"898d757cca76af9c295c7eb6919872a57aabd75e8f633ebe984c49b09116c269"} Oct 14 13:06:36.632538 master-2 kubenswrapper[4762]: I1014 13:06:36.632470 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t" event={"ID":"78911cd1-bf7b-4ba1-8993-c10b848879bd","Type":"ContainerStarted","Data":"a94edc88f3c88aad12211037bdd42252efc93cbaacc71bfb874d7094a904dfcd"} Oct 14 13:06:36.633067 master-2 kubenswrapper[4762]: I1014 13:06:36.633017 4762 scope.go:117] "RemoveContainer" containerID="f762eb4a000e4d70830767109a178b8ed9537ed63f62a170c2c9884479fa272d" Oct 14 13:06:37.636972 master-2 kubenswrapper[4762]: I1014 13:06:37.636882 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t_78911cd1-bf7b-4ba1-8993-c10b848879bd/kube-rbac-proxy/1.log" Oct 14 13:06:37.637963 master-2 kubenswrapper[4762]: I1014 13:06:37.637912 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t_78911cd1-bf7b-4ba1-8993-c10b848879bd/kube-rbac-proxy/0.log" Oct 14 13:06:37.639232 master-2 kubenswrapper[4762]: I1014 13:06:37.639185 4762 generic.go:334] "Generic (PLEG): container finished" podID="78911cd1-bf7b-4ba1-8993-c10b848879bd" containerID="d31813a0fbbcc8471b53d19b8eb9b565103feec91d7bad7f0a63c3efcf28b16e" exitCode=1 Oct 14 13:06:37.639326 master-2 kubenswrapper[4762]: I1014 13:06:37.639240 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t" event={"ID":"78911cd1-bf7b-4ba1-8993-c10b848879bd","Type":"ContainerDied","Data":"d31813a0fbbcc8471b53d19b8eb9b565103feec91d7bad7f0a63c3efcf28b16e"} Oct 14 13:06:37.639326 master-2 kubenswrapper[4762]: I1014 
13:06:37.639298 4762 scope.go:117] "RemoveContainer" containerID="f762eb4a000e4d70830767109a178b8ed9537ed63f62a170c2c9884479fa272d" Oct 14 13:06:37.639982 master-2 kubenswrapper[4762]: I1014 13:06:37.639927 4762 scope.go:117] "RemoveContainer" containerID="d31813a0fbbcc8471b53d19b8eb9b565103feec91d7bad7f0a63c3efcf28b16e" Oct 14 13:06:37.640381 master-2 kubenswrapper[4762]: E1014 13:06:37.640230 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-rbac-proxy pod=cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t_openshift-cloud-controller-manager-operator(78911cd1-bf7b-4ba1-8993-c10b848879bd)\"" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t" podUID="78911cd1-bf7b-4ba1-8993-c10b848879bd" Oct 14 13:06:38.643471 master-2 kubenswrapper[4762]: I1014 13:06:38.643374 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t_78911cd1-bf7b-4ba1-8993-c10b848879bd/kube-rbac-proxy/1.log" Oct 14 13:06:38.644865 master-2 kubenswrapper[4762]: I1014 13:06:38.644807 4762 scope.go:117] "RemoveContainer" containerID="d31813a0fbbcc8471b53d19b8eb9b565103feec91d7bad7f0a63c3efcf28b16e" Oct 14 13:06:38.645070 master-2 kubenswrapper[4762]: E1014 13:06:38.645019 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-rbac-proxy pod=cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t_openshift-cloud-controller-manager-operator(78911cd1-bf7b-4ba1-8993-c10b848879bd)\"" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t" podUID="78911cd1-bf7b-4ba1-8993-c10b848879bd" Oct 14 13:06:40.171936 master-2 kubenswrapper[4762]: I1014 13:06:40.171781 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0bf2b14-2719-4b1b-a661-fbf4d27c05dc-serving-cert\") pod \"cluster-version-operator-55bd67947c-872k9\" (UID: \"d0bf2b14-2719-4b1b-a661-fbf4d27c05dc\") " pod="openshift-cluster-version/cluster-version-operator-55bd67947c-872k9" Oct 14 13:06:40.172695 master-2 kubenswrapper[4762]: E1014 13:06:40.172026 4762 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Oct 14 13:06:40.172695 master-2 kubenswrapper[4762]: E1014 13:06:40.172124 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0bf2b14-2719-4b1b-a661-fbf4d27c05dc-serving-cert podName:d0bf2b14-2719-4b1b-a661-fbf4d27c05dc nodeName:}" failed. No retries permitted until 2025-10-14 13:06:48.172098915 +0000 UTC m=+37.416258114 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/d0bf2b14-2719-4b1b-a661-fbf4d27c05dc-serving-cert") pod "cluster-version-operator-55bd67947c-872k9" (UID: "d0bf2b14-2719-4b1b-a661-fbf4d27c05dc") : secret "cluster-version-operator-serving-cert" not found Oct 14 13:06:40.346116 master-2 kubenswrapper[4762]: I1014 13:06:40.346003 4762 csr.go:261] certificate signing request csr-jtsv2 is approved, waiting to be issued Oct 14 13:06:40.354595 master-2 kubenswrapper[4762]: I1014 13:06:40.354536 4762 csr.go:257] certificate signing request csr-jtsv2 is issued Oct 14 13:06:41.355953 master-2 kubenswrapper[4762]: I1014 13:06:41.355852 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2025-10-15 13:01:17 +0000 UTC, rotation deadline is 2025-10-15 07:30:34.268117252 +0000 UTC Oct 14 13:06:41.355953 master-2 kubenswrapper[4762]: I1014 13:06:41.355901 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 18h23m52.912220691s for next certificate rotation Oct 14 13:06:42.357224 master-2 kubenswrapper[4762]: I1014 13:06:42.357030 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2025-10-15 13:01:17 +0000 UTC, rotation deadline is 2025-10-15 06:31:06.202996655 +0000 UTC Oct 14 13:06:42.357224 master-2 kubenswrapper[4762]: I1014 13:06:42.357125 4762 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 17h24m23.845877013s for next certificate rotation Oct 14 13:06:43.557074 master-2 kubenswrapper[4762]: I1014 13:06:43.556974 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-master-2"] Oct 14 13:06:43.558123 master-2 kubenswrapper[4762]: I1014 13:06:43.557590 4762 scope.go:117] "RemoveContainer" containerID="367f78b1ba05a1a5cac5a0f737b5e254b59c152b3ac8ca5d5714f3d50abacd10" Oct 14 13:06:43.558123 master-2 kubenswrapper[4762]: E1014 13:06:43.557848 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-2_openshift-machine-config-operator(f022eff2d978fee6b366ac18a80aa53c)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-2" podUID="f022eff2d978fee6b366ac18a80aa53c" Oct 14 13:06:43.655456 master-2 kubenswrapper[4762]: I1014 13:06:43.655406 4762 scope.go:117] "RemoveContainer" containerID="367f78b1ba05a1a5cac5a0f737b5e254b59c152b3ac8ca5d5714f3d50abacd10" Oct 14 13:06:43.655687 master-2 kubenswrapper[4762]: E1014 13:06:43.655652 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-2_openshift-machine-config-operator(f022eff2d978fee6b366ac18a80aa53c)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-2" podUID="f022eff2d978fee6b366ac18a80aa53c" Oct 14 13:06:48.229479 master-2 kubenswrapper[4762]: I1014 13:06:48.229366 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0bf2b14-2719-4b1b-a661-fbf4d27c05dc-serving-cert\") pod \"cluster-version-operator-55bd67947c-872k9\" (UID: \"d0bf2b14-2719-4b1b-a661-fbf4d27c05dc\") " pod="openshift-cluster-version/cluster-version-operator-55bd67947c-872k9" Oct 14 13:06:48.230559 master-2 
kubenswrapper[4762]: E1014 13:06:48.229519 4762 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Oct 14 13:06:48.230559 master-2 kubenswrapper[4762]: E1014 13:06:48.229598 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0bf2b14-2719-4b1b-a661-fbf4d27c05dc-serving-cert podName:d0bf2b14-2719-4b1b-a661-fbf4d27c05dc nodeName:}" failed. No retries permitted until 2025-10-14 13:07:04.229575838 +0000 UTC m=+53.473735037 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/d0bf2b14-2719-4b1b-a661-fbf4d27c05dc-serving-cert") pod "cluster-version-operator-55bd67947c-872k9" (UID: "d0bf2b14-2719-4b1b-a661-fbf4d27c05dc") : secret "cluster-version-operator-serving-cert" not found Oct 14 13:06:49.548397 master-2 kubenswrapper[4762]: I1014 13:06:49.548273 4762 scope.go:117] "RemoveContainer" containerID="d31813a0fbbcc8471b53d19b8eb9b565103feec91d7bad7f0a63c3efcf28b16e" Oct 14 13:06:50.672012 master-2 kubenswrapper[4762]: I1014 13:06:50.671944 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t_78911cd1-bf7b-4ba1-8993-c10b848879bd/kube-rbac-proxy/2.log" Oct 14 13:06:50.672899 master-2 kubenswrapper[4762]: I1014 13:06:50.672793 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t_78911cd1-bf7b-4ba1-8993-c10b848879bd/kube-rbac-proxy/1.log" Oct 14 13:06:50.674368 master-2 kubenswrapper[4762]: I1014 13:06:50.674307 4762 generic.go:334] "Generic (PLEG): container finished" podID="78911cd1-bf7b-4ba1-8993-c10b848879bd" containerID="196fae6a8244bb03678e6f2ecff257ccc5804b8418dfd20b28ef66491296f03f" exitCode=1 Oct 14 13:06:50.674368 master-2 kubenswrapper[4762]: I1014 13:06:50.674361 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t" event={"ID":"78911cd1-bf7b-4ba1-8993-c10b848879bd","Type":"ContainerDied","Data":"196fae6a8244bb03678e6f2ecff257ccc5804b8418dfd20b28ef66491296f03f"} Oct 14 13:06:50.674602 master-2 kubenswrapper[4762]: I1014 13:06:50.674412 4762 scope.go:117] "RemoveContainer" containerID="d31813a0fbbcc8471b53d19b8eb9b565103feec91d7bad7f0a63c3efcf28b16e" Oct 14 13:06:50.675253 master-2 kubenswrapper[4762]: I1014 13:06:50.675186 4762 scope.go:117] "RemoveContainer" containerID="196fae6a8244bb03678e6f2ecff257ccc5804b8418dfd20b28ef66491296f03f" Oct 14 13:06:50.675546 master-2 kubenswrapper[4762]: E1014 13:06:50.675487 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-rbac-proxy pod=cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t_openshift-cloud-controller-manager-operator(78911cd1-bf7b-4ba1-8993-c10b848879bd)\"" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t" podUID="78911cd1-bf7b-4ba1-8993-c10b848879bd" Oct 14 13:06:51.679415 master-2 kubenswrapper[4762]: I1014 13:06:51.679318 4762 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t_78911cd1-bf7b-4ba1-8993-c10b848879bd/kube-rbac-proxy/2.log" Oct 14 13:06:53.384830 master-2 kubenswrapper[4762]: I1014 13:06:53.384714 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t"] Oct 14 13:06:53.385984 master-2 kubenswrapper[4762]: I1014 13:06:53.384896 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t" podUID="78911cd1-bf7b-4ba1-8993-c10b848879bd" containerName="cluster-cloud-controller-manager" containerID="cri-o://a94edc88f3c88aad12211037bdd42252efc93cbaacc71bfb874d7094a904dfcd" gracePeriod=30 Oct 14 13:06:53.385984 master-2 kubenswrapper[4762]: I1014 13:06:53.384977 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t" podUID="78911cd1-bf7b-4ba1-8993-c10b848879bd" containerName="config-sync-controllers" containerID="cri-o://898d757cca76af9c295c7eb6919872a57aabd75e8f633ebe984c49b09116c269" gracePeriod=30 Oct 14 13:06:53.519780 master-2 kubenswrapper[4762]: I1014 13:06:53.519744 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t_78911cd1-bf7b-4ba1-8993-c10b848879bd/kube-rbac-proxy/2.log" Oct 14 13:06:53.520756 master-2 kubenswrapper[4762]: I1014 13:06:53.520736 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t" Oct 14 13:06:53.568474 master-2 kubenswrapper[4762]: I1014 13:06:53.568358 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/78911cd1-bf7b-4ba1-8993-c10b848879bd-host-etc-kube\") pod \"78911cd1-bf7b-4ba1-8993-c10b848879bd\" (UID: \"78911cd1-bf7b-4ba1-8993-c10b848879bd\") " Oct 14 13:06:53.568474 master-2 kubenswrapper[4762]: I1014 13:06:53.568456 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/78911cd1-bf7b-4ba1-8993-c10b848879bd-auth-proxy-config\") pod \"78911cd1-bf7b-4ba1-8993-c10b848879bd\" (UID: \"78911cd1-bf7b-4ba1-8993-c10b848879bd\") " Oct 14 13:06:53.568474 master-2 kubenswrapper[4762]: I1014 13:06:53.568469 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/78911cd1-bf7b-4ba1-8993-c10b848879bd-host-etc-kube" (OuterVolumeSpecName: "host-etc-kube") pod "78911cd1-bf7b-4ba1-8993-c10b848879bd" (UID: "78911cd1-bf7b-4ba1-8993-c10b848879bd"). InnerVolumeSpecName "host-etc-kube". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:06:53.568840 master-2 kubenswrapper[4762]: I1014 13:06:53.568576 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqpx9\" (UniqueName: \"kubernetes.io/projected/78911cd1-bf7b-4ba1-8993-c10b848879bd-kube-api-access-tqpx9\") pod \"78911cd1-bf7b-4ba1-8993-c10b848879bd\" (UID: \"78911cd1-bf7b-4ba1-8993-c10b848879bd\") " Oct 14 13:06:53.568840 master-2 kubenswrapper[4762]: I1014 13:06:53.568633 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/78911cd1-bf7b-4ba1-8993-c10b848879bd-cloud-controller-manager-operator-tls\") pod \"78911cd1-bf7b-4ba1-8993-c10b848879bd\" (UID: \"78911cd1-bf7b-4ba1-8993-c10b848879bd\") " Oct 14 13:06:53.568840 master-2 kubenswrapper[4762]: I1014 13:06:53.568685 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/78911cd1-bf7b-4ba1-8993-c10b848879bd-images\") pod \"78911cd1-bf7b-4ba1-8993-c10b848879bd\" (UID: \"78911cd1-bf7b-4ba1-8993-c10b848879bd\") " Oct 14 13:06:53.568840 master-2 kubenswrapper[4762]: I1014 13:06:53.568773 4762 reconciler_common.go:293] "Volume detached for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/78911cd1-bf7b-4ba1-8993-c10b848879bd-host-etc-kube\") on node \"master-2\" DevicePath \"\"" Oct 14 13:06:53.569522 master-2 kubenswrapper[4762]: I1014 13:06:53.569054 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78911cd1-bf7b-4ba1-8993-c10b848879bd-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "78911cd1-bf7b-4ba1-8993-c10b848879bd" (UID: "78911cd1-bf7b-4ba1-8993-c10b848879bd"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:06:53.569610 master-2 kubenswrapper[4762]: I1014 13:06:53.569555 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78911cd1-bf7b-4ba1-8993-c10b848879bd-images" (OuterVolumeSpecName: "images") pod "78911cd1-bf7b-4ba1-8993-c10b848879bd" (UID: "78911cd1-bf7b-4ba1-8993-c10b848879bd"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:06:53.572881 master-2 kubenswrapper[4762]: I1014 13:06:53.572838 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78911cd1-bf7b-4ba1-8993-c10b848879bd-cloud-controller-manager-operator-tls" (OuterVolumeSpecName: "cloud-controller-manager-operator-tls") pod "78911cd1-bf7b-4ba1-8993-c10b848879bd" (UID: "78911cd1-bf7b-4ba1-8993-c10b848879bd"). InnerVolumeSpecName "cloud-controller-manager-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:06:53.573136 master-2 kubenswrapper[4762]: I1014 13:06:53.573119 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78911cd1-bf7b-4ba1-8993-c10b848879bd-kube-api-access-tqpx9" (OuterVolumeSpecName: "kube-api-access-tqpx9") pod "78911cd1-bf7b-4ba1-8993-c10b848879bd" (UID: "78911cd1-bf7b-4ba1-8993-c10b848879bd"). InnerVolumeSpecName "kube-api-access-tqpx9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:06:53.669693 master-2 kubenswrapper[4762]: I1014 13:06:53.669626 4762 reconciler_common.go:293] "Volume detached for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/78911cd1-bf7b-4ba1-8993-c10b848879bd-cloud-controller-manager-operator-tls\") on node \"master-2\" DevicePath \"\"" Oct 14 13:06:53.669693 master-2 kubenswrapper[4762]: I1014 13:06:53.669667 4762 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/78911cd1-bf7b-4ba1-8993-c10b848879bd-images\") on node \"master-2\" DevicePath \"\"" Oct 14 13:06:53.669693 master-2 kubenswrapper[4762]: I1014 13:06:53.669683 4762 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/78911cd1-bf7b-4ba1-8993-c10b848879bd-auth-proxy-config\") on node \"master-2\" DevicePath \"\"" Oct 14 13:06:53.669693 master-2 kubenswrapper[4762]: I1014 13:06:53.669694 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqpx9\" (UniqueName: \"kubernetes.io/projected/78911cd1-bf7b-4ba1-8993-c10b848879bd-kube-api-access-tqpx9\") on node \"master-2\" DevicePath \"\"" Oct 14 13:06:53.686071 master-2 kubenswrapper[4762]: I1014 13:06:53.685603 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t_78911cd1-bf7b-4ba1-8993-c10b848879bd/kube-rbac-proxy/2.log" Oct 14 13:06:53.687608 master-2 kubenswrapper[4762]: I1014 13:06:53.687576 4762 generic.go:334] "Generic (PLEG): container finished" podID="78911cd1-bf7b-4ba1-8993-c10b848879bd" containerID="898d757cca76af9c295c7eb6919872a57aabd75e8f633ebe984c49b09116c269" exitCode=0 Oct 14 13:06:53.687809 master-2 kubenswrapper[4762]: I1014 13:06:53.687775 4762 generic.go:334] "Generic (PLEG): container finished" podID="78911cd1-bf7b-4ba1-8993-c10b848879bd" containerID="a94edc88f3c88aad12211037bdd42252efc93cbaacc71bfb874d7094a904dfcd" exitCode=0 Oct 14 13:06:53.687963 master-2 kubenswrapper[4762]: I1014 13:06:53.687700 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t" Oct 14 13:06:53.688257 master-2 kubenswrapper[4762]: I1014 13:06:53.687650 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t" event={"ID":"78911cd1-bf7b-4ba1-8993-c10b848879bd","Type":"ContainerDied","Data":"898d757cca76af9c295c7eb6919872a57aabd75e8f633ebe984c49b09116c269"} Oct 14 13:06:53.688386 master-2 kubenswrapper[4762]: I1014 13:06:53.688290 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t" event={"ID":"78911cd1-bf7b-4ba1-8993-c10b848879bd","Type":"ContainerDied","Data":"a94edc88f3c88aad12211037bdd42252efc93cbaacc71bfb874d7094a904dfcd"} Oct 14 13:06:53.688386 master-2 kubenswrapper[4762]: I1014 13:06:53.688343 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t" event={"ID":"78911cd1-bf7b-4ba1-8993-c10b848879bd","Type":"ContainerDied","Data":"5afe5cd17b909e9902ab70317bf8dae6956def9187352e41e5391349ba88e715"} Oct 14 13:06:53.688386 master-2 kubenswrapper[4762]: I1014 13:06:53.688380 4762 scope.go:117] "RemoveContainer" containerID="196fae6a8244bb03678e6f2ecff257ccc5804b8418dfd20b28ef66491296f03f" Oct 14 13:06:53.710133 master-2 kubenswrapper[4762]: I1014 13:06:53.709982 4762 scope.go:117] "RemoveContainer" containerID="898d757cca76af9c295c7eb6919872a57aabd75e8f633ebe984c49b09116c269" Oct 14 13:06:53.723329 master-2 kubenswrapper[4762]: I1014 13:06:53.723280 4762 scope.go:117] "RemoveContainer" containerID="a94edc88f3c88aad12211037bdd42252efc93cbaacc71bfb874d7094a904dfcd" Oct 14 13:06:53.723653 master-2 kubenswrapper[4762]: I1014 13:06:53.723623 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t"] Oct 14 13:06:53.725634 master-2 kubenswrapper[4762]: I1014 13:06:53.725605 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6d4bdff5b8-xzw9t"] Oct 14 13:06:53.737201 master-2 kubenswrapper[4762]: I1014 13:06:53.737129 4762 scope.go:117] "RemoveContainer" containerID="196fae6a8244bb03678e6f2ecff257ccc5804b8418dfd20b28ef66491296f03f" Oct 14 13:06:53.738287 master-2 kubenswrapper[4762]: E1014 13:06:53.738248 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"196fae6a8244bb03678e6f2ecff257ccc5804b8418dfd20b28ef66491296f03f\": container with ID starting with 196fae6a8244bb03678e6f2ecff257ccc5804b8418dfd20b28ef66491296f03f not found: ID does not exist" containerID="196fae6a8244bb03678e6f2ecff257ccc5804b8418dfd20b28ef66491296f03f" Oct 14 13:06:53.738409 master-2 kubenswrapper[4762]: I1014 13:06:53.738276 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"196fae6a8244bb03678e6f2ecff257ccc5804b8418dfd20b28ef66491296f03f"} err="failed to get container status \"196fae6a8244bb03678e6f2ecff257ccc5804b8418dfd20b28ef66491296f03f\": rpc error: code = NotFound desc = could not find container \"196fae6a8244bb03678e6f2ecff257ccc5804b8418dfd20b28ef66491296f03f\": container with ID starting with 
196fae6a8244bb03678e6f2ecff257ccc5804b8418dfd20b28ef66491296f03f not found: ID does not exist" Oct 14 13:06:53.738409 master-2 kubenswrapper[4762]: I1014 13:06:53.738314 4762 scope.go:117] "RemoveContainer" containerID="898d757cca76af9c295c7eb6919872a57aabd75e8f633ebe984c49b09116c269" Oct 14 13:06:53.739110 master-2 kubenswrapper[4762]: E1014 13:06:53.739025 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"898d757cca76af9c295c7eb6919872a57aabd75e8f633ebe984c49b09116c269\": container with ID starting with 898d757cca76af9c295c7eb6919872a57aabd75e8f633ebe984c49b09116c269 not found: ID does not exist" containerID="898d757cca76af9c295c7eb6919872a57aabd75e8f633ebe984c49b09116c269" Oct 14 13:06:53.739241 master-2 kubenswrapper[4762]: I1014 13:06:53.739096 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"898d757cca76af9c295c7eb6919872a57aabd75e8f633ebe984c49b09116c269"} err="failed to get container status \"898d757cca76af9c295c7eb6919872a57aabd75e8f633ebe984c49b09116c269\": rpc error: code = NotFound desc = could not find container \"898d757cca76af9c295c7eb6919872a57aabd75e8f633ebe984c49b09116c269\": container with ID starting with 898d757cca76af9c295c7eb6919872a57aabd75e8f633ebe984c49b09116c269 not found: ID does not exist" Oct 14 13:06:53.739241 master-2 kubenswrapper[4762]: I1014 13:06:53.739144 4762 scope.go:117] "RemoveContainer" containerID="a94edc88f3c88aad12211037bdd42252efc93cbaacc71bfb874d7094a904dfcd" Oct 14 13:06:53.739912 master-2 kubenswrapper[4762]: E1014 13:06:53.739837 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a94edc88f3c88aad12211037bdd42252efc93cbaacc71bfb874d7094a904dfcd\": container with ID starting with a94edc88f3c88aad12211037bdd42252efc93cbaacc71bfb874d7094a904dfcd not found: ID does not exist" containerID="a94edc88f3c88aad12211037bdd42252efc93cbaacc71bfb874d7094a904dfcd" Oct 14 13:06:53.740005 master-2 kubenswrapper[4762]: I1014 13:06:53.739903 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a94edc88f3c88aad12211037bdd42252efc93cbaacc71bfb874d7094a904dfcd"} err="failed to get container status \"a94edc88f3c88aad12211037bdd42252efc93cbaacc71bfb874d7094a904dfcd\": rpc error: code = NotFound desc = could not find container \"a94edc88f3c88aad12211037bdd42252efc93cbaacc71bfb874d7094a904dfcd\": container with ID starting with a94edc88f3c88aad12211037bdd42252efc93cbaacc71bfb874d7094a904dfcd not found: ID does not exist" Oct 14 13:06:53.740005 master-2 kubenswrapper[4762]: I1014 13:06:53.739941 4762 scope.go:117] "RemoveContainer" containerID="196fae6a8244bb03678e6f2ecff257ccc5804b8418dfd20b28ef66491296f03f" Oct 14 13:06:53.740581 master-2 kubenswrapper[4762]: I1014 13:06:53.740516 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"196fae6a8244bb03678e6f2ecff257ccc5804b8418dfd20b28ef66491296f03f"} err="failed to get container status \"196fae6a8244bb03678e6f2ecff257ccc5804b8418dfd20b28ef66491296f03f\": rpc error: code = NotFound desc = could not find container \"196fae6a8244bb03678e6f2ecff257ccc5804b8418dfd20b28ef66491296f03f\": container with ID starting with 196fae6a8244bb03678e6f2ecff257ccc5804b8418dfd20b28ef66491296f03f not found: ID does not exist" Oct 14 13:06:53.740581 master-2 kubenswrapper[4762]: I1014 13:06:53.740564 4762 scope.go:117] "RemoveContainer" 
containerID="898d757cca76af9c295c7eb6919872a57aabd75e8f633ebe984c49b09116c269" Oct 14 13:06:53.741074 master-2 kubenswrapper[4762]: I1014 13:06:53.741005 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"898d757cca76af9c295c7eb6919872a57aabd75e8f633ebe984c49b09116c269"} err="failed to get container status \"898d757cca76af9c295c7eb6919872a57aabd75e8f633ebe984c49b09116c269\": rpc error: code = NotFound desc = could not find container \"898d757cca76af9c295c7eb6919872a57aabd75e8f633ebe984c49b09116c269\": container with ID starting with 898d757cca76af9c295c7eb6919872a57aabd75e8f633ebe984c49b09116c269 not found: ID does not exist" Oct 14 13:06:53.741074 master-2 kubenswrapper[4762]: I1014 13:06:53.741051 4762 scope.go:117] "RemoveContainer" containerID="a94edc88f3c88aad12211037bdd42252efc93cbaacc71bfb874d7094a904dfcd" Oct 14 13:06:53.741648 master-2 kubenswrapper[4762]: I1014 13:06:53.741585 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a94edc88f3c88aad12211037bdd42252efc93cbaacc71bfb874d7094a904dfcd"} err="failed to get container status \"a94edc88f3c88aad12211037bdd42252efc93cbaacc71bfb874d7094a904dfcd\": rpc error: code = NotFound desc = could not find container \"a94edc88f3c88aad12211037bdd42252efc93cbaacc71bfb874d7094a904dfcd\": container with ID starting with a94edc88f3c88aad12211037bdd42252efc93cbaacc71bfb874d7094a904dfcd not found: ID does not exist" Oct 14 13:06:53.754098 master-2 kubenswrapper[4762]: I1014 13:06:53.754049 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-779749f859-bscv5"] Oct 14 13:06:53.754254 master-2 kubenswrapper[4762]: E1014 13:06:53.754128 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78911cd1-bf7b-4ba1-8993-c10b848879bd" containerName="cluster-cloud-controller-manager" Oct 14 13:06:53.754254 master-2 kubenswrapper[4762]: I1014 13:06:53.754181 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="78911cd1-bf7b-4ba1-8993-c10b848879bd" containerName="cluster-cloud-controller-manager" Oct 14 13:06:53.754254 master-2 kubenswrapper[4762]: E1014 13:06:53.754198 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78911cd1-bf7b-4ba1-8993-c10b848879bd" containerName="kube-rbac-proxy" Oct 14 13:06:53.754254 master-2 kubenswrapper[4762]: I1014 13:06:53.754212 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="78911cd1-bf7b-4ba1-8993-c10b848879bd" containerName="kube-rbac-proxy" Oct 14 13:06:53.754254 master-2 kubenswrapper[4762]: E1014 13:06:53.754225 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78911cd1-bf7b-4ba1-8993-c10b848879bd" containerName="kube-rbac-proxy" Oct 14 13:06:53.754254 master-2 kubenswrapper[4762]: I1014 13:06:53.754237 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="78911cd1-bf7b-4ba1-8993-c10b848879bd" containerName="kube-rbac-proxy" Oct 14 13:06:53.754254 master-2 kubenswrapper[4762]: E1014 13:06:53.754253 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78911cd1-bf7b-4ba1-8993-c10b848879bd" containerName="config-sync-controllers" Oct 14 13:06:53.754610 master-2 kubenswrapper[4762]: I1014 13:06:53.754267 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="78911cd1-bf7b-4ba1-8993-c10b848879bd" containerName="config-sync-controllers" Oct 14 13:06:53.754610 master-2 kubenswrapper[4762]: I1014 13:06:53.754325 4762 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="78911cd1-bf7b-4ba1-8993-c10b848879bd" containerName="cluster-cloud-controller-manager" Oct 14 13:06:53.754610 master-2 kubenswrapper[4762]: I1014 13:06:53.754340 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="78911cd1-bf7b-4ba1-8993-c10b848879bd" containerName="config-sync-controllers" Oct 14 13:06:53.754610 master-2 kubenswrapper[4762]: I1014 13:06:53.754353 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="78911cd1-bf7b-4ba1-8993-c10b848879bd" containerName="kube-rbac-proxy" Oct 14 13:06:53.754610 master-2 kubenswrapper[4762]: I1014 13:06:53.754364 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="78911cd1-bf7b-4ba1-8993-c10b848879bd" containerName="kube-rbac-proxy" Oct 14 13:06:53.754610 master-2 kubenswrapper[4762]: I1014 13:06:53.754376 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="78911cd1-bf7b-4ba1-8993-c10b848879bd" containerName="kube-rbac-proxy" Oct 14 13:06:53.754610 master-2 kubenswrapper[4762]: E1014 13:06:53.754403 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78911cd1-bf7b-4ba1-8993-c10b848879bd" containerName="kube-rbac-proxy" Oct 14 13:06:53.754610 master-2 kubenswrapper[4762]: I1014 13:06:53.754419 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="78911cd1-bf7b-4ba1-8993-c10b848879bd" containerName="kube-rbac-proxy" Oct 14 13:06:53.754999 master-2 kubenswrapper[4762]: I1014 13:06:53.754682 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-779749f859-bscv5" Oct 14 13:06:53.757820 master-2 kubenswrapper[4762]: I1014 13:06:53.757764 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" Oct 14 13:06:53.758324 master-2 kubenswrapper[4762]: I1014 13:06:53.758273 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" Oct 14 13:06:53.759014 master-2 kubenswrapper[4762]: I1014 13:06:53.758966 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" Oct 14 13:06:53.759014 master-2 kubenswrapper[4762]: I1014 13:06:53.758990 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" Oct 14 13:06:53.759400 master-2 kubenswrapper[4762]: I1014 13:06:53.759359 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" Oct 14 13:06:53.871212 master-2 kubenswrapper[4762]: I1014 13:06:53.871111 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/18346e46-a062-4e0d-b90a-c05646a46c7e-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-779749f859-bscv5\" (UID: \"18346e46-a062-4e0d-b90a-c05646a46c7e\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-779749f859-bscv5" Oct 14 13:06:53.871212 master-2 kubenswrapper[4762]: I1014 13:06:53.871211 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/18346e46-a062-4e0d-b90a-c05646a46c7e-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-779749f859-bscv5\" (UID: \"18346e46-a062-4e0d-b90a-c05646a46c7e\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-779749f859-bscv5" Oct 14 13:06:53.871492 master-2 kubenswrapper[4762]: I1014 13:06:53.871238 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzvhm\" (UniqueName: \"kubernetes.io/projected/18346e46-a062-4e0d-b90a-c05646a46c7e-kube-api-access-xzvhm\") pod \"cluster-cloud-controller-manager-operator-779749f859-bscv5\" (UID: \"18346e46-a062-4e0d-b90a-c05646a46c7e\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-779749f859-bscv5" Oct 14 13:06:53.871492 master-2 kubenswrapper[4762]: I1014 13:06:53.871265 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/18346e46-a062-4e0d-b90a-c05646a46c7e-images\") pod \"cluster-cloud-controller-manager-operator-779749f859-bscv5\" (UID: \"18346e46-a062-4e0d-b90a-c05646a46c7e\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-779749f859-bscv5" Oct 14 13:06:53.871492 master-2 kubenswrapper[4762]: I1014 13:06:53.871355 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/18346e46-a062-4e0d-b90a-c05646a46c7e-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-779749f859-bscv5\" (UID: \"18346e46-a062-4e0d-b90a-c05646a46c7e\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-779749f859-bscv5" Oct 14 13:06:53.972740 master-2 kubenswrapper[4762]: I1014 13:06:53.972516 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/18346e46-a062-4e0d-b90a-c05646a46c7e-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-779749f859-bscv5\" (UID: \"18346e46-a062-4e0d-b90a-c05646a46c7e\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-779749f859-bscv5" Oct 14 13:06:53.972740 master-2 kubenswrapper[4762]: I1014 13:06:53.972592 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/18346e46-a062-4e0d-b90a-c05646a46c7e-images\") pod \"cluster-cloud-controller-manager-operator-779749f859-bscv5\" (UID: \"18346e46-a062-4e0d-b90a-c05646a46c7e\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-779749f859-bscv5" Oct 14 13:06:53.972740 master-2 kubenswrapper[4762]: I1014 13:06:53.972636 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/18346e46-a062-4e0d-b90a-c05646a46c7e-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-779749f859-bscv5\" (UID: \"18346e46-a062-4e0d-b90a-c05646a46c7e\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-779749f859-bscv5" Oct 14 13:06:53.972740 master-2 kubenswrapper[4762]: I1014 13:06:53.972668 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/18346e46-a062-4e0d-b90a-c05646a46c7e-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-779749f859-bscv5\" (UID: \"18346e46-a062-4e0d-b90a-c05646a46c7e\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-779749f859-bscv5" Oct 14 13:06:53.972740 master-2 kubenswrapper[4762]: I1014 13:06:53.972673 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/18346e46-a062-4e0d-b90a-c05646a46c7e-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-779749f859-bscv5\" (UID: \"18346e46-a062-4e0d-b90a-c05646a46c7e\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-779749f859-bscv5" Oct 14 13:06:53.973264 master-2 kubenswrapper[4762]: I1014 13:06:53.972761 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzvhm\" (UniqueName: \"kubernetes.io/projected/18346e46-a062-4e0d-b90a-c05646a46c7e-kube-api-access-xzvhm\") pod \"cluster-cloud-controller-manager-operator-779749f859-bscv5\" (UID: \"18346e46-a062-4e0d-b90a-c05646a46c7e\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-779749f859-bscv5" Oct 14 13:06:53.974087 master-2 kubenswrapper[4762]: I1014 13:06:53.974018 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/18346e46-a062-4e0d-b90a-c05646a46c7e-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-779749f859-bscv5\" (UID: \"18346e46-a062-4e0d-b90a-c05646a46c7e\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-779749f859-bscv5" Oct 14 13:06:53.974252 master-2 kubenswrapper[4762]: I1014 13:06:53.974126 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/18346e46-a062-4e0d-b90a-c05646a46c7e-images\") pod \"cluster-cloud-controller-manager-operator-779749f859-bscv5\" (UID: \"18346e46-a062-4e0d-b90a-c05646a46c7e\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-779749f859-bscv5" Oct 14 13:06:53.977853 master-2 kubenswrapper[4762]: I1014 13:06:53.977792 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/18346e46-a062-4e0d-b90a-c05646a46c7e-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-779749f859-bscv5\" (UID: \"18346e46-a062-4e0d-b90a-c05646a46c7e\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-779749f859-bscv5" Oct 14 13:06:53.994491 master-2 kubenswrapper[4762]: I1014 13:06:53.994378 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzvhm\" (UniqueName: \"kubernetes.io/projected/18346e46-a062-4e0d-b90a-c05646a46c7e-kube-api-access-xzvhm\") pod \"cluster-cloud-controller-manager-operator-779749f859-bscv5\" (UID: \"18346e46-a062-4e0d-b90a-c05646a46c7e\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-779749f859-bscv5" Oct 14 13:06:54.089617 master-2 kubenswrapper[4762]: I1014 13:06:54.071585 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-779749f859-bscv5" Oct 14 13:06:54.091049 master-2 kubenswrapper[4762]: W1014 13:06:54.090951 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18346e46_a062_4e0d_b90a_c05646a46c7e.slice/crio-f7ed18e5adf33ce90b6ba81441ada7b71c5785e692f8595005abfeaaff92d8cb WatchSource:0}: Error finding container f7ed18e5adf33ce90b6ba81441ada7b71c5785e692f8595005abfeaaff92d8cb: Status 404 returned error can't find the container with id f7ed18e5adf33ce90b6ba81441ada7b71c5785e692f8595005abfeaaff92d8cb Oct 14 13:06:54.695678 master-2 kubenswrapper[4762]: I1014 13:06:54.695613 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-779749f859-bscv5" event={"ID":"18346e46-a062-4e0d-b90a-c05646a46c7e","Type":"ContainerStarted","Data":"18f5eac455c3f97c664881c24b67138f1d4f342782bcadba7e8667d46225fa69"} Oct 14 13:06:54.695678 master-2 kubenswrapper[4762]: I1014 13:06:54.695689 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-779749f859-bscv5" event={"ID":"18346e46-a062-4e0d-b90a-c05646a46c7e","Type":"ContainerStarted","Data":"740bf5bc7b0a5bb65f33f5bf2768c3ea3455ba88946d432451086f51c7e43364"} Oct 14 13:06:54.695678 master-2 kubenswrapper[4762]: I1014 13:06:54.695716 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-779749f859-bscv5" event={"ID":"18346e46-a062-4e0d-b90a-c05646a46c7e","Type":"ContainerStarted","Data":"f7ed18e5adf33ce90b6ba81441ada7b71c5785e692f8595005abfeaaff92d8cb"} Oct 14 13:06:55.553050 master-2 kubenswrapper[4762]: I1014 13:06:55.552967 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78911cd1-bf7b-4ba1-8993-c10b848879bd" path="/var/lib/kubelet/pods/78911cd1-bf7b-4ba1-8993-c10b848879bd/volumes" Oct 14 13:06:55.700468 master-2 kubenswrapper[4762]: I1014 13:06:55.700399 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-779749f859-bscv5_18346e46-a062-4e0d-b90a-c05646a46c7e/kube-rbac-proxy/0.log" Oct 14 13:06:55.701619 master-2 kubenswrapper[4762]: I1014 13:06:55.701527 4762 generic.go:334] "Generic (PLEG): container finished" podID="18346e46-a062-4e0d-b90a-c05646a46c7e" containerID="b9373d85a50853ef08896d6bec65bf1c3471282faba1fd0ec79fd774ec163542" exitCode=1 Oct 14 13:06:55.701714 master-2 kubenswrapper[4762]: I1014 13:06:55.701622 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-779749f859-bscv5" event={"ID":"18346e46-a062-4e0d-b90a-c05646a46c7e","Type":"ContainerDied","Data":"b9373d85a50853ef08896d6bec65bf1c3471282faba1fd0ec79fd774ec163542"} Oct 14 13:06:55.702202 master-2 kubenswrapper[4762]: I1014 13:06:55.702141 4762 scope.go:117] "RemoveContainer" containerID="b9373d85a50853ef08896d6bec65bf1c3471282faba1fd0ec79fd774ec163542" Oct 14 13:06:56.706380 master-2 kubenswrapper[4762]: I1014 13:06:56.706289 4762 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-779749f859-bscv5_18346e46-a062-4e0d-b90a-c05646a46c7e/kube-rbac-proxy/1.log" Oct 14 13:06:56.707809 master-2 kubenswrapper[4762]: I1014 13:06:56.707739 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-779749f859-bscv5_18346e46-a062-4e0d-b90a-c05646a46c7e/kube-rbac-proxy/0.log" Oct 14 13:06:56.709367 master-2 kubenswrapper[4762]: I1014 13:06:56.709307 4762 generic.go:334] "Generic (PLEG): container finished" podID="18346e46-a062-4e0d-b90a-c05646a46c7e" containerID="410dd88720476dba61fbda2cac848ca1a0e07989b94dd2130e457cff5f9aec98" exitCode=1 Oct 14 13:06:56.709367 master-2 kubenswrapper[4762]: I1014 13:06:56.709360 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-779749f859-bscv5" event={"ID":"18346e46-a062-4e0d-b90a-c05646a46c7e","Type":"ContainerDied","Data":"410dd88720476dba61fbda2cac848ca1a0e07989b94dd2130e457cff5f9aec98"} Oct 14 13:06:56.709555 master-2 kubenswrapper[4762]: I1014 13:06:56.709400 4762 scope.go:117] "RemoveContainer" containerID="b9373d85a50853ef08896d6bec65bf1c3471282faba1fd0ec79fd774ec163542" Oct 14 13:06:56.710362 master-2 kubenswrapper[4762]: I1014 13:06:56.710312 4762 scope.go:117] "RemoveContainer" containerID="410dd88720476dba61fbda2cac848ca1a0e07989b94dd2130e457cff5f9aec98" Oct 14 13:06:56.710617 master-2 kubenswrapper[4762]: E1014 13:06:56.710567 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-rbac-proxy pod=cluster-cloud-controller-manager-operator-779749f859-bscv5_openshift-cloud-controller-manager-operator(18346e46-a062-4e0d-b90a-c05646a46c7e)\"" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-779749f859-bscv5" podUID="18346e46-a062-4e0d-b90a-c05646a46c7e" Oct 14 13:06:57.714198 master-2 kubenswrapper[4762]: I1014 13:06:57.714088 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-779749f859-bscv5_18346e46-a062-4e0d-b90a-c05646a46c7e/kube-rbac-proxy/1.log" Oct 14 13:06:57.715969 master-2 kubenswrapper[4762]: I1014 13:06:57.715920 4762 scope.go:117] "RemoveContainer" containerID="410dd88720476dba61fbda2cac848ca1a0e07989b94dd2130e457cff5f9aec98" Oct 14 13:06:57.716210 master-2 kubenswrapper[4762]: E1014 13:06:57.716132 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-rbac-proxy pod=cluster-cloud-controller-manager-operator-779749f859-bscv5_openshift-cloud-controller-manager-operator(18346e46-a062-4e0d-b90a-c05646a46c7e)\"" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-779749f859-bscv5" podUID="18346e46-a062-4e0d-b90a-c05646a46c7e" Oct 14 13:06:58.548908 master-2 kubenswrapper[4762]: I1014 13:06:58.548794 4762 scope.go:117] "RemoveContainer" containerID="367f78b1ba05a1a5cac5a0f737b5e254b59c152b3ac8ca5d5714f3d50abacd10" Oct 14 13:06:59.724977 master-2 kubenswrapper[4762]: I1014 13:06:59.724932 4762 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-2_f022eff2d978fee6b366ac18a80aa53c/kube-rbac-proxy-crio/2.log" Oct 14 13:06:59.726014 master-2 kubenswrapper[4762]: I1014 13:06:59.725956 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-2" event={"ID":"f022eff2d978fee6b366ac18a80aa53c","Type":"ContainerStarted","Data":"0f9381cf8b3e1bdd572687ff62b6ce00cddd48fdf9f8b7c981285b21fd41f3a0"} Oct 14 13:06:59.742219 master-2 kubenswrapper[4762]: I1014 13:06:59.742090 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-2" podStartSLOduration=16.742061535 podStartE2EDuration="16.742061535s" podCreationTimestamp="2025-10-14 13:06:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:06:59.742036534 +0000 UTC m=+48.986195783" watchObservedRunningTime="2025-10-14 13:06:59.742061535 +0000 UTC m=+48.986220734" Oct 14 13:07:04.246044 master-2 kubenswrapper[4762]: I1014 13:07:04.245929 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0bf2b14-2719-4b1b-a661-fbf4d27c05dc-serving-cert\") pod \"cluster-version-operator-55bd67947c-872k9\" (UID: \"d0bf2b14-2719-4b1b-a661-fbf4d27c05dc\") " pod="openshift-cluster-version/cluster-version-operator-55bd67947c-872k9" Oct 14 13:07:04.247286 master-2 kubenswrapper[4762]: E1014 13:07:04.246094 4762 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Oct 14 13:07:04.247286 master-2 kubenswrapper[4762]: E1014 13:07:04.246227 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0bf2b14-2719-4b1b-a661-fbf4d27c05dc-serving-cert podName:d0bf2b14-2719-4b1b-a661-fbf4d27c05dc nodeName:}" failed. No retries permitted until 2025-10-14 13:07:36.24620092 +0000 UTC m=+85.490360109 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/d0bf2b14-2719-4b1b-a661-fbf4d27c05dc-serving-cert") pod "cluster-version-operator-55bd67947c-872k9" (UID: "d0bf2b14-2719-4b1b-a661-fbf4d27c05dc") : secret "cluster-version-operator-serving-cert" not found Oct 14 13:07:08.874595 master-2 kubenswrapper[4762]: I1014 13:07:08.874478 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-g75pn"] Oct 14 13:07:08.875580 master-2 kubenswrapper[4762]: I1014 13:07:08.874791 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-g75pn" Oct 14 13:07:08.877434 master-2 kubenswrapper[4762]: I1014 13:07:08.877356 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 14 13:07:08.877657 master-2 kubenswrapper[4762]: I1014 13:07:08.877563 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 14 13:07:08.877947 master-2 kubenswrapper[4762]: I1014 13:07:08.877894 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 14 13:07:08.878034 master-2 kubenswrapper[4762]: I1014 13:07:08.877943 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 14 13:07:08.979741 master-2 kubenswrapper[4762]: I1014 13:07:08.979618 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/23dff85b-806e-49e9-8913-cf308d64e7b8-cnibin\") pod \"multus-g75pn\" (UID: \"23dff85b-806e-49e9-8913-cf308d64e7b8\") " pod="openshift-multus/multus-g75pn" Oct 14 13:07:08.979741 master-2 kubenswrapper[4762]: I1014 13:07:08.979698 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/23dff85b-806e-49e9-8913-cf308d64e7b8-host-run-k8s-cni-cncf-io\") pod \"multus-g75pn\" (UID: \"23dff85b-806e-49e9-8913-cf308d64e7b8\") " pod="openshift-multus/multus-g75pn" Oct 14 13:07:08.979741 master-2 kubenswrapper[4762]: I1014 13:07:08.979740 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvr26\" (UniqueName: \"kubernetes.io/projected/23dff85b-806e-49e9-8913-cf308d64e7b8-kube-api-access-wvr26\") pod \"multus-g75pn\" (UID: \"23dff85b-806e-49e9-8913-cf308d64e7b8\") " pod="openshift-multus/multus-g75pn" Oct 14 13:07:08.980130 master-2 kubenswrapper[4762]: I1014 13:07:08.979775 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/23dff85b-806e-49e9-8913-cf308d64e7b8-multus-cni-dir\") pod \"multus-g75pn\" (UID: \"23dff85b-806e-49e9-8913-cf308d64e7b8\") " pod="openshift-multus/multus-g75pn" Oct 14 13:07:08.980130 master-2 kubenswrapper[4762]: I1014 13:07:08.979809 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/23dff85b-806e-49e9-8913-cf308d64e7b8-etc-kubernetes\") pod \"multus-g75pn\" (UID: \"23dff85b-806e-49e9-8913-cf308d64e7b8\") " pod="openshift-multus/multus-g75pn" Oct 14 13:07:08.980130 master-2 kubenswrapper[4762]: I1014 13:07:08.979840 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/23dff85b-806e-49e9-8913-cf308d64e7b8-os-release\") pod \"multus-g75pn\" (UID: \"23dff85b-806e-49e9-8913-cf308d64e7b8\") " pod="openshift-multus/multus-g75pn" Oct 14 13:07:08.980130 master-2 kubenswrapper[4762]: I1014 13:07:08.979909 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/23dff85b-806e-49e9-8913-cf308d64e7b8-multus-socket-dir-parent\") pod \"multus-g75pn\" (UID: 
\"23dff85b-806e-49e9-8913-cf308d64e7b8\") " pod="openshift-multus/multus-g75pn" Oct 14 13:07:08.980130 master-2 kubenswrapper[4762]: I1014 13:07:08.979946 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/23dff85b-806e-49e9-8913-cf308d64e7b8-host-var-lib-kubelet\") pod \"multus-g75pn\" (UID: \"23dff85b-806e-49e9-8913-cf308d64e7b8\") " pod="openshift-multus/multus-g75pn" Oct 14 13:07:08.980130 master-2 kubenswrapper[4762]: I1014 13:07:08.979978 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/23dff85b-806e-49e9-8913-cf308d64e7b8-multus-conf-dir\") pod \"multus-g75pn\" (UID: \"23dff85b-806e-49e9-8913-cf308d64e7b8\") " pod="openshift-multus/multus-g75pn" Oct 14 13:07:08.980130 master-2 kubenswrapper[4762]: I1014 13:07:08.980009 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/23dff85b-806e-49e9-8913-cf308d64e7b8-multus-daemon-config\") pod \"multus-g75pn\" (UID: \"23dff85b-806e-49e9-8913-cf308d64e7b8\") " pod="openshift-multus/multus-g75pn" Oct 14 13:07:08.980130 master-2 kubenswrapper[4762]: I1014 13:07:08.980039 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/23dff85b-806e-49e9-8913-cf308d64e7b8-system-cni-dir\") pod \"multus-g75pn\" (UID: \"23dff85b-806e-49e9-8913-cf308d64e7b8\") " pod="openshift-multus/multus-g75pn" Oct 14 13:07:08.980130 master-2 kubenswrapper[4762]: I1014 13:07:08.980069 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/23dff85b-806e-49e9-8913-cf308d64e7b8-host-run-netns\") pod \"multus-g75pn\" (UID: \"23dff85b-806e-49e9-8913-cf308d64e7b8\") " pod="openshift-multus/multus-g75pn" Oct 14 13:07:08.980130 master-2 kubenswrapper[4762]: I1014 13:07:08.980103 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/23dff85b-806e-49e9-8913-cf308d64e7b8-host-var-lib-cni-multus\") pod \"multus-g75pn\" (UID: \"23dff85b-806e-49e9-8913-cf308d64e7b8\") " pod="openshift-multus/multus-g75pn" Oct 14 13:07:08.980833 master-2 kubenswrapper[4762]: I1014 13:07:08.980201 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/23dff85b-806e-49e9-8913-cf308d64e7b8-cni-binary-copy\") pod \"multus-g75pn\" (UID: \"23dff85b-806e-49e9-8913-cf308d64e7b8\") " pod="openshift-multus/multus-g75pn" Oct 14 13:07:08.980833 master-2 kubenswrapper[4762]: I1014 13:07:08.980267 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/23dff85b-806e-49e9-8913-cf308d64e7b8-host-run-multus-certs\") pod \"multus-g75pn\" (UID: \"23dff85b-806e-49e9-8913-cf308d64e7b8\") " pod="openshift-multus/multus-g75pn" Oct 14 13:07:08.980833 master-2 kubenswrapper[4762]: I1014 13:07:08.980301 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/23dff85b-806e-49e9-8913-cf308d64e7b8-host-var-lib-cni-bin\") pod \"multus-g75pn\" (UID: \"23dff85b-806e-49e9-8913-cf308d64e7b8\") " pod="openshift-multus/multus-g75pn" Oct 14 13:07:08.980833 master-2 kubenswrapper[4762]: I1014 13:07:08.980331 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/23dff85b-806e-49e9-8913-cf308d64e7b8-hostroot\") pod \"multus-g75pn\" (UID: \"23dff85b-806e-49e9-8913-cf308d64e7b8\") " pod="openshift-multus/multus-g75pn" Oct 14 13:07:09.066124 master-2 kubenswrapper[4762]: I1014 13:07:09.066015 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-mgfql"] Oct 14 13:07:09.066707 master-2 kubenswrapper[4762]: I1014 13:07:09.066621 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-mgfql" Oct 14 13:07:09.070507 master-2 kubenswrapper[4762]: I1014 13:07:09.070450 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 14 13:07:09.071240 master-2 kubenswrapper[4762]: I1014 13:07:09.071110 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-config" Oct 14 13:07:09.081042 master-2 kubenswrapper[4762]: I1014 13:07:09.080887 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/23dff85b-806e-49e9-8913-cf308d64e7b8-cnibin\") pod \"multus-g75pn\" (UID: \"23dff85b-806e-49e9-8913-cf308d64e7b8\") " pod="openshift-multus/multus-g75pn" Oct 14 13:07:09.081042 master-2 kubenswrapper[4762]: I1014 13:07:09.080982 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/23dff85b-806e-49e9-8913-cf308d64e7b8-host-run-k8s-cni-cncf-io\") pod \"multus-g75pn\" (UID: \"23dff85b-806e-49e9-8913-cf308d64e7b8\") " pod="openshift-multus/multus-g75pn" Oct 14 13:07:09.081042 master-2 kubenswrapper[4762]: I1014 13:07:09.081040 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvr26\" (UniqueName: \"kubernetes.io/projected/23dff85b-806e-49e9-8913-cf308d64e7b8-kube-api-access-wvr26\") pod \"multus-g75pn\" (UID: \"23dff85b-806e-49e9-8913-cf308d64e7b8\") " pod="openshift-multus/multus-g75pn" Oct 14 13:07:09.081505 master-2 kubenswrapper[4762]: I1014 13:07:09.081072 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/23dff85b-806e-49e9-8913-cf308d64e7b8-multus-cni-dir\") pod \"multus-g75pn\" (UID: \"23dff85b-806e-49e9-8913-cf308d64e7b8\") " pod="openshift-multus/multus-g75pn" Oct 14 13:07:09.081505 master-2 kubenswrapper[4762]: I1014 13:07:09.081110 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/23dff85b-806e-49e9-8913-cf308d64e7b8-multus-conf-dir\") pod \"multus-g75pn\" (UID: \"23dff85b-806e-49e9-8913-cf308d64e7b8\") " pod="openshift-multus/multus-g75pn" Oct 14 13:07:09.081505 master-2 kubenswrapper[4762]: I1014 13:07:09.081235 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/23dff85b-806e-49e9-8913-cf308d64e7b8-host-run-k8s-cni-cncf-io\") pod 
\"multus-g75pn\" (UID: \"23dff85b-806e-49e9-8913-cf308d64e7b8\") " pod="openshift-multus/multus-g75pn" Oct 14 13:07:09.081505 master-2 kubenswrapper[4762]: I1014 13:07:09.081252 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/23dff85b-806e-49e9-8913-cf308d64e7b8-multus-daemon-config\") pod \"multus-g75pn\" (UID: \"23dff85b-806e-49e9-8913-cf308d64e7b8\") " pod="openshift-multus/multus-g75pn" Oct 14 13:07:09.081505 master-2 kubenswrapper[4762]: I1014 13:07:09.081338 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/23dff85b-806e-49e9-8913-cf308d64e7b8-multus-cni-dir\") pod \"multus-g75pn\" (UID: \"23dff85b-806e-49e9-8913-cf308d64e7b8\") " pod="openshift-multus/multus-g75pn" Oct 14 13:07:09.081505 master-2 kubenswrapper[4762]: I1014 13:07:09.081370 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/23dff85b-806e-49e9-8913-cf308d64e7b8-etc-kubernetes\") pod \"multus-g75pn\" (UID: \"23dff85b-806e-49e9-8913-cf308d64e7b8\") " pod="openshift-multus/multus-g75pn" Oct 14 13:07:09.081505 master-2 kubenswrapper[4762]: I1014 13:07:09.081410 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/23dff85b-806e-49e9-8913-cf308d64e7b8-os-release\") pod \"multus-g75pn\" (UID: \"23dff85b-806e-49e9-8913-cf308d64e7b8\") " pod="openshift-multus/multus-g75pn" Oct 14 13:07:09.081505 master-2 kubenswrapper[4762]: I1014 13:07:09.081439 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/23dff85b-806e-49e9-8913-cf308d64e7b8-multus-conf-dir\") pod \"multus-g75pn\" (UID: \"23dff85b-806e-49e9-8913-cf308d64e7b8\") " pod="openshift-multus/multus-g75pn" Oct 14 13:07:09.081505 master-2 kubenswrapper[4762]: I1014 13:07:09.081442 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/23dff85b-806e-49e9-8913-cf308d64e7b8-multus-socket-dir-parent\") pod \"multus-g75pn\" (UID: \"23dff85b-806e-49e9-8913-cf308d64e7b8\") " pod="openshift-multus/multus-g75pn" Oct 14 13:07:09.081505 master-2 kubenswrapper[4762]: I1014 13:07:09.081502 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/23dff85b-806e-49e9-8913-cf308d64e7b8-host-var-lib-kubelet\") pod \"multus-g75pn\" (UID: \"23dff85b-806e-49e9-8913-cf308d64e7b8\") " pod="openshift-multus/multus-g75pn" Oct 14 13:07:09.082538 master-2 kubenswrapper[4762]: I1014 13:07:09.081536 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/23dff85b-806e-49e9-8913-cf308d64e7b8-etc-kubernetes\") pod \"multus-g75pn\" (UID: \"23dff85b-806e-49e9-8913-cf308d64e7b8\") " pod="openshift-multus/multus-g75pn" Oct 14 13:07:09.082538 master-2 kubenswrapper[4762]: I1014 13:07:09.081542 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/23dff85b-806e-49e9-8913-cf308d64e7b8-host-var-lib-cni-multus\") pod \"multus-g75pn\" (UID: \"23dff85b-806e-49e9-8913-cf308d64e7b8\") " pod="openshift-multus/multus-g75pn" Oct 14 13:07:09.082538 master-2 
kubenswrapper[4762]: I1014 13:07:09.081101 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/23dff85b-806e-49e9-8913-cf308d64e7b8-cnibin\") pod \"multus-g75pn\" (UID: \"23dff85b-806e-49e9-8913-cf308d64e7b8\") " pod="openshift-multus/multus-g75pn" Oct 14 13:07:09.082538 master-2 kubenswrapper[4762]: I1014 13:07:09.081582 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/23dff85b-806e-49e9-8913-cf308d64e7b8-host-var-lib-kubelet\") pod \"multus-g75pn\" (UID: \"23dff85b-806e-49e9-8913-cf308d64e7b8\") " pod="openshift-multus/multus-g75pn" Oct 14 13:07:09.082538 master-2 kubenswrapper[4762]: I1014 13:07:09.081582 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/23dff85b-806e-49e9-8913-cf308d64e7b8-system-cni-dir\") pod \"multus-g75pn\" (UID: \"23dff85b-806e-49e9-8913-cf308d64e7b8\") " pod="openshift-multus/multus-g75pn" Oct 14 13:07:09.082538 master-2 kubenswrapper[4762]: I1014 13:07:09.081506 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/23dff85b-806e-49e9-8913-cf308d64e7b8-multus-socket-dir-parent\") pod \"multus-g75pn\" (UID: \"23dff85b-806e-49e9-8913-cf308d64e7b8\") " pod="openshift-multus/multus-g75pn" Oct 14 13:07:09.082538 master-2 kubenswrapper[4762]: I1014 13:07:09.081636 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/23dff85b-806e-49e9-8913-cf308d64e7b8-host-run-netns\") pod \"multus-g75pn\" (UID: \"23dff85b-806e-49e9-8913-cf308d64e7b8\") " pod="openshift-multus/multus-g75pn" Oct 14 13:07:09.082538 master-2 kubenswrapper[4762]: I1014 13:07:09.081595 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/23dff85b-806e-49e9-8913-cf308d64e7b8-os-release\") pod \"multus-g75pn\" (UID: \"23dff85b-806e-49e9-8913-cf308d64e7b8\") " pod="openshift-multus/multus-g75pn" Oct 14 13:07:09.082538 master-2 kubenswrapper[4762]: I1014 13:07:09.081687 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/23dff85b-806e-49e9-8913-cf308d64e7b8-host-run-multus-certs\") pod \"multus-g75pn\" (UID: \"23dff85b-806e-49e9-8913-cf308d64e7b8\") " pod="openshift-multus/multus-g75pn" Oct 14 13:07:09.082538 master-2 kubenswrapper[4762]: I1014 13:07:09.081677 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/23dff85b-806e-49e9-8913-cf308d64e7b8-host-run-netns\") pod \"multus-g75pn\" (UID: \"23dff85b-806e-49e9-8913-cf308d64e7b8\") " pod="openshift-multus/multus-g75pn" Oct 14 13:07:09.082538 master-2 kubenswrapper[4762]: I1014 13:07:09.081715 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/23dff85b-806e-49e9-8913-cf308d64e7b8-host-var-lib-cni-multus\") pod \"multus-g75pn\" (UID: \"23dff85b-806e-49e9-8913-cf308d64e7b8\") " pod="openshift-multus/multus-g75pn" Oct 14 13:07:09.082538 master-2 kubenswrapper[4762]: I1014 13:07:09.081641 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/23dff85b-806e-49e9-8913-cf308d64e7b8-system-cni-dir\") pod \"multus-g75pn\" (UID: \"23dff85b-806e-49e9-8913-cf308d64e7b8\") " pod="openshift-multus/multus-g75pn" Oct 14 13:07:09.082538 master-2 kubenswrapper[4762]: I1014 13:07:09.081753 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/23dff85b-806e-49e9-8913-cf308d64e7b8-cni-binary-copy\") pod \"multus-g75pn\" (UID: \"23dff85b-806e-49e9-8913-cf308d64e7b8\") " pod="openshift-multus/multus-g75pn" Oct 14 13:07:09.082538 master-2 kubenswrapper[4762]: I1014 13:07:09.081832 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/23dff85b-806e-49e9-8913-cf308d64e7b8-host-run-multus-certs\") pod \"multus-g75pn\" (UID: \"23dff85b-806e-49e9-8913-cf308d64e7b8\") " pod="openshift-multus/multus-g75pn" Oct 14 13:07:09.082538 master-2 kubenswrapper[4762]: I1014 13:07:09.081889 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/23dff85b-806e-49e9-8913-cf308d64e7b8-hostroot\") pod \"multus-g75pn\" (UID: \"23dff85b-806e-49e9-8913-cf308d64e7b8\") " pod="openshift-multus/multus-g75pn" Oct 14 13:07:09.082538 master-2 kubenswrapper[4762]: I1014 13:07:09.082029 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/23dff85b-806e-49e9-8913-cf308d64e7b8-hostroot\") pod \"multus-g75pn\" (UID: \"23dff85b-806e-49e9-8913-cf308d64e7b8\") " pod="openshift-multus/multus-g75pn" Oct 14 13:07:09.082538 master-2 kubenswrapper[4762]: I1014 13:07:09.082065 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/23dff85b-806e-49e9-8913-cf308d64e7b8-host-var-lib-cni-bin\") pod \"multus-g75pn\" (UID: \"23dff85b-806e-49e9-8913-cf308d64e7b8\") " pod="openshift-multus/multus-g75pn" Oct 14 13:07:09.082538 master-2 kubenswrapper[4762]: I1014 13:07:09.082133 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/23dff85b-806e-49e9-8913-cf308d64e7b8-host-var-lib-cni-bin\") pod \"multus-g75pn\" (UID: \"23dff85b-806e-49e9-8913-cf308d64e7b8\") " pod="openshift-multus/multus-g75pn" Oct 14 13:07:09.083532 master-2 kubenswrapper[4762]: I1014 13:07:09.082886 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/23dff85b-806e-49e9-8913-cf308d64e7b8-multus-daemon-config\") pod \"multus-g75pn\" (UID: \"23dff85b-806e-49e9-8913-cf308d64e7b8\") " pod="openshift-multus/multus-g75pn" Oct 14 13:07:09.083532 master-2 kubenswrapper[4762]: I1014 13:07:09.082987 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/23dff85b-806e-49e9-8913-cf308d64e7b8-cni-binary-copy\") pod \"multus-g75pn\" (UID: \"23dff85b-806e-49e9-8913-cf308d64e7b8\") " pod="openshift-multus/multus-g75pn" Oct 14 13:07:09.104908 master-2 kubenswrapper[4762]: I1014 13:07:09.104808 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvr26\" (UniqueName: \"kubernetes.io/projected/23dff85b-806e-49e9-8913-cf308d64e7b8-kube-api-access-wvr26\") pod \"multus-g75pn\" (UID: \"23dff85b-806e-49e9-8913-cf308d64e7b8\") " 
pod="openshift-multus/multus-g75pn" Oct 14 13:07:09.183357 master-2 kubenswrapper[4762]: I1014 13:07:09.183261 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9eb3fcac-f577-42b9-823b-e05d43478814-system-cni-dir\") pod \"multus-additional-cni-plugins-mgfql\" (UID: \"9eb3fcac-f577-42b9-823b-e05d43478814\") " pod="openshift-multus/multus-additional-cni-plugins-mgfql" Oct 14 13:07:09.183357 master-2 kubenswrapper[4762]: I1014 13:07:09.183342 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9eb3fcac-f577-42b9-823b-e05d43478814-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mgfql\" (UID: \"9eb3fcac-f577-42b9-823b-e05d43478814\") " pod="openshift-multus/multus-additional-cni-plugins-mgfql" Oct 14 13:07:09.183613 master-2 kubenswrapper[4762]: I1014 13:07:09.183379 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt695\" (UniqueName: \"kubernetes.io/projected/9eb3fcac-f577-42b9-823b-e05d43478814-kube-api-access-wt695\") pod \"multus-additional-cni-plugins-mgfql\" (UID: \"9eb3fcac-f577-42b9-823b-e05d43478814\") " pod="openshift-multus/multus-additional-cni-plugins-mgfql" Oct 14 13:07:09.183613 master-2 kubenswrapper[4762]: I1014 13:07:09.183423 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9eb3fcac-f577-42b9-823b-e05d43478814-cni-binary-copy\") pod \"multus-additional-cni-plugins-mgfql\" (UID: \"9eb3fcac-f577-42b9-823b-e05d43478814\") " pod="openshift-multus/multus-additional-cni-plugins-mgfql" Oct 14 13:07:09.183613 master-2 kubenswrapper[4762]: I1014 13:07:09.183484 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9eb3fcac-f577-42b9-823b-e05d43478814-os-release\") pod \"multus-additional-cni-plugins-mgfql\" (UID: \"9eb3fcac-f577-42b9-823b-e05d43478814\") " pod="openshift-multus/multus-additional-cni-plugins-mgfql" Oct 14 13:07:09.183613 master-2 kubenswrapper[4762]: I1014 13:07:09.183522 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9eb3fcac-f577-42b9-823b-e05d43478814-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mgfql\" (UID: \"9eb3fcac-f577-42b9-823b-e05d43478814\") " pod="openshift-multus/multus-additional-cni-plugins-mgfql" Oct 14 13:07:09.183613 master-2 kubenswrapper[4762]: I1014 13:07:09.183558 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/9eb3fcac-f577-42b9-823b-e05d43478814-whereabouts-configmap\") pod \"multus-additional-cni-plugins-mgfql\" (UID: \"9eb3fcac-f577-42b9-823b-e05d43478814\") " pod="openshift-multus/multus-additional-cni-plugins-mgfql" Oct 14 13:07:09.183907 master-2 kubenswrapper[4762]: I1014 13:07:09.183810 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9eb3fcac-f577-42b9-823b-e05d43478814-cnibin\") pod \"multus-additional-cni-plugins-mgfql\" (UID: \"9eb3fcac-f577-42b9-823b-e05d43478814\") " 
pod="openshift-multus/multus-additional-cni-plugins-mgfql" Oct 14 13:07:09.192441 master-2 kubenswrapper[4762]: I1014 13:07:09.192382 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-g75pn" Oct 14 13:07:09.213006 master-2 kubenswrapper[4762]: W1014 13:07:09.212912 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23dff85b_806e_49e9_8913_cf308d64e7b8.slice/crio-716e2c774e82867e7c6cac4ef1a3e15752b1a5b18208c4b37862b3d11103cb52 WatchSource:0}: Error finding container 716e2c774e82867e7c6cac4ef1a3e15752b1a5b18208c4b37862b3d11103cb52: Status 404 returned error can't find the container with id 716e2c774e82867e7c6cac4ef1a3e15752b1a5b18208c4b37862b3d11103cb52 Oct 14 13:07:09.284437 master-2 kubenswrapper[4762]: I1014 13:07:09.284350 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9eb3fcac-f577-42b9-823b-e05d43478814-os-release\") pod \"multus-additional-cni-plugins-mgfql\" (UID: \"9eb3fcac-f577-42b9-823b-e05d43478814\") " pod="openshift-multus/multus-additional-cni-plugins-mgfql" Oct 14 13:07:09.284437 master-2 kubenswrapper[4762]: I1014 13:07:09.284419 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/9eb3fcac-f577-42b9-823b-e05d43478814-whereabouts-configmap\") pod \"multus-additional-cni-plugins-mgfql\" (UID: \"9eb3fcac-f577-42b9-823b-e05d43478814\") " pod="openshift-multus/multus-additional-cni-plugins-mgfql" Oct 14 13:07:09.284612 master-2 kubenswrapper[4762]: I1014 13:07:09.284454 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9eb3fcac-f577-42b9-823b-e05d43478814-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mgfql\" (UID: \"9eb3fcac-f577-42b9-823b-e05d43478814\") " pod="openshift-multus/multus-additional-cni-plugins-mgfql" Oct 14 13:07:09.284612 master-2 kubenswrapper[4762]: I1014 13:07:09.284484 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9eb3fcac-f577-42b9-823b-e05d43478814-cnibin\") pod \"multus-additional-cni-plugins-mgfql\" (UID: \"9eb3fcac-f577-42b9-823b-e05d43478814\") " pod="openshift-multus/multus-additional-cni-plugins-mgfql" Oct 14 13:07:09.284612 master-2 kubenswrapper[4762]: I1014 13:07:09.284513 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9eb3fcac-f577-42b9-823b-e05d43478814-system-cni-dir\") pod \"multus-additional-cni-plugins-mgfql\" (UID: \"9eb3fcac-f577-42b9-823b-e05d43478814\") " pod="openshift-multus/multus-additional-cni-plugins-mgfql" Oct 14 13:07:09.284612 master-2 kubenswrapper[4762]: I1014 13:07:09.284541 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9eb3fcac-f577-42b9-823b-e05d43478814-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mgfql\" (UID: \"9eb3fcac-f577-42b9-823b-e05d43478814\") " pod="openshift-multus/multus-additional-cni-plugins-mgfql" Oct 14 13:07:09.284612 master-2 kubenswrapper[4762]: I1014 13:07:09.284570 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wt695\" (UniqueName: 
\"kubernetes.io/projected/9eb3fcac-f577-42b9-823b-e05d43478814-kube-api-access-wt695\") pod \"multus-additional-cni-plugins-mgfql\" (UID: \"9eb3fcac-f577-42b9-823b-e05d43478814\") " pod="openshift-multus/multus-additional-cni-plugins-mgfql" Oct 14 13:07:09.284612 master-2 kubenswrapper[4762]: I1014 13:07:09.284602 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9eb3fcac-f577-42b9-823b-e05d43478814-cni-binary-copy\") pod \"multus-additional-cni-plugins-mgfql\" (UID: \"9eb3fcac-f577-42b9-823b-e05d43478814\") " pod="openshift-multus/multus-additional-cni-plugins-mgfql" Oct 14 13:07:09.284952 master-2 kubenswrapper[4762]: I1014 13:07:09.284705 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9eb3fcac-f577-42b9-823b-e05d43478814-os-release\") pod \"multus-additional-cni-plugins-mgfql\" (UID: \"9eb3fcac-f577-42b9-823b-e05d43478814\") " pod="openshift-multus/multus-additional-cni-plugins-mgfql" Oct 14 13:07:09.284952 master-2 kubenswrapper[4762]: I1014 13:07:09.284790 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9eb3fcac-f577-42b9-823b-e05d43478814-system-cni-dir\") pod \"multus-additional-cni-plugins-mgfql\" (UID: \"9eb3fcac-f577-42b9-823b-e05d43478814\") " pod="openshift-multus/multus-additional-cni-plugins-mgfql" Oct 14 13:07:09.284952 master-2 kubenswrapper[4762]: I1014 13:07:09.284839 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9eb3fcac-f577-42b9-823b-e05d43478814-cnibin\") pod \"multus-additional-cni-plugins-mgfql\" (UID: \"9eb3fcac-f577-42b9-823b-e05d43478814\") " pod="openshift-multus/multus-additional-cni-plugins-mgfql" Oct 14 13:07:09.285196 master-2 kubenswrapper[4762]: I1014 13:07:09.285127 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9eb3fcac-f577-42b9-823b-e05d43478814-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mgfql\" (UID: \"9eb3fcac-f577-42b9-823b-e05d43478814\") " pod="openshift-multus/multus-additional-cni-plugins-mgfql" Oct 14 13:07:09.285694 master-2 kubenswrapper[4762]: I1014 13:07:09.285657 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9eb3fcac-f577-42b9-823b-e05d43478814-cni-binary-copy\") pod \"multus-additional-cni-plugins-mgfql\" (UID: \"9eb3fcac-f577-42b9-823b-e05d43478814\") " pod="openshift-multus/multus-additional-cni-plugins-mgfql" Oct 14 13:07:09.286080 master-2 kubenswrapper[4762]: I1014 13:07:09.286036 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9eb3fcac-f577-42b9-823b-e05d43478814-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mgfql\" (UID: \"9eb3fcac-f577-42b9-823b-e05d43478814\") " pod="openshift-multus/multus-additional-cni-plugins-mgfql" Oct 14 13:07:09.286474 master-2 kubenswrapper[4762]: I1014 13:07:09.286420 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/9eb3fcac-f577-42b9-823b-e05d43478814-whereabouts-configmap\") pod \"multus-additional-cni-plugins-mgfql\" (UID: \"9eb3fcac-f577-42b9-823b-e05d43478814\") " 
pod="openshift-multus/multus-additional-cni-plugins-mgfql" Oct 14 13:07:09.318433 master-2 kubenswrapper[4762]: I1014 13:07:09.318340 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt695\" (UniqueName: \"kubernetes.io/projected/9eb3fcac-f577-42b9-823b-e05d43478814-kube-api-access-wt695\") pod \"multus-additional-cni-plugins-mgfql\" (UID: \"9eb3fcac-f577-42b9-823b-e05d43478814\") " pod="openshift-multus/multus-additional-cni-plugins-mgfql" Oct 14 13:07:09.387635 master-2 kubenswrapper[4762]: I1014 13:07:09.387522 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-mgfql" Oct 14 13:07:09.401108 master-2 kubenswrapper[4762]: W1014 13:07:09.401064 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9eb3fcac_f577_42b9_823b_e05d43478814.slice/crio-d81a10bdf3b3564dd5d8fc278b8c7c690d038a89f3a28ae0a15e5781ccbb59e1 WatchSource:0}: Error finding container d81a10bdf3b3564dd5d8fc278b8c7c690d038a89f3a28ae0a15e5781ccbb59e1: Status 404 returned error can't find the container with id d81a10bdf3b3564dd5d8fc278b8c7c690d038a89f3a28ae0a15e5781ccbb59e1 Oct 14 13:07:09.548539 master-2 kubenswrapper[4762]: I1014 13:07:09.548411 4762 scope.go:117] "RemoveContainer" containerID="410dd88720476dba61fbda2cac848ca1a0e07989b94dd2130e457cff5f9aec98" Oct 14 13:07:09.757317 master-2 kubenswrapper[4762]: I1014 13:07:09.757223 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mgfql" event={"ID":"9eb3fcac-f577-42b9-823b-e05d43478814","Type":"ContainerStarted","Data":"d81a10bdf3b3564dd5d8fc278b8c7c690d038a89f3a28ae0a15e5781ccbb59e1"} Oct 14 13:07:09.759292 master-2 kubenswrapper[4762]: I1014 13:07:09.759183 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-g75pn" event={"ID":"23dff85b-806e-49e9-8913-cf308d64e7b8","Type":"ContainerStarted","Data":"716e2c774e82867e7c6cac4ef1a3e15752b1a5b18208c4b37862b3d11103cb52"} Oct 14 13:07:09.861035 master-2 kubenswrapper[4762]: I1014 13:07:09.860956 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-b84p7"] Oct 14 13:07:09.861410 master-2 kubenswrapper[4762]: I1014 13:07:09.861382 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b84p7" Oct 14 13:07:09.861580 master-2 kubenswrapper[4762]: E1014 13:07:09.861534 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-b84p7" podUID="c5e8cdcd-bc1f-4b38-a834-809b79de4fd9" Oct 14 13:07:09.989673 master-2 kubenswrapper[4762]: I1014 13:07:09.989575 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwlzp\" (UniqueName: \"kubernetes.io/projected/c5e8cdcd-bc1f-4b38-a834-809b79de4fd9-kube-api-access-rwlzp\") pod \"network-metrics-daemon-b84p7\" (UID: \"c5e8cdcd-bc1f-4b38-a834-809b79de4fd9\") " pod="openshift-multus/network-metrics-daemon-b84p7" Oct 14 13:07:09.989673 master-2 kubenswrapper[4762]: I1014 13:07:09.989669 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c5e8cdcd-bc1f-4b38-a834-809b79de4fd9-metrics-certs\") pod \"network-metrics-daemon-b84p7\" (UID: \"c5e8cdcd-bc1f-4b38-a834-809b79de4fd9\") " pod="openshift-multus/network-metrics-daemon-b84p7" Oct 14 13:07:10.090957 master-2 kubenswrapper[4762]: I1014 13:07:10.090802 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c5e8cdcd-bc1f-4b38-a834-809b79de4fd9-metrics-certs\") pod \"network-metrics-daemon-b84p7\" (UID: \"c5e8cdcd-bc1f-4b38-a834-809b79de4fd9\") " pod="openshift-multus/network-metrics-daemon-b84p7" Oct 14 13:07:10.090957 master-2 kubenswrapper[4762]: I1014 13:07:10.090906 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwlzp\" (UniqueName: \"kubernetes.io/projected/c5e8cdcd-bc1f-4b38-a834-809b79de4fd9-kube-api-access-rwlzp\") pod \"network-metrics-daemon-b84p7\" (UID: \"c5e8cdcd-bc1f-4b38-a834-809b79de4fd9\") " pod="openshift-multus/network-metrics-daemon-b84p7" Oct 14 13:07:10.091238 master-2 kubenswrapper[4762]: E1014 13:07:10.091002 4762 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 14 13:07:10.091238 master-2 kubenswrapper[4762]: E1014 13:07:10.091086 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c5e8cdcd-bc1f-4b38-a834-809b79de4fd9-metrics-certs podName:c5e8cdcd-bc1f-4b38-a834-809b79de4fd9 nodeName:}" failed. No retries permitted until 2025-10-14 13:07:10.591065303 +0000 UTC m=+59.835224462 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c5e8cdcd-bc1f-4b38-a834-809b79de4fd9-metrics-certs") pod "network-metrics-daemon-b84p7" (UID: "c5e8cdcd-bc1f-4b38-a834-809b79de4fd9") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 14 13:07:10.121826 master-2 kubenswrapper[4762]: I1014 13:07:10.121759 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwlzp\" (UniqueName: \"kubernetes.io/projected/c5e8cdcd-bc1f-4b38-a834-809b79de4fd9-kube-api-access-rwlzp\") pod \"network-metrics-daemon-b84p7\" (UID: \"c5e8cdcd-bc1f-4b38-a834-809b79de4fd9\") " pod="openshift-multus/network-metrics-daemon-b84p7" Oct 14 13:07:10.594750 master-2 kubenswrapper[4762]: I1014 13:07:10.594622 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c5e8cdcd-bc1f-4b38-a834-809b79de4fd9-metrics-certs\") pod \"network-metrics-daemon-b84p7\" (UID: \"c5e8cdcd-bc1f-4b38-a834-809b79de4fd9\") " pod="openshift-multus/network-metrics-daemon-b84p7" Oct 14 13:07:10.595060 master-2 kubenswrapper[4762]: E1014 13:07:10.594820 4762 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 14 13:07:10.595060 master-2 kubenswrapper[4762]: E1014 13:07:10.594902 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c5e8cdcd-bc1f-4b38-a834-809b79de4fd9-metrics-certs podName:c5e8cdcd-bc1f-4b38-a834-809b79de4fd9 nodeName:}" failed. No retries permitted until 2025-10-14 13:07:11.594884644 +0000 UTC m=+60.839043803 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c5e8cdcd-bc1f-4b38-a834-809b79de4fd9-metrics-certs") pod "network-metrics-daemon-b84p7" (UID: "c5e8cdcd-bc1f-4b38-a834-809b79de4fd9") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 14 13:07:10.762433 master-2 kubenswrapper[4762]: I1014 13:07:10.762367 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-779749f859-bscv5_18346e46-a062-4e0d-b90a-c05646a46c7e/kube-rbac-proxy/2.log" Oct 14 13:07:10.763131 master-2 kubenswrapper[4762]: I1014 13:07:10.763087 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-779749f859-bscv5_18346e46-a062-4e0d-b90a-c05646a46c7e/kube-rbac-proxy/1.log" Oct 14 13:07:10.764087 master-2 kubenswrapper[4762]: I1014 13:07:10.764039 4762 generic.go:334] "Generic (PLEG): container finished" podID="18346e46-a062-4e0d-b90a-c05646a46c7e" containerID="666b60df5ef43af6268df15ab66505cea6404953afaca848b40e96d57e75ac0c" exitCode=1 Oct 14 13:07:10.764191 master-2 kubenswrapper[4762]: I1014 13:07:10.764081 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-779749f859-bscv5" event={"ID":"18346e46-a062-4e0d-b90a-c05646a46c7e","Type":"ContainerDied","Data":"666b60df5ef43af6268df15ab66505cea6404953afaca848b40e96d57e75ac0c"} Oct 14 13:07:10.764191 master-2 kubenswrapper[4762]: I1014 13:07:10.764132 4762 scope.go:117] "RemoveContainer" containerID="410dd88720476dba61fbda2cac848ca1a0e07989b94dd2130e457cff5f9aec98" Oct 14 13:07:10.764683 master-2 kubenswrapper[4762]: I1014 13:07:10.764638 4762 
scope.go:117] "RemoveContainer" containerID="666b60df5ef43af6268df15ab66505cea6404953afaca848b40e96d57e75ac0c" Oct 14 13:07:10.764873 master-2 kubenswrapper[4762]: E1014 13:07:10.764833 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-rbac-proxy pod=cluster-cloud-controller-manager-operator-779749f859-bscv5_openshift-cloud-controller-manager-operator(18346e46-a062-4e0d-b90a-c05646a46c7e)\"" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-779749f859-bscv5" podUID="18346e46-a062-4e0d-b90a-c05646a46c7e" Oct 14 13:07:11.547922 master-2 kubenswrapper[4762]: I1014 13:07:11.547882 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b84p7" Oct 14 13:07:11.548680 master-2 kubenswrapper[4762]: E1014 13:07:11.548609 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b84p7" podUID="c5e8cdcd-bc1f-4b38-a834-809b79de4fd9" Oct 14 13:07:11.603473 master-2 kubenswrapper[4762]: I1014 13:07:11.603414 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c5e8cdcd-bc1f-4b38-a834-809b79de4fd9-metrics-certs\") pod \"network-metrics-daemon-b84p7\" (UID: \"c5e8cdcd-bc1f-4b38-a834-809b79de4fd9\") " pod="openshift-multus/network-metrics-daemon-b84p7" Oct 14 13:07:11.603659 master-2 kubenswrapper[4762]: E1014 13:07:11.603568 4762 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 14 13:07:11.603701 master-2 kubenswrapper[4762]: E1014 13:07:11.603671 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c5e8cdcd-bc1f-4b38-a834-809b79de4fd9-metrics-certs podName:c5e8cdcd-bc1f-4b38-a834-809b79de4fd9 nodeName:}" failed. No retries permitted until 2025-10-14 13:07:13.603621731 +0000 UTC m=+62.847780900 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c5e8cdcd-bc1f-4b38-a834-809b79de4fd9-metrics-certs") pod "network-metrics-daemon-b84p7" (UID: "c5e8cdcd-bc1f-4b38-a834-809b79de4fd9") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 14 13:07:11.767340 master-2 kubenswrapper[4762]: I1014 13:07:11.767249 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-779749f859-bscv5_18346e46-a062-4e0d-b90a-c05646a46c7e/kube-rbac-proxy/2.log" Oct 14 13:07:12.773086 master-2 kubenswrapper[4762]: I1014 13:07:12.773023 4762 generic.go:334] "Generic (PLEG): container finished" podID="9eb3fcac-f577-42b9-823b-e05d43478814" containerID="a681d59d7791538e44736a2b148bdf9d9c6cc306b16461b10b123dc7f91fb67d" exitCode=0 Oct 14 13:07:12.773086 master-2 kubenswrapper[4762]: I1014 13:07:12.773074 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mgfql" event={"ID":"9eb3fcac-f577-42b9-823b-e05d43478814","Type":"ContainerDied","Data":"a681d59d7791538e44736a2b148bdf9d9c6cc306b16461b10b123dc7f91fb67d"} Oct 14 13:07:13.548545 master-2 kubenswrapper[4762]: I1014 13:07:13.548371 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b84p7" Oct 14 13:07:13.548921 master-2 kubenswrapper[4762]: E1014 13:07:13.548706 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b84p7" podUID="c5e8cdcd-bc1f-4b38-a834-809b79de4fd9" Oct 14 13:07:13.620196 master-2 kubenswrapper[4762]: I1014 13:07:13.620085 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c5e8cdcd-bc1f-4b38-a834-809b79de4fd9-metrics-certs\") pod \"network-metrics-daemon-b84p7\" (UID: \"c5e8cdcd-bc1f-4b38-a834-809b79de4fd9\") " pod="openshift-multus/network-metrics-daemon-b84p7" Oct 14 13:07:13.620595 master-2 kubenswrapper[4762]: E1014 13:07:13.620300 4762 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 14 13:07:13.620595 master-2 kubenswrapper[4762]: E1014 13:07:13.620379 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c5e8cdcd-bc1f-4b38-a834-809b79de4fd9-metrics-certs podName:c5e8cdcd-bc1f-4b38-a834-809b79de4fd9 nodeName:}" failed. No retries permitted until 2025-10-14 13:07:17.620358683 +0000 UTC m=+66.864517852 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c5e8cdcd-bc1f-4b38-a834-809b79de4fd9-metrics-certs") pod "network-metrics-daemon-b84p7" (UID: "c5e8cdcd-bc1f-4b38-a834-809b79de4fd9") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 14 13:07:15.548257 master-2 kubenswrapper[4762]: I1014 13:07:15.548181 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-b84p7" Oct 14 13:07:15.548786 master-2 kubenswrapper[4762]: E1014 13:07:15.548321 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b84p7" podUID="c5e8cdcd-bc1f-4b38-a834-809b79de4fd9" Oct 14 13:07:17.547894 master-2 kubenswrapper[4762]: I1014 13:07:17.547824 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b84p7" Oct 14 13:07:17.548395 master-2 kubenswrapper[4762]: E1014 13:07:17.548003 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b84p7" podUID="c5e8cdcd-bc1f-4b38-a834-809b79de4fd9" Oct 14 13:07:17.650925 master-2 kubenswrapper[4762]: I1014 13:07:17.650885 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c5e8cdcd-bc1f-4b38-a834-809b79de4fd9-metrics-certs\") pod \"network-metrics-daemon-b84p7\" (UID: \"c5e8cdcd-bc1f-4b38-a834-809b79de4fd9\") " pod="openshift-multus/network-metrics-daemon-b84p7" Oct 14 13:07:17.651133 master-2 kubenswrapper[4762]: E1014 13:07:17.651006 4762 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 14 13:07:17.651133 master-2 kubenswrapper[4762]: E1014 13:07:17.651068 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c5e8cdcd-bc1f-4b38-a834-809b79de4fd9-metrics-certs podName:c5e8cdcd-bc1f-4b38-a834-809b79de4fd9 nodeName:}" failed. No retries permitted until 2025-10-14 13:07:25.651049918 +0000 UTC m=+74.895209077 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c5e8cdcd-bc1f-4b38-a834-809b79de4fd9-metrics-certs") pod "network-metrics-daemon-b84p7" (UID: "c5e8cdcd-bc1f-4b38-a834-809b79de4fd9") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 14 13:07:19.548243 master-2 kubenswrapper[4762]: I1014 13:07:19.548149 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b84p7" Oct 14 13:07:19.548779 master-2 kubenswrapper[4762]: E1014 13:07:19.548399 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b84p7" podUID="c5e8cdcd-bc1f-4b38-a834-809b79de4fd9" Oct 14 13:07:21.268078 master-2 kubenswrapper[4762]: I1014 13:07:21.267974 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-864d695c77-vbf9m"] Oct 14 13:07:21.268827 master-2 kubenswrapper[4762]: I1014 13:07:21.268418 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-864d695c77-vbf9m" Oct 14 13:07:21.271973 master-2 kubenswrapper[4762]: I1014 13:07:21.271904 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 14 13:07:21.272142 master-2 kubenswrapper[4762]: I1014 13:07:21.271988 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 14 13:07:21.272142 master-2 kubenswrapper[4762]: I1014 13:07:21.272086 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 14 13:07:21.272451 master-2 kubenswrapper[4762]: I1014 13:07:21.272367 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 14 13:07:21.272451 master-2 kubenswrapper[4762]: I1014 13:07:21.272378 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 14 13:07:21.378069 master-2 kubenswrapper[4762]: I1014 13:07:21.377945 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8d692cb4-5325-41b6-9058-c3d4870dee2a-env-overrides\") pod \"ovnkube-control-plane-864d695c77-vbf9m\" (UID: \"8d692cb4-5325-41b6-9058-c3d4870dee2a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-864d695c77-vbf9m" Oct 14 13:07:21.378069 master-2 kubenswrapper[4762]: I1014 13:07:21.378027 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rb727\" (UniqueName: \"kubernetes.io/projected/8d692cb4-5325-41b6-9058-c3d4870dee2a-kube-api-access-rb727\") pod \"ovnkube-control-plane-864d695c77-vbf9m\" (UID: \"8d692cb4-5325-41b6-9058-c3d4870dee2a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-864d695c77-vbf9m" Oct 14 13:07:21.378551 master-2 kubenswrapper[4762]: I1014 13:07:21.378106 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8d692cb4-5325-41b6-9058-c3d4870dee2a-ovnkube-config\") pod \"ovnkube-control-plane-864d695c77-vbf9m\" (UID: \"8d692cb4-5325-41b6-9058-c3d4870dee2a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-864d695c77-vbf9m" Oct 14 13:07:21.378551 master-2 kubenswrapper[4762]: I1014 13:07:21.378144 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8d692cb4-5325-41b6-9058-c3d4870dee2a-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-864d695c77-vbf9m\" (UID: \"8d692cb4-5325-41b6-9058-c3d4870dee2a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-864d695c77-vbf9m" Oct 14 13:07:21.479489 master-2 kubenswrapper[4762]: I1014 13:07:21.479385 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8d692cb4-5325-41b6-9058-c3d4870dee2a-ovnkube-config\") pod \"ovnkube-control-plane-864d695c77-vbf9m\" (UID: \"8d692cb4-5325-41b6-9058-c3d4870dee2a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-864d695c77-vbf9m" Oct 14 13:07:21.479489 master-2 kubenswrapper[4762]: I1014 13:07:21.479469 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" 
(UniqueName: \"kubernetes.io/secret/8d692cb4-5325-41b6-9058-c3d4870dee2a-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-864d695c77-vbf9m\" (UID: \"8d692cb4-5325-41b6-9058-c3d4870dee2a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-864d695c77-vbf9m" Oct 14 13:07:21.480039 master-2 kubenswrapper[4762]: I1014 13:07:21.479513 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8d692cb4-5325-41b6-9058-c3d4870dee2a-env-overrides\") pod \"ovnkube-control-plane-864d695c77-vbf9m\" (UID: \"8d692cb4-5325-41b6-9058-c3d4870dee2a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-864d695c77-vbf9m" Oct 14 13:07:21.480039 master-2 kubenswrapper[4762]: I1014 13:07:21.479553 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rb727\" (UniqueName: \"kubernetes.io/projected/8d692cb4-5325-41b6-9058-c3d4870dee2a-kube-api-access-rb727\") pod \"ovnkube-control-plane-864d695c77-vbf9m\" (UID: \"8d692cb4-5325-41b6-9058-c3d4870dee2a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-864d695c77-vbf9m" Oct 14 13:07:21.480924 master-2 kubenswrapper[4762]: I1014 13:07:21.480845 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8d692cb4-5325-41b6-9058-c3d4870dee2a-env-overrides\") pod \"ovnkube-control-plane-864d695c77-vbf9m\" (UID: \"8d692cb4-5325-41b6-9058-c3d4870dee2a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-864d695c77-vbf9m" Oct 14 13:07:21.481316 master-2 kubenswrapper[4762]: I1014 13:07:21.481263 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8d692cb4-5325-41b6-9058-c3d4870dee2a-ovnkube-config\") pod \"ovnkube-control-plane-864d695c77-vbf9m\" (UID: \"8d692cb4-5325-41b6-9058-c3d4870dee2a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-864d695c77-vbf9m" Oct 14 13:07:21.484264 master-2 kubenswrapper[4762]: I1014 13:07:21.483647 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8d692cb4-5325-41b6-9058-c3d4870dee2a-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-864d695c77-vbf9m\" (UID: \"8d692cb4-5325-41b6-9058-c3d4870dee2a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-864d695c77-vbf9m" Oct 14 13:07:21.492284 master-2 kubenswrapper[4762]: I1014 13:07:21.492237 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-ssgb2"] Oct 14 13:07:21.493082 master-2 kubenswrapper[4762]: I1014 13:07:21.493046 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:21.497360 master-2 kubenswrapper[4762]: I1014 13:07:21.497324 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 14 13:07:21.497500 master-2 kubenswrapper[4762]: I1014 13:07:21.497467 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 14 13:07:21.517030 master-2 kubenswrapper[4762]: I1014 13:07:21.516960 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rb727\" (UniqueName: \"kubernetes.io/projected/8d692cb4-5325-41b6-9058-c3d4870dee2a-kube-api-access-rb727\") pod \"ovnkube-control-plane-864d695c77-vbf9m\" (UID: \"8d692cb4-5325-41b6-9058-c3d4870dee2a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-864d695c77-vbf9m" Oct 14 13:07:21.547514 master-2 kubenswrapper[4762]: I1014 13:07:21.547372 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b84p7" Oct 14 13:07:21.548390 master-2 kubenswrapper[4762]: E1014 13:07:21.548314 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b84p7" podUID="c5e8cdcd-bc1f-4b38-a834-809b79de4fd9" Oct 14 13:07:21.548947 master-2 kubenswrapper[4762]: I1014 13:07:21.548892 4762 scope.go:117] "RemoveContainer" containerID="666b60df5ef43af6268df15ab66505cea6404953afaca848b40e96d57e75ac0c" Oct 14 13:07:21.549274 master-2 kubenswrapper[4762]: E1014 13:07:21.549216 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-rbac-proxy pod=cluster-cloud-controller-manager-operator-779749f859-bscv5_openshift-cloud-controller-manager-operator(18346e46-a062-4e0d-b90a-c05646a46c7e)\"" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-779749f859-bscv5" podUID="18346e46-a062-4e0d-b90a-c05646a46c7e" Oct 14 13:07:21.580429 master-2 kubenswrapper[4762]: I1014 13:07:21.580271 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fc445c69-944a-42d6-bb2e-53b0a745f970-ovnkube-config\") pod \"ovnkube-node-ssgb2\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:21.580429 master-2 kubenswrapper[4762]: I1014 13:07:21.580370 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-systemd-units\") pod \"ovnkube-node-ssgb2\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:21.580429 master-2 kubenswrapper[4762]: I1014 13:07:21.580424 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-node-log\") pod \"ovnkube-node-ssgb2\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:21.580952 master-2 kubenswrapper[4762]: I1014 13:07:21.580473 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwlhb\" (UniqueName: \"kubernetes.io/projected/fc445c69-944a-42d6-bb2e-53b0a745f970-kube-api-access-zwlhb\") pod \"ovnkube-node-ssgb2\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:21.580952 master-2 kubenswrapper[4762]: I1014 13:07:21.580538 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ssgb2\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:21.580952 master-2 kubenswrapper[4762]: I1014 13:07:21.580587 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fc445c69-944a-42d6-bb2e-53b0a745f970-ovnkube-script-lib\") pod \"ovnkube-node-ssgb2\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:21.580952 master-2 kubenswrapper[4762]: I1014 13:07:21.580634 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-host-slash\") pod \"ovnkube-node-ssgb2\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:21.580952 master-2 kubenswrapper[4762]: I1014 13:07:21.580677 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-log-socket\") pod \"ovnkube-node-ssgb2\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:21.580952 master-2 kubenswrapper[4762]: I1014 13:07:21.580720 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-host-cni-netd\") pod \"ovnkube-node-ssgb2\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:21.580952 master-2 kubenswrapper[4762]: I1014 13:07:21.580762 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fc445c69-944a-42d6-bb2e-53b0a745f970-env-overrides\") pod \"ovnkube-node-ssgb2\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:21.580952 master-2 kubenswrapper[4762]: I1014 13:07:21.580858 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-host-run-ovn-kubernetes\") pod \"ovnkube-node-ssgb2\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:21.581456 master-2 kubenswrapper[4762]: I1014 13:07:21.580993 4762 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-host-kubelet\") pod \"ovnkube-node-ssgb2\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:21.581456 master-2 kubenswrapper[4762]: I1014 13:07:21.581085 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-run-ovn\") pod \"ovnkube-node-ssgb2\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:21.581456 master-2 kubenswrapper[4762]: I1014 13:07:21.581142 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-host-run-netns\") pod \"ovnkube-node-ssgb2\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:21.581456 master-2 kubenswrapper[4762]: I1014 13:07:21.581237 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-run-openvswitch\") pod \"ovnkube-node-ssgb2\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:21.581456 master-2 kubenswrapper[4762]: I1014 13:07:21.581292 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-run-systemd\") pod \"ovnkube-node-ssgb2\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:21.581456 master-2 kubenswrapper[4762]: I1014 13:07:21.581341 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fc445c69-944a-42d6-bb2e-53b0a745f970-ovn-node-metrics-cert\") pod \"ovnkube-node-ssgb2\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:21.581456 master-2 kubenswrapper[4762]: I1014 13:07:21.581403 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-etc-openvswitch\") pod \"ovnkube-node-ssgb2\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:21.581990 master-2 kubenswrapper[4762]: I1014 13:07:21.581466 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-var-lib-openvswitch\") pod \"ovnkube-node-ssgb2\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:21.581990 master-2 kubenswrapper[4762]: I1014 13:07:21.581550 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-host-cni-bin\") pod \"ovnkube-node-ssgb2\" (UID: 
\"fc445c69-944a-42d6-bb2e-53b0a745f970\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:21.587440 master-2 kubenswrapper[4762]: I1014 13:07:21.587377 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-864d695c77-vbf9m" Oct 14 13:07:21.682544 master-2 kubenswrapper[4762]: I1014 13:07:21.682361 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-systemd-units\") pod \"ovnkube-node-ssgb2\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:21.682544 master-2 kubenswrapper[4762]: I1014 13:07:21.682449 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-node-log\") pod \"ovnkube-node-ssgb2\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:21.682544 master-2 kubenswrapper[4762]: I1014 13:07:21.682491 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwlhb\" (UniqueName: \"kubernetes.io/projected/fc445c69-944a-42d6-bb2e-53b0a745f970-kube-api-access-zwlhb\") pod \"ovnkube-node-ssgb2\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:21.682544 master-2 kubenswrapper[4762]: I1014 13:07:21.682552 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ssgb2\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:21.682973 master-2 kubenswrapper[4762]: I1014 13:07:21.682591 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fc445c69-944a-42d6-bb2e-53b0a745f970-ovnkube-script-lib\") pod \"ovnkube-node-ssgb2\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:21.682973 master-2 kubenswrapper[4762]: I1014 13:07:21.682645 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-host-slash\") pod \"ovnkube-node-ssgb2\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:21.682973 master-2 kubenswrapper[4762]: I1014 13:07:21.682679 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-log-socket\") pod \"ovnkube-node-ssgb2\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:21.682973 master-2 kubenswrapper[4762]: I1014 13:07:21.682711 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-host-cni-netd\") pod \"ovnkube-node-ssgb2\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:21.682973 
master-2 kubenswrapper[4762]: I1014 13:07:21.682743 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fc445c69-944a-42d6-bb2e-53b0a745f970-env-overrides\") pod \"ovnkube-node-ssgb2\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:21.682973 master-2 kubenswrapper[4762]: I1014 13:07:21.682808 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-host-run-ovn-kubernetes\") pod \"ovnkube-node-ssgb2\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:21.682973 master-2 kubenswrapper[4762]: I1014 13:07:21.682843 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-host-kubelet\") pod \"ovnkube-node-ssgb2\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:21.682973 master-2 kubenswrapper[4762]: I1014 13:07:21.682874 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-run-ovn\") pod \"ovnkube-node-ssgb2\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:21.682973 master-2 kubenswrapper[4762]: I1014 13:07:21.682906 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-host-run-netns\") pod \"ovnkube-node-ssgb2\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:21.682973 master-2 kubenswrapper[4762]: I1014 13:07:21.682941 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-run-openvswitch\") pod \"ovnkube-node-ssgb2\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:21.683563 master-2 kubenswrapper[4762]: I1014 13:07:21.683040 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-run-systemd\") pod \"ovnkube-node-ssgb2\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:21.683563 master-2 kubenswrapper[4762]: I1014 13:07:21.683075 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fc445c69-944a-42d6-bb2e-53b0a745f970-ovn-node-metrics-cert\") pod \"ovnkube-node-ssgb2\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:21.683563 master-2 kubenswrapper[4762]: I1014 13:07:21.683107 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-etc-openvswitch\") pod \"ovnkube-node-ssgb2\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 
13:07:21.683563 master-2 kubenswrapper[4762]: I1014 13:07:21.683188 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-var-lib-openvswitch\") pod \"ovnkube-node-ssgb2\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:21.683563 master-2 kubenswrapper[4762]: I1014 13:07:21.683224 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-host-cni-bin\") pod \"ovnkube-node-ssgb2\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:21.683563 master-2 kubenswrapper[4762]: I1014 13:07:21.683275 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fc445c69-944a-42d6-bb2e-53b0a745f970-ovnkube-config\") pod \"ovnkube-node-ssgb2\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:21.684733 master-2 kubenswrapper[4762]: I1014 13:07:21.684673 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fc445c69-944a-42d6-bb2e-53b0a745f970-ovnkube-config\") pod \"ovnkube-node-ssgb2\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:21.684831 master-2 kubenswrapper[4762]: I1014 13:07:21.684803 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-systemd-units\") pod \"ovnkube-node-ssgb2\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:21.684897 master-2 kubenswrapper[4762]: I1014 13:07:21.684855 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-node-log\") pod \"ovnkube-node-ssgb2\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:21.685716 master-2 kubenswrapper[4762]: I1014 13:07:21.685665 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ssgb2\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:21.687297 master-2 kubenswrapper[4762]: I1014 13:07:21.686700 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fc445c69-944a-42d6-bb2e-53b0a745f970-ovnkube-script-lib\") pod \"ovnkube-node-ssgb2\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:21.687297 master-2 kubenswrapper[4762]: I1014 13:07:21.687054 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-host-slash\") pod \"ovnkube-node-ssgb2\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" 
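The entries above trace the kubelet volume manager's usual two-phase flow for the ovnkube-node-ssgb2 pod: the reconciler first logs "VerifyControllerAttachedVolume started" (reconciler_common.go:245) for every volume in the pod spec, then "MountVolume started" (reconciler_common.go:218), and finally "MountVolume.SetUp succeeded" (operation_generator.go:637) once each configmap, secret, projected, or host-path source is materialized. As a rough illustration of that ordering only, not the kubelet's actual implementation (the type and function names below are invented for the sketch):

package main

import "fmt"

// volume is a simplified, hypothetical stand-in for a pod volume in the
// kubelet's desired state of the world (names invented for this sketch).
type volume struct {
	Name   string // e.g. "ovnkube-config"
	Plugin string // e.g. "kubernetes.io/configmap"
}

// reconcile sketches the two-phase pattern visible in the log: record that
// each volume is attached, then mount it, logging at each step.
func reconcile(pod string, vols []volume) {
	for _, v := range vols {
		fmt.Printf("VerifyControllerAttachedVolume started for volume %q pod=%q\n", v.Name, pod)
	}
	for _, v := range vols {
		fmt.Printf("MountVolume started for volume %q (%s) pod=%q\n", v.Name, v.Plugin, pod)
		// In the real kubelet, SetUp is delegated to the volume plugin and can
		// fail and be retried; this sketch simply assumes it succeeds.
		fmt.Printf("MountVolume.SetUp succeeded for volume %q pod=%q\n", v.Name, pod)
	}
}

func main() {
	reconcile("openshift-ovn-kubernetes/ovnkube-node-ssgb2", []volume{
		{Name: "ovnkube-config", Plugin: "kubernetes.io/configmap"},
		{Name: "ovn-node-metrics-cert", Plugin: "kubernetes.io/secret"},
		{Name: "host-cni-bin", Plugin: "kubernetes.io/host-path"},
	})
}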
Oct 14 13:07:21.687297 master-2 kubenswrapper[4762]: I1014 13:07:21.687106 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-log-socket\") pod \"ovnkube-node-ssgb2\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:21.687297 master-2 kubenswrapper[4762]: I1014 13:07:21.687185 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-host-cni-netd\") pod \"ovnkube-node-ssgb2\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:21.687953 master-2 kubenswrapper[4762]: I1014 13:07:21.687889 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fc445c69-944a-42d6-bb2e-53b0a745f970-env-overrides\") pod \"ovnkube-node-ssgb2\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:21.688051 master-2 kubenswrapper[4762]: I1014 13:07:21.688014 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-host-run-ovn-kubernetes\") pod \"ovnkube-node-ssgb2\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:21.688134 master-2 kubenswrapper[4762]: I1014 13:07:21.688096 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-host-kubelet\") pod \"ovnkube-node-ssgb2\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:21.688239 master-2 kubenswrapper[4762]: I1014 13:07:21.688218 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-run-ovn\") pod \"ovnkube-node-ssgb2\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:21.688384 master-2 kubenswrapper[4762]: I1014 13:07:21.688288 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-host-run-netns\") pod \"ovnkube-node-ssgb2\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:21.688477 master-2 kubenswrapper[4762]: I1014 13:07:21.688411 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-run-openvswitch\") pod \"ovnkube-node-ssgb2\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:21.689150 master-2 kubenswrapper[4762]: I1014 13:07:21.689083 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-run-systemd\") pod \"ovnkube-node-ssgb2\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:21.689757 master-2 kubenswrapper[4762]: I1014 13:07:21.689672 
4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-var-lib-openvswitch\") pod \"ovnkube-node-ssgb2\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:21.689896 master-2 kubenswrapper[4762]: I1014 13:07:21.689820 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-host-cni-bin\") pod \"ovnkube-node-ssgb2\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:21.690018 master-2 kubenswrapper[4762]: I1014 13:07:21.689986 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-etc-openvswitch\") pod \"ovnkube-node-ssgb2\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:21.693747 master-2 kubenswrapper[4762]: I1014 13:07:21.693666 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fc445c69-944a-42d6-bb2e-53b0a745f970-ovn-node-metrics-cert\") pod \"ovnkube-node-ssgb2\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:21.711476 master-2 kubenswrapper[4762]: I1014 13:07:21.711371 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwlhb\" (UniqueName: \"kubernetes.io/projected/fc445c69-944a-42d6-bb2e-53b0a745f970-kube-api-access-zwlhb\") pod \"ovnkube-node-ssgb2\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:21.811954 master-2 kubenswrapper[4762]: I1014 13:07:21.811761 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:22.795746 master-2 kubenswrapper[4762]: I1014 13:07:22.795559 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-g75pn" event={"ID":"23dff85b-806e-49e9-8913-cf308d64e7b8","Type":"ContainerStarted","Data":"8ace069423a3d16475d749845c5a9ed500ba8b4fb0a1c75fce073cc256819ed5"} Oct 14 13:07:22.797886 master-2 kubenswrapper[4762]: I1014 13:07:22.797826 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-864d695c77-vbf9m" event={"ID":"8d692cb4-5325-41b6-9058-c3d4870dee2a","Type":"ContainerStarted","Data":"6ba8e5f1eac7d2ab1018901fc3445ee24646780c013c147c24b0390a1e0da080"} Oct 14 13:07:22.797968 master-2 kubenswrapper[4762]: I1014 13:07:22.797884 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-864d695c77-vbf9m" event={"ID":"8d692cb4-5325-41b6-9058-c3d4870dee2a","Type":"ContainerStarted","Data":"240c186e0eaa555be21137ed742cd79004345d4bce1c3fd6fb7ecb6ac690c0c1"} Oct 14 13:07:22.799498 master-2 kubenswrapper[4762]: I1014 13:07:22.799430 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" event={"ID":"fc445c69-944a-42d6-bb2e-53b0a745f970","Type":"ContainerStarted","Data":"eb986386b50de8cb09548ae4a8ec006d6181a01b015ba299f247a2ccd99a271a"} Oct 14 13:07:22.802230 master-2 kubenswrapper[4762]: I1014 13:07:22.802179 4762 generic.go:334] "Generic (PLEG): container finished" podID="9eb3fcac-f577-42b9-823b-e05d43478814" containerID="fbd49b43e34f082f001a1ecd8159bcfe74016efc0263dd445a338c304ef6049c" exitCode=0 Oct 14 13:07:22.802324 master-2 kubenswrapper[4762]: I1014 13:07:22.802231 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mgfql" event={"ID":"9eb3fcac-f577-42b9-823b-e05d43478814","Type":"ContainerDied","Data":"fbd49b43e34f082f001a1ecd8159bcfe74016efc0263dd445a338c304ef6049c"} Oct 14 13:07:22.812613 master-2 kubenswrapper[4762]: I1014 13:07:22.812522 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-g75pn" podStartSLOduration=1.7055603000000001 podStartE2EDuration="14.81250149s" podCreationTimestamp="2025-10-14 13:07:08 +0000 UTC" firstStartedPulling="2025-10-14 13:07:09.215935881 +0000 UTC m=+58.460095080" lastFinishedPulling="2025-10-14 13:07:22.322877091 +0000 UTC m=+71.567036270" observedRunningTime="2025-10-14 13:07:22.811784087 +0000 UTC m=+72.055943316" watchObservedRunningTime="2025-10-14 13:07:22.81250149 +0000 UTC m=+72.056660689" Oct 14 13:07:23.548284 master-2 kubenswrapper[4762]: I1014 13:07:23.548240 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b84p7" Oct 14 13:07:23.548446 master-2 kubenswrapper[4762]: E1014 13:07:23.548390 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-b84p7" podUID="c5e8cdcd-bc1f-4b38-a834-809b79de4fd9" Oct 14 13:07:24.470359 master-2 kubenswrapper[4762]: I1014 13:07:24.470317 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-cb5bh"] Oct 14 13:07:24.470745 master-2 kubenswrapper[4762]: I1014 13:07:24.470600 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cb5bh" Oct 14 13:07:24.470745 master-2 kubenswrapper[4762]: E1014 13:07:24.470669 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cb5bh" podUID="f841b8dd-c459-4e20-b11a-9169905ad069" Oct 14 13:07:24.608829 master-2 kubenswrapper[4762]: I1014 13:07:24.608784 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spkpp\" (UniqueName: \"kubernetes.io/projected/f841b8dd-c459-4e20-b11a-9169905ad069-kube-api-access-spkpp\") pod \"network-check-target-cb5bh\" (UID: \"f841b8dd-c459-4e20-b11a-9169905ad069\") " pod="openshift-network-diagnostics/network-check-target-cb5bh" Oct 14 13:07:24.709735 master-2 kubenswrapper[4762]: I1014 13:07:24.709630 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spkpp\" (UniqueName: \"kubernetes.io/projected/f841b8dd-c459-4e20-b11a-9169905ad069-kube-api-access-spkpp\") pod \"network-check-target-cb5bh\" (UID: \"f841b8dd-c459-4e20-b11a-9169905ad069\") " pod="openshift-network-diagnostics/network-check-target-cb5bh" Oct 14 13:07:24.732420 master-2 kubenswrapper[4762]: E1014 13:07:24.732342 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 14 13:07:24.732420 master-2 kubenswrapper[4762]: E1014 13:07:24.732382 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 14 13:07:24.732420 master-2 kubenswrapper[4762]: E1014 13:07:24.732397 4762 projected.go:194] Error preparing data for projected volume kube-api-access-spkpp for pod openshift-network-diagnostics/network-check-target-cb5bh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 14 13:07:24.732686 master-2 kubenswrapper[4762]: E1014 13:07:24.732468 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f841b8dd-c459-4e20-b11a-9169905ad069-kube-api-access-spkpp podName:f841b8dd-c459-4e20-b11a-9169905ad069 nodeName:}" failed. No retries permitted until 2025-10-14 13:07:25.232448534 +0000 UTC m=+74.476607703 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-spkpp" (UniqueName: "kubernetes.io/projected/f841b8dd-c459-4e20-b11a-9169905ad069-kube-api-access-spkpp") pod "network-check-target-cb5bh" (UID: "f841b8dd-c459-4e20-b11a-9169905ad069") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 14 13:07:24.809525 master-2 kubenswrapper[4762]: I1014 13:07:24.809460 4762 generic.go:334] "Generic (PLEG): container finished" podID="9eb3fcac-f577-42b9-823b-e05d43478814" containerID="b8a857371fc9162c2d6f96021c36f9845fd2d1eea5c30a285d0620a88e04ebf7" exitCode=0 Oct 14 13:07:24.809525 master-2 kubenswrapper[4762]: I1014 13:07:24.809506 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mgfql" event={"ID":"9eb3fcac-f577-42b9-823b-e05d43478814","Type":"ContainerDied","Data":"b8a857371fc9162c2d6f96021c36f9845fd2d1eea5c30a285d0620a88e04ebf7"} Oct 14 13:07:25.314359 master-2 kubenswrapper[4762]: I1014 13:07:25.314271 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spkpp\" (UniqueName: \"kubernetes.io/projected/f841b8dd-c459-4e20-b11a-9169905ad069-kube-api-access-spkpp\") pod \"network-check-target-cb5bh\" (UID: \"f841b8dd-c459-4e20-b11a-9169905ad069\") " pod="openshift-network-diagnostics/network-check-target-cb5bh" Oct 14 13:07:25.314545 master-2 kubenswrapper[4762]: E1014 13:07:25.314497 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 14 13:07:25.314545 master-2 kubenswrapper[4762]: E1014 13:07:25.314536 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 14 13:07:25.314632 master-2 kubenswrapper[4762]: E1014 13:07:25.314559 4762 projected.go:194] Error preparing data for projected volume kube-api-access-spkpp for pod openshift-network-diagnostics/network-check-target-cb5bh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 14 13:07:25.314665 master-2 kubenswrapper[4762]: E1014 13:07:25.314650 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f841b8dd-c459-4e20-b11a-9169905ad069-kube-api-access-spkpp podName:f841b8dd-c459-4e20-b11a-9169905ad069 nodeName:}" failed. No retries permitted until 2025-10-14 13:07:26.314618533 +0000 UTC m=+75.558777732 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-spkpp" (UniqueName: "kubernetes.io/projected/f841b8dd-c459-4e20-b11a-9169905ad069-kube-api-access-spkpp") pod "network-check-target-cb5bh" (UID: "f841b8dd-c459-4e20-b11a-9169905ad069") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 14 13:07:25.548040 master-2 kubenswrapper[4762]: I1014 13:07:25.547969 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-b84p7" Oct 14 13:07:25.548688 master-2 kubenswrapper[4762]: E1014 13:07:25.548266 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b84p7" podUID="c5e8cdcd-bc1f-4b38-a834-809b79de4fd9" Oct 14 13:07:25.717846 master-2 kubenswrapper[4762]: I1014 13:07:25.717776 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c5e8cdcd-bc1f-4b38-a834-809b79de4fd9-metrics-certs\") pod \"network-metrics-daemon-b84p7\" (UID: \"c5e8cdcd-bc1f-4b38-a834-809b79de4fd9\") " pod="openshift-multus/network-metrics-daemon-b84p7" Oct 14 13:07:25.717989 master-2 kubenswrapper[4762]: E1014 13:07:25.717964 4762 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 14 13:07:25.718077 master-2 kubenswrapper[4762]: E1014 13:07:25.718030 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c5e8cdcd-bc1f-4b38-a834-809b79de4fd9-metrics-certs podName:c5e8cdcd-bc1f-4b38-a834-809b79de4fd9 nodeName:}" failed. No retries permitted until 2025-10-14 13:07:41.718014605 +0000 UTC m=+90.962173774 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c5e8cdcd-bc1f-4b38-a834-809b79de4fd9-metrics-certs") pod "network-metrics-daemon-b84p7" (UID: "c5e8cdcd-bc1f-4b38-a834-809b79de4fd9") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 14 13:07:26.324776 master-2 kubenswrapper[4762]: I1014 13:07:26.324674 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spkpp\" (UniqueName: \"kubernetes.io/projected/f841b8dd-c459-4e20-b11a-9169905ad069-kube-api-access-spkpp\") pod \"network-check-target-cb5bh\" (UID: \"f841b8dd-c459-4e20-b11a-9169905ad069\") " pod="openshift-network-diagnostics/network-check-target-cb5bh" Oct 14 13:07:26.325101 master-2 kubenswrapper[4762]: E1014 13:07:26.324964 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 14 13:07:26.325101 master-2 kubenswrapper[4762]: E1014 13:07:26.325012 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 14 13:07:26.325101 master-2 kubenswrapper[4762]: E1014 13:07:26.325037 4762 projected.go:194] Error preparing data for projected volume kube-api-access-spkpp for pod openshift-network-diagnostics/network-check-target-cb5bh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 14 13:07:26.325409 master-2 kubenswrapper[4762]: E1014 13:07:26.325122 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f841b8dd-c459-4e20-b11a-9169905ad069-kube-api-access-spkpp podName:f841b8dd-c459-4e20-b11a-9169905ad069 nodeName:}" failed. 
No retries permitted until 2025-10-14 13:07:28.325098078 +0000 UTC m=+77.569257267 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-spkpp" (UniqueName: "kubernetes.io/projected/f841b8dd-c459-4e20-b11a-9169905ad069-kube-api-access-spkpp") pod "network-check-target-cb5bh" (UID: "f841b8dd-c459-4e20-b11a-9169905ad069") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 14 13:07:26.548360 master-2 kubenswrapper[4762]: I1014 13:07:26.548257 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cb5bh" Oct 14 13:07:26.549245 master-2 kubenswrapper[4762]: E1014 13:07:26.548390 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cb5bh" podUID="f841b8dd-c459-4e20-b11a-9169905ad069" Oct 14 13:07:26.817707 master-2 kubenswrapper[4762]: I1014 13:07:26.817586 4762 generic.go:334] "Generic (PLEG): container finished" podID="9eb3fcac-f577-42b9-823b-e05d43478814" containerID="e9af612e799b734e0700f6d113af7ac76c919922e88b4824ebf94aa7caddc777" exitCode=0 Oct 14 13:07:26.817707 master-2 kubenswrapper[4762]: I1014 13:07:26.817674 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mgfql" event={"ID":"9eb3fcac-f577-42b9-823b-e05d43478814","Type":"ContainerDied","Data":"e9af612e799b734e0700f6d113af7ac76c919922e88b4824ebf94aa7caddc777"} Oct 14 13:07:27.066914 master-2 kubenswrapper[4762]: I1014 13:07:27.066782 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-5tzml"] Oct 14 13:07:27.067384 master-2 kubenswrapper[4762]: I1014 13:07:27.067090 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-5tzml" Oct 14 13:07:27.070688 master-2 kubenswrapper[4762]: I1014 13:07:27.070419 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Oct 14 13:07:27.070688 master-2 kubenswrapper[4762]: I1014 13:07:27.070483 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Oct 14 13:07:27.070688 master-2 kubenswrapper[4762]: I1014 13:07:27.070483 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Oct 14 13:07:27.070688 master-2 kubenswrapper[4762]: I1014 13:07:27.070569 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Oct 14 13:07:27.070688 master-2 kubenswrapper[4762]: I1014 13:07:27.070591 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Oct 14 13:07:27.233837 master-2 kubenswrapper[4762]: I1014 13:07:27.233782 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5e9cbb85-261b-485e-8bd4-b4d38108c06e-webhook-cert\") pod \"network-node-identity-5tzml\" (UID: \"5e9cbb85-261b-485e-8bd4-b4d38108c06e\") " pod="openshift-network-node-identity/network-node-identity-5tzml" Oct 14 13:07:27.233837 master-2 kubenswrapper[4762]: I1014 13:07:27.233822 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmjk6\" (UniqueName: \"kubernetes.io/projected/5e9cbb85-261b-485e-8bd4-b4d38108c06e-kube-api-access-jmjk6\") pod \"network-node-identity-5tzml\" (UID: \"5e9cbb85-261b-485e-8bd4-b4d38108c06e\") " pod="openshift-network-node-identity/network-node-identity-5tzml" Oct 14 13:07:27.233837 master-2 kubenswrapper[4762]: I1014 13:07:27.233845 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5e9cbb85-261b-485e-8bd4-b4d38108c06e-env-overrides\") pod \"network-node-identity-5tzml\" (UID: \"5e9cbb85-261b-485e-8bd4-b4d38108c06e\") " pod="openshift-network-node-identity/network-node-identity-5tzml" Oct 14 13:07:27.234108 master-2 kubenswrapper[4762]: I1014 13:07:27.233867 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/5e9cbb85-261b-485e-8bd4-b4d38108c06e-ovnkube-identity-cm\") pod \"network-node-identity-5tzml\" (UID: \"5e9cbb85-261b-485e-8bd4-b4d38108c06e\") " pod="openshift-network-node-identity/network-node-identity-5tzml" Oct 14 13:07:27.335051 master-2 kubenswrapper[4762]: I1014 13:07:27.334805 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/5e9cbb85-261b-485e-8bd4-b4d38108c06e-ovnkube-identity-cm\") pod \"network-node-identity-5tzml\" (UID: \"5e9cbb85-261b-485e-8bd4-b4d38108c06e\") " pod="openshift-network-node-identity/network-node-identity-5tzml" Oct 14 13:07:27.335051 master-2 kubenswrapper[4762]: I1014 13:07:27.334857 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/5e9cbb85-261b-485e-8bd4-b4d38108c06e-webhook-cert\") pod \"network-node-identity-5tzml\" (UID: \"5e9cbb85-261b-485e-8bd4-b4d38108c06e\") " pod="openshift-network-node-identity/network-node-identity-5tzml" Oct 14 13:07:27.335051 master-2 kubenswrapper[4762]: I1014 13:07:27.334878 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmjk6\" (UniqueName: \"kubernetes.io/projected/5e9cbb85-261b-485e-8bd4-b4d38108c06e-kube-api-access-jmjk6\") pod \"network-node-identity-5tzml\" (UID: \"5e9cbb85-261b-485e-8bd4-b4d38108c06e\") " pod="openshift-network-node-identity/network-node-identity-5tzml" Oct 14 13:07:27.335051 master-2 kubenswrapper[4762]: I1014 13:07:27.334895 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5e9cbb85-261b-485e-8bd4-b4d38108c06e-env-overrides\") pod \"network-node-identity-5tzml\" (UID: \"5e9cbb85-261b-485e-8bd4-b4d38108c06e\") " pod="openshift-network-node-identity/network-node-identity-5tzml" Oct 14 13:07:27.335723 master-2 kubenswrapper[4762]: I1014 13:07:27.335544 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5e9cbb85-261b-485e-8bd4-b4d38108c06e-env-overrides\") pod \"network-node-identity-5tzml\" (UID: \"5e9cbb85-261b-485e-8bd4-b4d38108c06e\") " pod="openshift-network-node-identity/network-node-identity-5tzml" Oct 14 13:07:27.336060 master-2 kubenswrapper[4762]: I1014 13:07:27.335985 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/5e9cbb85-261b-485e-8bd4-b4d38108c06e-ovnkube-identity-cm\") pod \"network-node-identity-5tzml\" (UID: \"5e9cbb85-261b-485e-8bd4-b4d38108c06e\") " pod="openshift-network-node-identity/network-node-identity-5tzml" Oct 14 13:07:27.340341 master-2 kubenswrapper[4762]: I1014 13:07:27.340306 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5e9cbb85-261b-485e-8bd4-b4d38108c06e-webhook-cert\") pod \"network-node-identity-5tzml\" (UID: \"5e9cbb85-261b-485e-8bd4-b4d38108c06e\") " pod="openshift-network-node-identity/network-node-identity-5tzml" Oct 14 13:07:27.365931 master-2 kubenswrapper[4762]: I1014 13:07:27.365766 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmjk6\" (UniqueName: \"kubernetes.io/projected/5e9cbb85-261b-485e-8bd4-b4d38108c06e-kube-api-access-jmjk6\") pod \"network-node-identity-5tzml\" (UID: \"5e9cbb85-261b-485e-8bd4-b4d38108c06e\") " pod="openshift-network-node-identity/network-node-identity-5tzml" Oct 14 13:07:27.383407 master-2 kubenswrapper[4762]: I1014 13:07:27.383299 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-5tzml" Oct 14 13:07:27.393562 master-2 kubenswrapper[4762]: W1014 13:07:27.393468 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e9cbb85_261b_485e_8bd4_b4d38108c06e.slice/crio-8f9250d0d99fe82089f3d7f0883dc8fc7aecafe8a4bb2770b20a955ab571c1de WatchSource:0}: Error finding container 8f9250d0d99fe82089f3d7f0883dc8fc7aecafe8a4bb2770b20a955ab571c1de: Status 404 returned error can't find the container with id 8f9250d0d99fe82089f3d7f0883dc8fc7aecafe8a4bb2770b20a955ab571c1de Oct 14 13:07:27.547472 master-2 kubenswrapper[4762]: I1014 13:07:27.547402 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b84p7" Oct 14 13:07:27.547718 master-2 kubenswrapper[4762]: E1014 13:07:27.547536 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b84p7" podUID="c5e8cdcd-bc1f-4b38-a834-809b79de4fd9" Oct 14 13:07:27.821267 master-2 kubenswrapper[4762]: I1014 13:07:27.821193 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-5tzml" event={"ID":"5e9cbb85-261b-485e-8bd4-b4d38108c06e","Type":"ContainerStarted","Data":"8f9250d0d99fe82089f3d7f0883dc8fc7aecafe8a4bb2770b20a955ab571c1de"} Oct 14 13:07:28.343754 master-2 kubenswrapper[4762]: I1014 13:07:28.343691 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spkpp\" (UniqueName: \"kubernetes.io/projected/f841b8dd-c459-4e20-b11a-9169905ad069-kube-api-access-spkpp\") pod \"network-check-target-cb5bh\" (UID: \"f841b8dd-c459-4e20-b11a-9169905ad069\") " pod="openshift-network-diagnostics/network-check-target-cb5bh" Oct 14 13:07:28.343999 master-2 kubenswrapper[4762]: E1014 13:07:28.343950 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 14 13:07:28.344063 master-2 kubenswrapper[4762]: E1014 13:07:28.344005 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 14 13:07:28.344063 master-2 kubenswrapper[4762]: E1014 13:07:28.344027 4762 projected.go:194] Error preparing data for projected volume kube-api-access-spkpp for pod openshift-network-diagnostics/network-check-target-cb5bh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 14 13:07:28.344137 master-2 kubenswrapper[4762]: E1014 13:07:28.344115 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f841b8dd-c459-4e20-b11a-9169905ad069-kube-api-access-spkpp podName:f841b8dd-c459-4e20-b11a-9169905ad069 nodeName:}" failed. No retries permitted until 2025-10-14 13:07:32.344080694 +0000 UTC m=+81.588239893 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-spkpp" (UniqueName: "kubernetes.io/projected/f841b8dd-c459-4e20-b11a-9169905ad069-kube-api-access-spkpp") pod "network-check-target-cb5bh" (UID: "f841b8dd-c459-4e20-b11a-9169905ad069") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 14 13:07:28.547973 master-2 kubenswrapper[4762]: I1014 13:07:28.547875 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cb5bh" Oct 14 13:07:28.548294 master-2 kubenswrapper[4762]: E1014 13:07:28.548014 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cb5bh" podUID="f841b8dd-c459-4e20-b11a-9169905ad069" Oct 14 13:07:29.547533 master-2 kubenswrapper[4762]: I1014 13:07:29.547428 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b84p7" Oct 14 13:07:29.548379 master-2 kubenswrapper[4762]: E1014 13:07:29.547656 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b84p7" podUID="c5e8cdcd-bc1f-4b38-a834-809b79de4fd9" Oct 14 13:07:30.548035 master-2 kubenswrapper[4762]: I1014 13:07:30.547985 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cb5bh" Oct 14 13:07:30.548620 master-2 kubenswrapper[4762]: E1014 13:07:30.548108 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cb5bh" podUID="f841b8dd-c459-4e20-b11a-9169905ad069" Oct 14 13:07:31.547400 master-2 kubenswrapper[4762]: I1014 13:07:31.547359 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b84p7" Oct 14 13:07:31.548111 master-2 kubenswrapper[4762]: E1014 13:07:31.548063 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-b84p7" podUID="c5e8cdcd-bc1f-4b38-a834-809b79de4fd9" Oct 14 13:07:32.383057 master-2 kubenswrapper[4762]: I1014 13:07:32.382955 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spkpp\" (UniqueName: \"kubernetes.io/projected/f841b8dd-c459-4e20-b11a-9169905ad069-kube-api-access-spkpp\") pod \"network-check-target-cb5bh\" (UID: \"f841b8dd-c459-4e20-b11a-9169905ad069\") " pod="openshift-network-diagnostics/network-check-target-cb5bh" Oct 14 13:07:32.383335 master-2 kubenswrapper[4762]: E1014 13:07:32.383132 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 14 13:07:32.383335 master-2 kubenswrapper[4762]: E1014 13:07:32.383167 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 14 13:07:32.383335 master-2 kubenswrapper[4762]: E1014 13:07:32.383178 4762 projected.go:194] Error preparing data for projected volume kube-api-access-spkpp for pod openshift-network-diagnostics/network-check-target-cb5bh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 14 13:07:32.383335 master-2 kubenswrapper[4762]: E1014 13:07:32.383228 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f841b8dd-c459-4e20-b11a-9169905ad069-kube-api-access-spkpp podName:f841b8dd-c459-4e20-b11a-9169905ad069 nodeName:}" failed. No retries permitted until 2025-10-14 13:07:40.383213116 +0000 UTC m=+89.627372275 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-spkpp" (UniqueName: "kubernetes.io/projected/f841b8dd-c459-4e20-b11a-9169905ad069-kube-api-access-spkpp") pod "network-check-target-cb5bh" (UID: "f841b8dd-c459-4e20-b11a-9169905ad069") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 14 13:07:32.548347 master-2 kubenswrapper[4762]: I1014 13:07:32.548304 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cb5bh" Oct 14 13:07:32.548802 master-2 kubenswrapper[4762]: E1014 13:07:32.548434 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cb5bh" podUID="f841b8dd-c459-4e20-b11a-9169905ad069" Oct 14 13:07:33.547776 master-2 kubenswrapper[4762]: I1014 13:07:33.547735 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b84p7" Oct 14 13:07:33.547978 master-2 kubenswrapper[4762]: E1014 13:07:33.547844 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-b84p7" podUID="c5e8cdcd-bc1f-4b38-a834-809b79de4fd9" Oct 14 13:07:33.548563 master-2 kubenswrapper[4762]: I1014 13:07:33.548530 4762 scope.go:117] "RemoveContainer" containerID="666b60df5ef43af6268df15ab66505cea6404953afaca848b40e96d57e75ac0c" Oct 14 13:07:34.547440 master-2 kubenswrapper[4762]: I1014 13:07:34.547369 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cb5bh" Oct 14 13:07:34.547648 master-2 kubenswrapper[4762]: E1014 13:07:34.547575 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cb5bh" podUID="f841b8dd-c459-4e20-b11a-9169905ad069" Oct 14 13:07:35.548114 master-2 kubenswrapper[4762]: I1014 13:07:35.548029 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b84p7" Oct 14 13:07:35.549037 master-2 kubenswrapper[4762]: E1014 13:07:35.548249 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b84p7" podUID="c5e8cdcd-bc1f-4b38-a834-809b79de4fd9" Oct 14 13:07:36.313747 master-2 kubenswrapper[4762]: I1014 13:07:36.313674 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0bf2b14-2719-4b1b-a661-fbf4d27c05dc-serving-cert\") pod \"cluster-version-operator-55bd67947c-872k9\" (UID: \"d0bf2b14-2719-4b1b-a661-fbf4d27c05dc\") " pod="openshift-cluster-version/cluster-version-operator-55bd67947c-872k9" Oct 14 13:07:36.313931 master-2 kubenswrapper[4762]: E1014 13:07:36.313867 4762 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Oct 14 13:07:36.313971 master-2 kubenswrapper[4762]: E1014 13:07:36.313949 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0bf2b14-2719-4b1b-a661-fbf4d27c05dc-serving-cert podName:d0bf2b14-2719-4b1b-a661-fbf4d27c05dc nodeName:}" failed. No retries permitted until 2025-10-14 13:08:40.313924232 +0000 UTC m=+149.558083431 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/d0bf2b14-2719-4b1b-a661-fbf4d27c05dc-serving-cert") pod "cluster-version-operator-55bd67947c-872k9" (UID: "d0bf2b14-2719-4b1b-a661-fbf4d27c05dc") : secret "cluster-version-operator-serving-cert" not found Oct 14 13:07:36.547340 master-2 kubenswrapper[4762]: I1014 13:07:36.547281 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cb5bh" Oct 14 13:07:36.547534 master-2 kubenswrapper[4762]: E1014 13:07:36.547431 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cb5bh" podUID="f841b8dd-c459-4e20-b11a-9169905ad069" Oct 14 13:07:37.549527 master-2 kubenswrapper[4762]: I1014 13:07:37.548315 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b84p7" Oct 14 13:07:37.549527 master-2 kubenswrapper[4762]: E1014 13:07:37.548980 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b84p7" podUID="c5e8cdcd-bc1f-4b38-a834-809b79de4fd9" Oct 14 13:07:37.845679 master-2 kubenswrapper[4762]: I1014 13:07:37.845487 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-5tzml" event={"ID":"5e9cbb85-261b-485e-8bd4-b4d38108c06e","Type":"ContainerStarted","Data":"a4626dd24ee114b1165bf3e69a213527412105025e38e6fa6a694c9aeb3ab6d5"} Oct 14 13:07:37.845679 master-2 kubenswrapper[4762]: I1014 13:07:37.845551 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-5tzml" event={"ID":"5e9cbb85-261b-485e-8bd4-b4d38108c06e","Type":"ContainerStarted","Data":"e807780a118a48fdcd5463f712848a68aec0e3e1739fc107e838a90976fd10cf"} Oct 14 13:07:37.847931 master-2 kubenswrapper[4762]: I1014 13:07:37.847874 4762 generic.go:334] "Generic (PLEG): container finished" podID="fc445c69-944a-42d6-bb2e-53b0a745f970" containerID="6e7af2c749207c088256f7e750c8e09235c4eff12577bc43829f252b95f2884b" exitCode=0 Oct 14 13:07:37.848081 master-2 kubenswrapper[4762]: I1014 13:07:37.847975 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" event={"ID":"fc445c69-944a-42d6-bb2e-53b0a745f970","Type":"ContainerDied","Data":"6e7af2c749207c088256f7e750c8e09235c4eff12577bc43829f252b95f2884b"} Oct 14 13:07:37.854932 master-2 kubenswrapper[4762]: I1014 13:07:37.854886 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-779749f859-bscv5_18346e46-a062-4e0d-b90a-c05646a46c7e/kube-rbac-proxy/3.log" Oct 14 13:07:37.856232 master-2 kubenswrapper[4762]: I1014 13:07:37.856065 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-779749f859-bscv5_18346e46-a062-4e0d-b90a-c05646a46c7e/kube-rbac-proxy/2.log" Oct 14 13:07:37.857312 master-2 kubenswrapper[4762]: I1014 13:07:37.857256 4762 generic.go:334] "Generic (PLEG): container finished" podID="18346e46-a062-4e0d-b90a-c05646a46c7e" containerID="24baf780158d33f49e8bc55431e7447f00a45df3ba8830080f7795c97b34c03b" exitCode=1 Oct 14 13:07:37.857447 master-2 kubenswrapper[4762]: I1014 13:07:37.857331 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-779749f859-bscv5" event={"ID":"18346e46-a062-4e0d-b90a-c05646a46c7e","Type":"ContainerDied","Data":"24baf780158d33f49e8bc55431e7447f00a45df3ba8830080f7795c97b34c03b"} Oct 14 13:07:37.857447 master-2 kubenswrapper[4762]: I1014 13:07:37.857400 4762 scope.go:117] "RemoveContainer" containerID="666b60df5ef43af6268df15ab66505cea6404953afaca848b40e96d57e75ac0c" Oct 14 13:07:37.858197 master-2 kubenswrapper[4762]: I1014 13:07:37.858131 4762 scope.go:117] "RemoveContainer" containerID="24baf780158d33f49e8bc55431e7447f00a45df3ba8830080f7795c97b34c03b" Oct 14 13:07:37.858679 master-2 kubenswrapper[4762]: E1014 13:07:37.858489 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-rbac-proxy pod=cluster-cloud-controller-manager-operator-779749f859-bscv5_openshift-cloud-controller-manager-operator(18346e46-a062-4e0d-b90a-c05646a46c7e)\"" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-779749f859-bscv5" podUID="18346e46-a062-4e0d-b90a-c05646a46c7e" Oct 14 13:07:37.866210 master-2 kubenswrapper[4762]: I1014 13:07:37.865710 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-node-identity/network-node-identity-5tzml" podStartSLOduration=0.836681208 podStartE2EDuration="10.865683043s" podCreationTimestamp="2025-10-14 13:07:27 +0000 UTC" firstStartedPulling="2025-10-14 13:07:27.396378719 +0000 UTC m=+76.640537888" lastFinishedPulling="2025-10-14 13:07:37.425380554 +0000 UTC m=+86.669539723" observedRunningTime="2025-10-14 13:07:37.864827665 +0000 UTC m=+87.108986864" watchObservedRunningTime="2025-10-14 13:07:37.865683043 +0000 UTC m=+87.109842232" Oct 14 13:07:37.869336 master-2 kubenswrapper[4762]: I1014 13:07:37.869266 4762 generic.go:334] "Generic (PLEG): container finished" podID="9eb3fcac-f577-42b9-823b-e05d43478814" containerID="556ff1003162f9406854a024f6abb822e31b5983d016321ffbe6da6c97d4ba42" exitCode=0 Oct 14 13:07:37.869456 master-2 kubenswrapper[4762]: I1014 13:07:37.869355 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mgfql" event={"ID":"9eb3fcac-f577-42b9-823b-e05d43478814","Type":"ContainerDied","Data":"556ff1003162f9406854a024f6abb822e31b5983d016321ffbe6da6c97d4ba42"} Oct 14 13:07:37.872456 master-2 kubenswrapper[4762]: I1014 13:07:37.872415 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-864d695c77-vbf9m" event={"ID":"8d692cb4-5325-41b6-9058-c3d4870dee2a","Type":"ContainerStarted","Data":"8768f37f7e5beffd121af2f6150a7c35e86c05b4f08f16f913afac67cb63d846"} Oct 14 13:07:37.962468 master-2 kubenswrapper[4762]: I1014 13:07:37.962375 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-864d695c77-vbf9m" podStartSLOduration=2.01210719 podStartE2EDuration="16.962352662s" podCreationTimestamp="2025-10-14 13:07:21 +0000 UTC" firstStartedPulling="2025-10-14 13:07:22.443611418 +0000 UTC m=+71.687770587" lastFinishedPulling="2025-10-14 13:07:37.3938569 +0000 UTC m=+86.638016059" observedRunningTime="2025-10-14 13:07:37.936393825 +0000 UTC m=+87.180553024" watchObservedRunningTime="2025-10-14 13:07:37.962352662 +0000 UTC m=+87.206511861" Oct 14 13:07:38.548466 master-2 kubenswrapper[4762]: I1014 13:07:38.547973 
4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cb5bh" Oct 14 13:07:38.548693 master-2 kubenswrapper[4762]: E1014 13:07:38.548574 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cb5bh" podUID="f841b8dd-c459-4e20-b11a-9169905ad069" Oct 14 13:07:38.879805 master-2 kubenswrapper[4762]: I1014 13:07:38.879727 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" event={"ID":"fc445c69-944a-42d6-bb2e-53b0a745f970","Type":"ContainerStarted","Data":"8aec5d6fee6b63deedf57e35507390f40ef5c3c9f49c67be73dbebe3b090e9fd"} Oct 14 13:07:38.879805 master-2 kubenswrapper[4762]: I1014 13:07:38.879781 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" event={"ID":"fc445c69-944a-42d6-bb2e-53b0a745f970","Type":"ContainerStarted","Data":"1f31afa10aebb19b765ab8b9786df7e425e6f6bd93533589919f088518d31965"} Oct 14 13:07:38.879805 master-2 kubenswrapper[4762]: I1014 13:07:38.879800 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" event={"ID":"fc445c69-944a-42d6-bb2e-53b0a745f970","Type":"ContainerStarted","Data":"ce2c76c5640bb76d593c7df356e2b23b40878a3d7b2b519b45f950a43efa105e"} Oct 14 13:07:38.881946 master-2 kubenswrapper[4762]: I1014 13:07:38.881868 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-779749f859-bscv5_18346e46-a062-4e0d-b90a-c05646a46c7e/kube-rbac-proxy/3.log" Oct 14 13:07:38.887858 master-2 kubenswrapper[4762]: I1014 13:07:38.887786 4762 generic.go:334] "Generic (PLEG): container finished" podID="9eb3fcac-f577-42b9-823b-e05d43478814" containerID="6a37728364ecbb03f1a5dd701975766f74e0db221185deabdd80a5339043ceb3" exitCode=0 Oct 14 13:07:38.887858 master-2 kubenswrapper[4762]: I1014 13:07:38.887850 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mgfql" event={"ID":"9eb3fcac-f577-42b9-823b-e05d43478814","Type":"ContainerDied","Data":"6a37728364ecbb03f1a5dd701975766f74e0db221185deabdd80a5339043ceb3"} Oct 14 13:07:39.548508 master-2 kubenswrapper[4762]: I1014 13:07:39.548427 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b84p7" Oct 14 13:07:39.548744 master-2 kubenswrapper[4762]: E1014 13:07:39.548605 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-b84p7" podUID="c5e8cdcd-bc1f-4b38-a834-809b79de4fd9" Oct 14 13:07:39.896729 master-2 kubenswrapper[4762]: I1014 13:07:39.896640 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mgfql" event={"ID":"9eb3fcac-f577-42b9-823b-e05d43478814","Type":"ContainerStarted","Data":"1f4f19001d8346eeb9e00b7f48764fa90ca02b113922e6a6160d2c62d6f64701"} Oct 14 13:07:39.901625 master-2 kubenswrapper[4762]: I1014 13:07:39.901549 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" event={"ID":"fc445c69-944a-42d6-bb2e-53b0a745f970","Type":"ContainerStarted","Data":"8e369df7e7794fdb51cf6375c3ca8887e6897a156455d88b64cc25eab02dff7e"} Oct 14 13:07:39.901625 master-2 kubenswrapper[4762]: I1014 13:07:39.901615 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" event={"ID":"fc445c69-944a-42d6-bb2e-53b0a745f970","Type":"ContainerStarted","Data":"717d972782394b130c70b16d840d826c8f3a1040ed136832c20d832786f95608"} Oct 14 13:07:40.457844 master-2 kubenswrapper[4762]: I1014 13:07:40.457608 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spkpp\" (UniqueName: \"kubernetes.io/projected/f841b8dd-c459-4e20-b11a-9169905ad069-kube-api-access-spkpp\") pod \"network-check-target-cb5bh\" (UID: \"f841b8dd-c459-4e20-b11a-9169905ad069\") " pod="openshift-network-diagnostics/network-check-target-cb5bh" Oct 14 13:07:40.457844 master-2 kubenswrapper[4762]: E1014 13:07:40.457810 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 14 13:07:40.457844 master-2 kubenswrapper[4762]: E1014 13:07:40.457851 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 14 13:07:40.458519 master-2 kubenswrapper[4762]: E1014 13:07:40.457874 4762 projected.go:194] Error preparing data for projected volume kube-api-access-spkpp for pod openshift-network-diagnostics/network-check-target-cb5bh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 14 13:07:40.458519 master-2 kubenswrapper[4762]: E1014 13:07:40.457964 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f841b8dd-c459-4e20-b11a-9169905ad069-kube-api-access-spkpp podName:f841b8dd-c459-4e20-b11a-9169905ad069 nodeName:}" failed. No retries permitted until 2025-10-14 13:07:56.457937835 +0000 UTC m=+105.702097034 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-spkpp" (UniqueName: "kubernetes.io/projected/f841b8dd-c459-4e20-b11a-9169905ad069-kube-api-access-spkpp") pod "network-check-target-cb5bh" (UID: "f841b8dd-c459-4e20-b11a-9169905ad069") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 14 13:07:40.548531 master-2 kubenswrapper[4762]: I1014 13:07:40.548425 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cb5bh" Oct 14 13:07:40.548804 master-2 kubenswrapper[4762]: E1014 13:07:40.548649 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cb5bh" podUID="f841b8dd-c459-4e20-b11a-9169905ad069" Oct 14 13:07:40.910464 master-2 kubenswrapper[4762]: I1014 13:07:40.909489 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" event={"ID":"fc445c69-944a-42d6-bb2e-53b0a745f970","Type":"ContainerStarted","Data":"b9ae4c339e6d1505c7bff17d665c0ca9dfc9a584e38051299b439c39e3d425cb"} Oct 14 13:07:40.933950 master-2 kubenswrapper[4762]: I1014 13:07:40.933674 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-mgfql" podStartSLOduration=4.051879979 podStartE2EDuration="31.933645321s" podCreationTimestamp="2025-10-14 13:07:09 +0000 UTC" firstStartedPulling="2025-10-14 13:07:09.406601107 +0000 UTC m=+58.650760266" lastFinishedPulling="2025-10-14 13:07:37.288366429 +0000 UTC m=+86.532525608" observedRunningTime="2025-10-14 13:07:40.933274669 +0000 UTC m=+90.177433888" watchObservedRunningTime="2025-10-14 13:07:40.933645321 +0000 UTC m=+90.177804530" Oct 14 13:07:41.548210 master-2 kubenswrapper[4762]: I1014 13:07:41.548112 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b84p7" Oct 14 13:07:41.549714 master-2 kubenswrapper[4762]: E1014 13:07:41.549629 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b84p7" podUID="c5e8cdcd-bc1f-4b38-a834-809b79de4fd9" Oct 14 13:07:41.770773 master-2 kubenswrapper[4762]: I1014 13:07:41.770686 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c5e8cdcd-bc1f-4b38-a834-809b79de4fd9-metrics-certs\") pod \"network-metrics-daemon-b84p7\" (UID: \"c5e8cdcd-bc1f-4b38-a834-809b79de4fd9\") " pod="openshift-multus/network-metrics-daemon-b84p7" Oct 14 13:07:41.771031 master-2 kubenswrapper[4762]: E1014 13:07:41.770883 4762 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 14 13:07:41.771031 master-2 kubenswrapper[4762]: E1014 13:07:41.770991 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c5e8cdcd-bc1f-4b38-a834-809b79de4fd9-metrics-certs podName:c5e8cdcd-bc1f-4b38-a834-809b79de4fd9 nodeName:}" failed. No retries permitted until 2025-10-14 13:08:13.770959079 +0000 UTC m=+123.015118268 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c5e8cdcd-bc1f-4b38-a834-809b79de4fd9-metrics-certs") pod "network-metrics-daemon-b84p7" (UID: "c5e8cdcd-bc1f-4b38-a834-809b79de4fd9") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 14 13:07:42.547523 master-2 kubenswrapper[4762]: I1014 13:07:42.547417 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cb5bh" Oct 14 13:07:42.548722 master-2 kubenswrapper[4762]: E1014 13:07:42.547619 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cb5bh" podUID="f841b8dd-c459-4e20-b11a-9169905ad069" Oct 14 13:07:43.548109 master-2 kubenswrapper[4762]: I1014 13:07:43.547971 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b84p7" Oct 14 13:07:43.548109 master-2 kubenswrapper[4762]: E1014 13:07:43.548098 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b84p7" podUID="c5e8cdcd-bc1f-4b38-a834-809b79de4fd9" Oct 14 13:07:43.920885 master-2 kubenswrapper[4762]: I1014 13:07:43.920814 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" event={"ID":"fc445c69-944a-42d6-bb2e-53b0a745f970","Type":"ContainerStarted","Data":"0d78d6095afabe0438eca485582d3a5e5643e0961cac715e7355bf90d498fd15"} Oct 14 13:07:44.547786 master-2 kubenswrapper[4762]: I1014 13:07:44.547658 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cb5bh" Oct 14 13:07:44.548076 master-2 kubenswrapper[4762]: E1014 13:07:44.547814 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cb5bh" podUID="f841b8dd-c459-4e20-b11a-9169905ad069" Oct 14 13:07:45.547989 master-2 kubenswrapper[4762]: I1014 13:07:45.547927 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b84p7" Oct 14 13:07:45.548490 master-2 kubenswrapper[4762]: E1014 13:07:45.548090 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-b84p7" podUID="c5e8cdcd-bc1f-4b38-a834-809b79de4fd9" Oct 14 13:07:45.934100 master-2 kubenswrapper[4762]: I1014 13:07:45.933580 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" event={"ID":"fc445c69-944a-42d6-bb2e-53b0a745f970","Type":"ContainerStarted","Data":"19bf49b88afafea73feb8682c0fb99839af8e541f1c651b1f0d6224489eeaf99"} Oct 14 13:07:45.934100 master-2 kubenswrapper[4762]: I1014 13:07:45.934084 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:45.934100 master-2 kubenswrapper[4762]: I1014 13:07:45.934121 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:45.934100 master-2 kubenswrapper[4762]: I1014 13:07:45.934146 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:45.961111 master-2 kubenswrapper[4762]: I1014 13:07:45.960997 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" podStartSLOduration=9.842301494 podStartE2EDuration="24.960973836s" podCreationTimestamp="2025-10-14 13:07:21 +0000 UTC" firstStartedPulling="2025-10-14 13:07:22.252465314 +0000 UTC m=+71.496624513" lastFinishedPulling="2025-10-14 13:07:37.371137686 +0000 UTC m=+86.615296855" observedRunningTime="2025-10-14 13:07:45.959920123 +0000 UTC m=+95.204079352" watchObservedRunningTime="2025-10-14 13:07:45.960973836 +0000 UTC m=+95.205133025" Oct 14 13:07:46.547484 master-2 kubenswrapper[4762]: I1014 13:07:46.547356 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cb5bh" Oct 14 13:07:46.547773 master-2 kubenswrapper[4762]: E1014 13:07:46.547553 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cb5bh" podUID="f841b8dd-c459-4e20-b11a-9169905ad069" Oct 14 13:07:46.911801 master-2 kubenswrapper[4762]: I1014 13:07:46.911680 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-ssgb2"] Oct 14 13:07:47.547918 master-2 kubenswrapper[4762]: I1014 13:07:47.547751 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b84p7" Oct 14 13:07:47.548250 master-2 kubenswrapper[4762]: E1014 13:07:47.548124 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-b84p7" podUID="c5e8cdcd-bc1f-4b38-a834-809b79de4fd9" Oct 14 13:07:47.857206 master-2 kubenswrapper[4762]: I1014 13:07:47.857108 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-b84p7"] Oct 14 13:07:47.859865 master-2 kubenswrapper[4762]: I1014 13:07:47.859795 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-cb5bh"] Oct 14 13:07:47.859865 master-2 kubenswrapper[4762]: I1014 13:07:47.859874 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cb5bh" Oct 14 13:07:47.860145 master-2 kubenswrapper[4762]: E1014 13:07:47.859971 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cb5bh" podUID="f841b8dd-c459-4e20-b11a-9169905ad069" Oct 14 13:07:47.939585 master-2 kubenswrapper[4762]: I1014 13:07:47.939531 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b84p7" Oct 14 13:07:47.940495 master-2 kubenswrapper[4762]: E1014 13:07:47.939766 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b84p7" podUID="c5e8cdcd-bc1f-4b38-a834-809b79de4fd9" Oct 14 13:07:47.940567 master-2 kubenswrapper[4762]: I1014 13:07:47.940513 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" podUID="fc445c69-944a-42d6-bb2e-53b0a745f970" containerName="ovn-controller" containerID="cri-o://ce2c76c5640bb76d593c7df356e2b23b40878a3d7b2b519b45f950a43efa105e" gracePeriod=30 Oct 14 13:07:47.940695 master-2 kubenswrapper[4762]: I1014 13:07:47.940639 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" podUID="fc445c69-944a-42d6-bb2e-53b0a745f970" containerName="northd" containerID="cri-o://8e369df7e7794fdb51cf6375c3ca8887e6897a156455d88b64cc25eab02dff7e" gracePeriod=30 Oct 14 13:07:47.940791 master-2 kubenswrapper[4762]: I1014 13:07:47.940732 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" podUID="fc445c69-944a-42d6-bb2e-53b0a745f970" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://717d972782394b130c70b16d840d826c8f3a1040ed136832c20d832786f95608" gracePeriod=30 Oct 14 13:07:47.940858 master-2 kubenswrapper[4762]: I1014 13:07:47.940809 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" podUID="fc445c69-944a-42d6-bb2e-53b0a745f970" containerName="kube-rbac-proxy-node" containerID="cri-o://8aec5d6fee6b63deedf57e35507390f40ef5c3c9f49c67be73dbebe3b090e9fd" gracePeriod=30 Oct 14 13:07:47.940858 master-2 kubenswrapper[4762]: I1014 13:07:47.940849 4762 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" podUID="fc445c69-944a-42d6-bb2e-53b0a745f970" containerName="sbdb" containerID="cri-o://0d78d6095afabe0438eca485582d3a5e5643e0961cac715e7355bf90d498fd15" gracePeriod=30 Oct 14 13:07:47.940963 master-2 kubenswrapper[4762]: I1014 13:07:47.940511 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" podUID="fc445c69-944a-42d6-bb2e-53b0a745f970" containerName="nbdb" containerID="cri-o://b9ae4c339e6d1505c7bff17d665c0ca9dfc9a584e38051299b439c39e3d425cb" gracePeriod=30 Oct 14 13:07:47.940963 master-2 kubenswrapper[4762]: I1014 13:07:47.940930 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" podUID="fc445c69-944a-42d6-bb2e-53b0a745f970" containerName="ovn-acl-logging" containerID="cri-o://1f31afa10aebb19b765ab8b9786df7e425e6f6bd93533589919f088518d31965" gracePeriod=30 Oct 14 13:07:47.968182 master-2 kubenswrapper[4762]: I1014 13:07:47.968018 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" podUID="fc445c69-944a-42d6-bb2e-53b0a745f970" containerName="ovnkube-controller" containerID="cri-o://19bf49b88afafea73feb8682c0fb99839af8e541f1c651b1f0d6224489eeaf99" gracePeriod=30 Oct 14 13:07:48.422728 master-2 kubenswrapper[4762]: I1014 13:07:48.422670 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ssgb2_fc445c69-944a-42d6-bb2e-53b0a745f970/ovnkube-controller/0.log" Oct 14 13:07:48.425777 master-2 kubenswrapper[4762]: I1014 13:07:48.425718 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ssgb2_fc445c69-944a-42d6-bb2e-53b0a745f970/kube-rbac-proxy-ovn-metrics/0.log" Oct 14 13:07:48.426494 master-2 kubenswrapper[4762]: I1014 13:07:48.426448 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ssgb2_fc445c69-944a-42d6-bb2e-53b0a745f970/kube-rbac-proxy-node/0.log" Oct 14 13:07:48.427137 master-2 kubenswrapper[4762]: I1014 13:07:48.427090 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ssgb2_fc445c69-944a-42d6-bb2e-53b0a745f970/ovn-acl-logging/0.log" Oct 14 13:07:48.427868 master-2 kubenswrapper[4762]: I1014 13:07:48.427820 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ssgb2_fc445c69-944a-42d6-bb2e-53b0a745f970/ovn-controller/0.log" Oct 14 13:07:48.428493 master-2 kubenswrapper[4762]: I1014 13:07:48.428448 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:48.486462 master-2 kubenswrapper[4762]: I1014 13:07:48.486368 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4cthp"] Oct 14 13:07:48.486667 master-2 kubenswrapper[4762]: E1014 13:07:48.486558 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc445c69-944a-42d6-bb2e-53b0a745f970" containerName="ovnkube-controller" Oct 14 13:07:48.486667 master-2 kubenswrapper[4762]: I1014 13:07:48.486583 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc445c69-944a-42d6-bb2e-53b0a745f970" containerName="ovnkube-controller" Oct 14 13:07:48.486667 master-2 kubenswrapper[4762]: E1014 13:07:48.486602 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc445c69-944a-42d6-bb2e-53b0a745f970" containerName="kube-rbac-proxy-node" Oct 14 13:07:48.486667 master-2 kubenswrapper[4762]: I1014 13:07:48.486618 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc445c69-944a-42d6-bb2e-53b0a745f970" containerName="kube-rbac-proxy-node" Oct 14 13:07:48.486667 master-2 kubenswrapper[4762]: E1014 13:07:48.486634 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc445c69-944a-42d6-bb2e-53b0a745f970" containerName="ovn-controller" Oct 14 13:07:48.486667 master-2 kubenswrapper[4762]: I1014 13:07:48.486653 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc445c69-944a-42d6-bb2e-53b0a745f970" containerName="ovn-controller" Oct 14 13:07:48.486667 master-2 kubenswrapper[4762]: E1014 13:07:48.486673 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc445c69-944a-42d6-bb2e-53b0a745f970" containerName="ovn-acl-logging" Oct 14 13:07:48.487009 master-2 kubenswrapper[4762]: I1014 13:07:48.486692 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc445c69-944a-42d6-bb2e-53b0a745f970" containerName="ovn-acl-logging" Oct 14 13:07:48.487009 master-2 kubenswrapper[4762]: E1014 13:07:48.486709 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc445c69-944a-42d6-bb2e-53b0a745f970" containerName="northd" Oct 14 13:07:48.487009 master-2 kubenswrapper[4762]: I1014 13:07:48.486723 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc445c69-944a-42d6-bb2e-53b0a745f970" containerName="northd" Oct 14 13:07:48.487009 master-2 kubenswrapper[4762]: E1014 13:07:48.486743 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc445c69-944a-42d6-bb2e-53b0a745f970" containerName="kube-rbac-proxy-ovn-metrics" Oct 14 13:07:48.487009 master-2 kubenswrapper[4762]: I1014 13:07:48.486760 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc445c69-944a-42d6-bb2e-53b0a745f970" containerName="kube-rbac-proxy-ovn-metrics" Oct 14 13:07:48.487009 master-2 kubenswrapper[4762]: E1014 13:07:48.486778 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc445c69-944a-42d6-bb2e-53b0a745f970" containerName="sbdb" Oct 14 13:07:48.487009 master-2 kubenswrapper[4762]: I1014 13:07:48.486793 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc445c69-944a-42d6-bb2e-53b0a745f970" containerName="sbdb" Oct 14 13:07:48.487009 master-2 kubenswrapper[4762]: E1014 13:07:48.486810 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc445c69-944a-42d6-bb2e-53b0a745f970" containerName="nbdb" Oct 14 13:07:48.487009 master-2 kubenswrapper[4762]: I1014 13:07:48.486825 4762 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="fc445c69-944a-42d6-bb2e-53b0a745f970" containerName="nbdb" Oct 14 13:07:48.487009 master-2 kubenswrapper[4762]: E1014 13:07:48.486842 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc445c69-944a-42d6-bb2e-53b0a745f970" containerName="kubecfg-setup" Oct 14 13:07:48.487009 master-2 kubenswrapper[4762]: I1014 13:07:48.486859 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc445c69-944a-42d6-bb2e-53b0a745f970" containerName="kubecfg-setup" Oct 14 13:07:48.487009 master-2 kubenswrapper[4762]: I1014 13:07:48.486937 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc445c69-944a-42d6-bb2e-53b0a745f970" containerName="northd" Oct 14 13:07:48.487009 master-2 kubenswrapper[4762]: I1014 13:07:48.486958 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc445c69-944a-42d6-bb2e-53b0a745f970" containerName="kube-rbac-proxy-node" Oct 14 13:07:48.487009 master-2 kubenswrapper[4762]: I1014 13:07:48.486974 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc445c69-944a-42d6-bb2e-53b0a745f970" containerName="sbdb" Oct 14 13:07:48.487009 master-2 kubenswrapper[4762]: I1014 13:07:48.486991 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc445c69-944a-42d6-bb2e-53b0a745f970" containerName="ovn-controller" Oct 14 13:07:48.487009 master-2 kubenswrapper[4762]: I1014 13:07:48.487007 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc445c69-944a-42d6-bb2e-53b0a745f970" containerName="ovn-acl-logging" Oct 14 13:07:48.487009 master-2 kubenswrapper[4762]: I1014 13:07:48.487025 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc445c69-944a-42d6-bb2e-53b0a745f970" containerName="kube-rbac-proxy-ovn-metrics" Oct 14 13:07:48.487853 master-2 kubenswrapper[4762]: I1014 13:07:48.487045 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc445c69-944a-42d6-bb2e-53b0a745f970" containerName="nbdb" Oct 14 13:07:48.487853 master-2 kubenswrapper[4762]: I1014 13:07:48.487060 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc445c69-944a-42d6-bb2e-53b0a745f970" containerName="ovnkube-controller" Oct 14 13:07:48.491337 master-2 kubenswrapper[4762]: I1014 13:07:48.490460 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.532277 master-2 kubenswrapper[4762]: I1014 13:07:48.532129 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-node-log\") pod \"fc445c69-944a-42d6-bb2e-53b0a745f970\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " Oct 14 13:07:48.532277 master-2 kubenswrapper[4762]: I1014 13:07:48.532226 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-host-cni-netd\") pod \"fc445c69-944a-42d6-bb2e-53b0a745f970\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " Oct 14 13:07:48.532277 master-2 kubenswrapper[4762]: I1014 13:07:48.532258 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-etc-openvswitch\") pod \"fc445c69-944a-42d6-bb2e-53b0a745f970\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " Oct 14 13:07:48.532766 master-2 kubenswrapper[4762]: I1014 13:07:48.532298 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fc445c69-944a-42d6-bb2e-53b0a745f970-ovn-node-metrics-cert\") pod \"fc445c69-944a-42d6-bb2e-53b0a745f970\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " Oct 14 13:07:48.532766 master-2 kubenswrapper[4762]: I1014 13:07:48.532334 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-systemd-units\") pod \"fc445c69-944a-42d6-bb2e-53b0a745f970\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " Oct 14 13:07:48.532766 master-2 kubenswrapper[4762]: I1014 13:07:48.532361 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-run-systemd\") pod \"fc445c69-944a-42d6-bb2e-53b0a745f970\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " Oct 14 13:07:48.532766 master-2 kubenswrapper[4762]: I1014 13:07:48.532391 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-log-socket\") pod \"fc445c69-944a-42d6-bb2e-53b0a745f970\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " Oct 14 13:07:48.532766 master-2 kubenswrapper[4762]: I1014 13:07:48.532427 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwlhb\" (UniqueName: \"kubernetes.io/projected/fc445c69-944a-42d6-bb2e-53b0a745f970-kube-api-access-zwlhb\") pod \"fc445c69-944a-42d6-bb2e-53b0a745f970\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " Oct 14 13:07:48.532766 master-2 kubenswrapper[4762]: I1014 13:07:48.532456 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-host-var-lib-cni-networks-ovn-kubernetes\") pod \"fc445c69-944a-42d6-bb2e-53b0a745f970\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " Oct 14 13:07:48.532766 master-2 kubenswrapper[4762]: I1014 13:07:48.532499 4762 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-run-openvswitch\") pod \"fc445c69-944a-42d6-bb2e-53b0a745f970\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " Oct 14 13:07:48.532766 master-2 kubenswrapper[4762]: I1014 13:07:48.532528 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-host-slash\") pod \"fc445c69-944a-42d6-bb2e-53b0a745f970\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " Oct 14 13:07:48.532766 master-2 kubenswrapper[4762]: I1014 13:07:48.532560 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fc445c69-944a-42d6-bb2e-53b0a745f970-env-overrides\") pod \"fc445c69-944a-42d6-bb2e-53b0a745f970\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " Oct 14 13:07:48.532766 master-2 kubenswrapper[4762]: I1014 13:07:48.532586 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-host-kubelet\") pod \"fc445c69-944a-42d6-bb2e-53b0a745f970\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " Oct 14 13:07:48.532766 master-2 kubenswrapper[4762]: I1014 13:07:48.532619 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fc445c69-944a-42d6-bb2e-53b0a745f970-ovnkube-config\") pod \"fc445c69-944a-42d6-bb2e-53b0a745f970\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " Oct 14 13:07:48.532766 master-2 kubenswrapper[4762]: I1014 13:07:48.532687 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-run-ovn\") pod \"fc445c69-944a-42d6-bb2e-53b0a745f970\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " Oct 14 13:07:48.532766 master-2 kubenswrapper[4762]: I1014 13:07:48.532723 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fc445c69-944a-42d6-bb2e-53b0a745f970-ovnkube-script-lib\") pod \"fc445c69-944a-42d6-bb2e-53b0a745f970\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " Oct 14 13:07:48.532766 master-2 kubenswrapper[4762]: I1014 13:07:48.532754 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-host-run-netns\") pod \"fc445c69-944a-42d6-bb2e-53b0a745f970\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " Oct 14 13:07:48.532766 master-2 kubenswrapper[4762]: I1014 13:07:48.532781 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-var-lib-openvswitch\") pod \"fc445c69-944a-42d6-bb2e-53b0a745f970\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " Oct 14 13:07:48.534502 master-2 kubenswrapper[4762]: I1014 13:07:48.532811 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-host-cni-bin\") pod \"fc445c69-944a-42d6-bb2e-53b0a745f970\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " Oct 14 
13:07:48.534502 master-2 kubenswrapper[4762]: I1014 13:07:48.532845 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-host-run-ovn-kubernetes\") pod \"fc445c69-944a-42d6-bb2e-53b0a745f970\" (UID: \"fc445c69-944a-42d6-bb2e-53b0a745f970\") " Oct 14 13:07:48.534502 master-2 kubenswrapper[4762]: I1014 13:07:48.533021 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "fc445c69-944a-42d6-bb2e-53b0a745f970" (UID: "fc445c69-944a-42d6-bb2e-53b0a745f970"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:07:48.534502 master-2 kubenswrapper[4762]: I1014 13:07:48.533074 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-node-log" (OuterVolumeSpecName: "node-log") pod "fc445c69-944a-42d6-bb2e-53b0a745f970" (UID: "fc445c69-944a-42d6-bb2e-53b0a745f970"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:07:48.534502 master-2 kubenswrapper[4762]: I1014 13:07:48.533107 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "fc445c69-944a-42d6-bb2e-53b0a745f970" (UID: "fc445c69-944a-42d6-bb2e-53b0a745f970"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:07:48.534502 master-2 kubenswrapper[4762]: I1014 13:07:48.533142 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "fc445c69-944a-42d6-bb2e-53b0a745f970" (UID: "fc445c69-944a-42d6-bb2e-53b0a745f970"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:07:48.534502 master-2 kubenswrapper[4762]: I1014 13:07:48.533527 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "fc445c69-944a-42d6-bb2e-53b0a745f970" (UID: "fc445c69-944a-42d6-bb2e-53b0a745f970"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:07:48.534502 master-2 kubenswrapper[4762]: I1014 13:07:48.533620 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "fc445c69-944a-42d6-bb2e-53b0a745f970" (UID: "fc445c69-944a-42d6-bb2e-53b0a745f970"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:07:48.534502 master-2 kubenswrapper[4762]: I1014 13:07:48.533588 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "fc445c69-944a-42d6-bb2e-53b0a745f970" (UID: "fc445c69-944a-42d6-bb2e-53b0a745f970"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:07:48.534502 master-2 kubenswrapper[4762]: I1014 13:07:48.533617 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "fc445c69-944a-42d6-bb2e-53b0a745f970" (UID: "fc445c69-944a-42d6-bb2e-53b0a745f970"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:07:48.534502 master-2 kubenswrapper[4762]: I1014 13:07:48.533673 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "fc445c69-944a-42d6-bb2e-53b0a745f970" (UID: "fc445c69-944a-42d6-bb2e-53b0a745f970"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:07:48.534502 master-2 kubenswrapper[4762]: I1014 13:07:48.533739 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "fc445c69-944a-42d6-bb2e-53b0a745f970" (UID: "fc445c69-944a-42d6-bb2e-53b0a745f970"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:07:48.534502 master-2 kubenswrapper[4762]: I1014 13:07:48.533757 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-host-slash" (OuterVolumeSpecName: "host-slash") pod "fc445c69-944a-42d6-bb2e-53b0a745f970" (UID: "fc445c69-944a-42d6-bb2e-53b0a745f970"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:07:48.534502 master-2 kubenswrapper[4762]: I1014 13:07:48.533756 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-log-socket" (OuterVolumeSpecName: "log-socket") pod "fc445c69-944a-42d6-bb2e-53b0a745f970" (UID: "fc445c69-944a-42d6-bb2e-53b0a745f970"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:07:48.534502 master-2 kubenswrapper[4762]: I1014 13:07:48.533801 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "fc445c69-944a-42d6-bb2e-53b0a745f970" (UID: "fc445c69-944a-42d6-bb2e-53b0a745f970"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:07:48.535613 master-2 kubenswrapper[4762]: I1014 13:07:48.533782 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "fc445c69-944a-42d6-bb2e-53b0a745f970" (UID: "fc445c69-944a-42d6-bb2e-53b0a745f970"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:07:48.535613 master-2 kubenswrapper[4762]: I1014 13:07:48.534244 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc445c69-944a-42d6-bb2e-53b0a745f970-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "fc445c69-944a-42d6-bb2e-53b0a745f970" (UID: "fc445c69-944a-42d6-bb2e-53b0a745f970"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:07:48.535613 master-2 kubenswrapper[4762]: I1014 13:07:48.534276 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc445c69-944a-42d6-bb2e-53b0a745f970-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "fc445c69-944a-42d6-bb2e-53b0a745f970" (UID: "fc445c69-944a-42d6-bb2e-53b0a745f970"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:07:48.535613 master-2 kubenswrapper[4762]: I1014 13:07:48.534385 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc445c69-944a-42d6-bb2e-53b0a745f970-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "fc445c69-944a-42d6-bb2e-53b0a745f970" (UID: "fc445c69-944a-42d6-bb2e-53b0a745f970"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:07:48.540186 master-2 kubenswrapper[4762]: I1014 13:07:48.540115 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc445c69-944a-42d6-bb2e-53b0a745f970-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "fc445c69-944a-42d6-bb2e-53b0a745f970" (UID: "fc445c69-944a-42d6-bb2e-53b0a745f970"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:07:48.540261 master-2 kubenswrapper[4762]: I1014 13:07:48.540113 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc445c69-944a-42d6-bb2e-53b0a745f970-kube-api-access-zwlhb" (OuterVolumeSpecName: "kube-api-access-zwlhb") pod "fc445c69-944a-42d6-bb2e-53b0a745f970" (UID: "fc445c69-944a-42d6-bb2e-53b0a745f970"). InnerVolumeSpecName "kube-api-access-zwlhb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:07:48.543269 master-2 kubenswrapper[4762]: I1014 13:07:48.543220 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "fc445c69-944a-42d6-bb2e-53b0a745f970" (UID: "fc445c69-944a-42d6-bb2e-53b0a745f970"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:07:48.635102 master-2 kubenswrapper[4762]: I1014 13:07:48.634897 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5a8ea9f0-8c47-4230-82cc-19ba4debe407-host-run-ovn-kubernetes\") pod \"ovnkube-node-4cthp\" (UID: \"5a8ea9f0-8c47-4230-82cc-19ba4debe407\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.635102 master-2 kubenswrapper[4762]: I1014 13:07:48.634969 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5a8ea9f0-8c47-4230-82cc-19ba4debe407-host-cni-bin\") pod \"ovnkube-node-4cthp\" (UID: \"5a8ea9f0-8c47-4230-82cc-19ba4debe407\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.635102 master-2 kubenswrapper[4762]: I1014 13:07:48.635006 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzp5c\" (UniqueName: \"kubernetes.io/projected/5a8ea9f0-8c47-4230-82cc-19ba4debe407-kube-api-access-dzp5c\") pod \"ovnkube-node-4cthp\" (UID: \"5a8ea9f0-8c47-4230-82cc-19ba4debe407\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.635102 master-2 kubenswrapper[4762]: I1014 13:07:48.635038 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5a8ea9f0-8c47-4230-82cc-19ba4debe407-run-systemd\") pod \"ovnkube-node-4cthp\" (UID: \"5a8ea9f0-8c47-4230-82cc-19ba4debe407\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.635102 master-2 kubenswrapper[4762]: I1014 13:07:48.635069 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5a8ea9f0-8c47-4230-82cc-19ba4debe407-log-socket\") pod \"ovnkube-node-4cthp\" (UID: \"5a8ea9f0-8c47-4230-82cc-19ba4debe407\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.636401 master-2 kubenswrapper[4762]: I1014 13:07:48.635184 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5a8ea9f0-8c47-4230-82cc-19ba4debe407-host-cni-netd\") pod \"ovnkube-node-4cthp\" (UID: \"5a8ea9f0-8c47-4230-82cc-19ba4debe407\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.636401 master-2 kubenswrapper[4762]: I1014 13:07:48.635233 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5a8ea9f0-8c47-4230-82cc-19ba4debe407-systemd-units\") pod \"ovnkube-node-4cthp\" (UID: \"5a8ea9f0-8c47-4230-82cc-19ba4debe407\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.636401 master-2 kubenswrapper[4762]: I1014 13:07:48.635257 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5a8ea9f0-8c47-4230-82cc-19ba4debe407-var-lib-openvswitch\") pod \"ovnkube-node-4cthp\" (UID: \"5a8ea9f0-8c47-4230-82cc-19ba4debe407\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.636401 master-2 kubenswrapper[4762]: I1014 13:07:48.635303 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5a8ea9f0-8c47-4230-82cc-19ba4debe407-run-ovn\") pod \"ovnkube-node-4cthp\" (UID: \"5a8ea9f0-8c47-4230-82cc-19ba4debe407\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.636401 master-2 kubenswrapper[4762]: I1014 13:07:48.635326 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5a8ea9f0-8c47-4230-82cc-19ba4debe407-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4cthp\" (UID: \"5a8ea9f0-8c47-4230-82cc-19ba4debe407\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.636401 master-2 kubenswrapper[4762]: I1014 13:07:48.635373 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5a8ea9f0-8c47-4230-82cc-19ba4debe407-host-run-netns\") pod \"ovnkube-node-4cthp\" (UID: \"5a8ea9f0-8c47-4230-82cc-19ba4debe407\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.636401 master-2 kubenswrapper[4762]: I1014 13:07:48.635395 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5a8ea9f0-8c47-4230-82cc-19ba4debe407-ovnkube-config\") pod \"ovnkube-node-4cthp\" (UID: \"5a8ea9f0-8c47-4230-82cc-19ba4debe407\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.636401 master-2 kubenswrapper[4762]: I1014 13:07:48.635419 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5a8ea9f0-8c47-4230-82cc-19ba4debe407-host-kubelet\") pod \"ovnkube-node-4cthp\" (UID: \"5a8ea9f0-8c47-4230-82cc-19ba4debe407\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.636401 master-2 kubenswrapper[4762]: I1014 13:07:48.635440 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5a8ea9f0-8c47-4230-82cc-19ba4debe407-ovn-node-metrics-cert\") pod \"ovnkube-node-4cthp\" (UID: \"5a8ea9f0-8c47-4230-82cc-19ba4debe407\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.636401 master-2 kubenswrapper[4762]: I1014 13:07:48.635462 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5a8ea9f0-8c47-4230-82cc-19ba4debe407-ovnkube-script-lib\") pod \"ovnkube-node-4cthp\" (UID: \"5a8ea9f0-8c47-4230-82cc-19ba4debe407\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.636401 master-2 kubenswrapper[4762]: I1014 13:07:48.635504 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5a8ea9f0-8c47-4230-82cc-19ba4debe407-env-overrides\") pod \"ovnkube-node-4cthp\" (UID: \"5a8ea9f0-8c47-4230-82cc-19ba4debe407\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.636401 master-2 kubenswrapper[4762]: I1014 13:07:48.635527 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5a8ea9f0-8c47-4230-82cc-19ba4debe407-node-log\") pod \"ovnkube-node-4cthp\" (UID: 
\"5a8ea9f0-8c47-4230-82cc-19ba4debe407\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.636401 master-2 kubenswrapper[4762]: I1014 13:07:48.635550 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5a8ea9f0-8c47-4230-82cc-19ba4debe407-host-slash\") pod \"ovnkube-node-4cthp\" (UID: \"5a8ea9f0-8c47-4230-82cc-19ba4debe407\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.636401 master-2 kubenswrapper[4762]: I1014 13:07:48.635570 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5a8ea9f0-8c47-4230-82cc-19ba4debe407-run-openvswitch\") pod \"ovnkube-node-4cthp\" (UID: \"5a8ea9f0-8c47-4230-82cc-19ba4debe407\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.636401 master-2 kubenswrapper[4762]: I1014 13:07:48.635593 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5a8ea9f0-8c47-4230-82cc-19ba4debe407-etc-openvswitch\") pod \"ovnkube-node-4cthp\" (UID: \"5a8ea9f0-8c47-4230-82cc-19ba4debe407\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.636401 master-2 kubenswrapper[4762]: I1014 13:07:48.635718 4762 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-host-slash\") on node \"master-2\" DevicePath \"\"" Oct 14 13:07:48.636401 master-2 kubenswrapper[4762]: I1014 13:07:48.635761 4762 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fc445c69-944a-42d6-bb2e-53b0a745f970-env-overrides\") on node \"master-2\" DevicePath \"\"" Oct 14 13:07:48.637522 master-2 kubenswrapper[4762]: I1014 13:07:48.635783 4762 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-host-kubelet\") on node \"master-2\" DevicePath \"\"" Oct 14 13:07:48.637522 master-2 kubenswrapper[4762]: I1014 13:07:48.635804 4762 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fc445c69-944a-42d6-bb2e-53b0a745f970-ovnkube-config\") on node \"master-2\" DevicePath \"\"" Oct 14 13:07:48.637522 master-2 kubenswrapper[4762]: I1014 13:07:48.635824 4762 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-run-ovn\") on node \"master-2\" DevicePath \"\"" Oct 14 13:07:48.637522 master-2 kubenswrapper[4762]: I1014 13:07:48.635845 4762 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fc445c69-944a-42d6-bb2e-53b0a745f970-ovnkube-script-lib\") on node \"master-2\" DevicePath \"\"" Oct 14 13:07:48.637522 master-2 kubenswrapper[4762]: I1014 13:07:48.635868 4762 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-host-run-netns\") on node \"master-2\" DevicePath \"\"" Oct 14 13:07:48.637522 master-2 kubenswrapper[4762]: I1014 13:07:48.635887 4762 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-var-lib-openvswitch\") on node \"master-2\" DevicePath \"\"" Oct 14 13:07:48.637522 master-2 kubenswrapper[4762]: I1014 13:07:48.635905 4762 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-host-cni-bin\") on node \"master-2\" DevicePath \"\"" Oct 14 13:07:48.637522 master-2 kubenswrapper[4762]: I1014 13:07:48.635924 4762 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-host-run-ovn-kubernetes\") on node \"master-2\" DevicePath \"\"" Oct 14 13:07:48.637522 master-2 kubenswrapper[4762]: I1014 13:07:48.635945 4762 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-node-log\") on node \"master-2\" DevicePath \"\"" Oct 14 13:07:48.637522 master-2 kubenswrapper[4762]: I1014 13:07:48.635962 4762 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-host-cni-netd\") on node \"master-2\" DevicePath \"\"" Oct 14 13:07:48.637522 master-2 kubenswrapper[4762]: I1014 13:07:48.635981 4762 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-etc-openvswitch\") on node \"master-2\" DevicePath \"\"" Oct 14 13:07:48.637522 master-2 kubenswrapper[4762]: I1014 13:07:48.635998 4762 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fc445c69-944a-42d6-bb2e-53b0a745f970-ovn-node-metrics-cert\") on node \"master-2\" DevicePath \"\"" Oct 14 13:07:48.637522 master-2 kubenswrapper[4762]: I1014 13:07:48.636017 4762 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-systemd-units\") on node \"master-2\" DevicePath \"\"" Oct 14 13:07:48.637522 master-2 kubenswrapper[4762]: I1014 13:07:48.636035 4762 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-run-systemd\") on node \"master-2\" DevicePath \"\"" Oct 14 13:07:48.637522 master-2 kubenswrapper[4762]: I1014 13:07:48.636053 4762 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-log-socket\") on node \"master-2\" DevicePath \"\"" Oct 14 13:07:48.637522 master-2 kubenswrapper[4762]: I1014 13:07:48.636072 4762 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-host-var-lib-cni-networks-ovn-kubernetes\") on node \"master-2\" DevicePath \"\"" Oct 14 13:07:48.637522 master-2 kubenswrapper[4762]: I1014 13:07:48.636093 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwlhb\" (UniqueName: \"kubernetes.io/projected/fc445c69-944a-42d6-bb2e-53b0a745f970-kube-api-access-zwlhb\") on node \"master-2\" DevicePath \"\"" Oct 14 13:07:48.637522 master-2 kubenswrapper[4762]: I1014 13:07:48.636113 4762 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/fc445c69-944a-42d6-bb2e-53b0a745f970-run-openvswitch\") on node \"master-2\" DevicePath \"\"" Oct 14 13:07:48.737179 master-2 kubenswrapper[4762]: I1014 13:07:48.737030 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5a8ea9f0-8c47-4230-82cc-19ba4debe407-host-kubelet\") pod \"ovnkube-node-4cthp\" (UID: \"5a8ea9f0-8c47-4230-82cc-19ba4debe407\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.737179 master-2 kubenswrapper[4762]: I1014 13:07:48.737111 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5a8ea9f0-8c47-4230-82cc-19ba4debe407-ovn-node-metrics-cert\") pod \"ovnkube-node-4cthp\" (UID: \"5a8ea9f0-8c47-4230-82cc-19ba4debe407\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.737179 master-2 kubenswrapper[4762]: I1014 13:07:48.737143 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5a8ea9f0-8c47-4230-82cc-19ba4debe407-ovnkube-script-lib\") pod \"ovnkube-node-4cthp\" (UID: \"5a8ea9f0-8c47-4230-82cc-19ba4debe407\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.738261 master-2 kubenswrapper[4762]: I1014 13:07:48.737203 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5a8ea9f0-8c47-4230-82cc-19ba4debe407-node-log\") pod \"ovnkube-node-4cthp\" (UID: \"5a8ea9f0-8c47-4230-82cc-19ba4debe407\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.738261 master-2 kubenswrapper[4762]: I1014 13:07:48.737241 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5a8ea9f0-8c47-4230-82cc-19ba4debe407-env-overrides\") pod \"ovnkube-node-4cthp\" (UID: \"5a8ea9f0-8c47-4230-82cc-19ba4debe407\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.738261 master-2 kubenswrapper[4762]: I1014 13:07:48.737311 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5a8ea9f0-8c47-4230-82cc-19ba4debe407-host-slash\") pod \"ovnkube-node-4cthp\" (UID: \"5a8ea9f0-8c47-4230-82cc-19ba4debe407\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.738261 master-2 kubenswrapper[4762]: I1014 13:07:48.737348 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5a8ea9f0-8c47-4230-82cc-19ba4debe407-run-openvswitch\") pod \"ovnkube-node-4cthp\" (UID: \"5a8ea9f0-8c47-4230-82cc-19ba4debe407\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.738261 master-2 kubenswrapper[4762]: I1014 13:07:48.737353 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5a8ea9f0-8c47-4230-82cc-19ba4debe407-node-log\") pod \"ovnkube-node-4cthp\" (UID: \"5a8ea9f0-8c47-4230-82cc-19ba4debe407\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.738261 master-2 kubenswrapper[4762]: I1014 13:07:48.737378 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5a8ea9f0-8c47-4230-82cc-19ba4debe407-etc-openvswitch\") pod 
\"ovnkube-node-4cthp\" (UID: \"5a8ea9f0-8c47-4230-82cc-19ba4debe407\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.738261 master-2 kubenswrapper[4762]: I1014 13:07:48.737439 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5a8ea9f0-8c47-4230-82cc-19ba4debe407-host-run-ovn-kubernetes\") pod \"ovnkube-node-4cthp\" (UID: \"5a8ea9f0-8c47-4230-82cc-19ba4debe407\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.738261 master-2 kubenswrapper[4762]: I1014 13:07:48.737479 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5a8ea9f0-8c47-4230-82cc-19ba4debe407-host-cni-bin\") pod \"ovnkube-node-4cthp\" (UID: \"5a8ea9f0-8c47-4230-82cc-19ba4debe407\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.738261 master-2 kubenswrapper[4762]: I1014 13:07:48.737493 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5a8ea9f0-8c47-4230-82cc-19ba4debe407-host-slash\") pod \"ovnkube-node-4cthp\" (UID: \"5a8ea9f0-8c47-4230-82cc-19ba4debe407\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.738261 master-2 kubenswrapper[4762]: I1014 13:07:48.737504 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzp5c\" (UniqueName: \"kubernetes.io/projected/5a8ea9f0-8c47-4230-82cc-19ba4debe407-kube-api-access-dzp5c\") pod \"ovnkube-node-4cthp\" (UID: \"5a8ea9f0-8c47-4230-82cc-19ba4debe407\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.738261 master-2 kubenswrapper[4762]: I1014 13:07:48.737447 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5a8ea9f0-8c47-4230-82cc-19ba4debe407-etc-openvswitch\") pod \"ovnkube-node-4cthp\" (UID: \"5a8ea9f0-8c47-4230-82cc-19ba4debe407\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.738261 master-2 kubenswrapper[4762]: I1014 13:07:48.737531 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5a8ea9f0-8c47-4230-82cc-19ba4debe407-run-systemd\") pod \"ovnkube-node-4cthp\" (UID: \"5a8ea9f0-8c47-4230-82cc-19ba4debe407\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.738261 master-2 kubenswrapper[4762]: I1014 13:07:48.737564 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5a8ea9f0-8c47-4230-82cc-19ba4debe407-run-systemd\") pod \"ovnkube-node-4cthp\" (UID: \"5a8ea9f0-8c47-4230-82cc-19ba4debe407\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.738261 master-2 kubenswrapper[4762]: I1014 13:07:48.737575 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5a8ea9f0-8c47-4230-82cc-19ba4debe407-log-socket\") pod \"ovnkube-node-4cthp\" (UID: \"5a8ea9f0-8c47-4230-82cc-19ba4debe407\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.738261 master-2 kubenswrapper[4762]: I1014 13:07:48.737598 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5a8ea9f0-8c47-4230-82cc-19ba4debe407-host-run-ovn-kubernetes\") pod 
\"ovnkube-node-4cthp\" (UID: \"5a8ea9f0-8c47-4230-82cc-19ba4debe407\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.738261 master-2 kubenswrapper[4762]: I1014 13:07:48.737583 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5a8ea9f0-8c47-4230-82cc-19ba4debe407-run-openvswitch\") pod \"ovnkube-node-4cthp\" (UID: \"5a8ea9f0-8c47-4230-82cc-19ba4debe407\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.738261 master-2 kubenswrapper[4762]: I1014 13:07:48.737616 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5a8ea9f0-8c47-4230-82cc-19ba4debe407-host-cni-bin\") pod \"ovnkube-node-4cthp\" (UID: \"5a8ea9f0-8c47-4230-82cc-19ba4debe407\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.738261 master-2 kubenswrapper[4762]: I1014 13:07:48.737203 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5a8ea9f0-8c47-4230-82cc-19ba4debe407-host-kubelet\") pod \"ovnkube-node-4cthp\" (UID: \"5a8ea9f0-8c47-4230-82cc-19ba4debe407\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.739285 master-2 kubenswrapper[4762]: I1014 13:07:48.737611 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5a8ea9f0-8c47-4230-82cc-19ba4debe407-host-cni-netd\") pod \"ovnkube-node-4cthp\" (UID: \"5a8ea9f0-8c47-4230-82cc-19ba4debe407\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.739285 master-2 kubenswrapper[4762]: I1014 13:07:48.737716 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5a8ea9f0-8c47-4230-82cc-19ba4debe407-log-socket\") pod \"ovnkube-node-4cthp\" (UID: \"5a8ea9f0-8c47-4230-82cc-19ba4debe407\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.739285 master-2 kubenswrapper[4762]: I1014 13:07:48.737642 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5a8ea9f0-8c47-4230-82cc-19ba4debe407-host-cni-netd\") pod \"ovnkube-node-4cthp\" (UID: \"5a8ea9f0-8c47-4230-82cc-19ba4debe407\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.739285 master-2 kubenswrapper[4762]: I1014 13:07:48.737748 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5a8ea9f0-8c47-4230-82cc-19ba4debe407-systemd-units\") pod \"ovnkube-node-4cthp\" (UID: \"5a8ea9f0-8c47-4230-82cc-19ba4debe407\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.739285 master-2 kubenswrapper[4762]: I1014 13:07:48.737805 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5a8ea9f0-8c47-4230-82cc-19ba4debe407-var-lib-openvswitch\") pod \"ovnkube-node-4cthp\" (UID: \"5a8ea9f0-8c47-4230-82cc-19ba4debe407\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.739285 master-2 kubenswrapper[4762]: I1014 13:07:48.737859 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5a8ea9f0-8c47-4230-82cc-19ba4debe407-run-ovn\") pod \"ovnkube-node-4cthp\" (UID: 
\"5a8ea9f0-8c47-4230-82cc-19ba4debe407\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.739285 master-2 kubenswrapper[4762]: I1014 13:07:48.737869 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5a8ea9f0-8c47-4230-82cc-19ba4debe407-var-lib-openvswitch\") pod \"ovnkube-node-4cthp\" (UID: \"5a8ea9f0-8c47-4230-82cc-19ba4debe407\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.739285 master-2 kubenswrapper[4762]: I1014 13:07:48.737891 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5a8ea9f0-8c47-4230-82cc-19ba4debe407-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4cthp\" (UID: \"5a8ea9f0-8c47-4230-82cc-19ba4debe407\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.739285 master-2 kubenswrapper[4762]: I1014 13:07:48.737805 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5a8ea9f0-8c47-4230-82cc-19ba4debe407-systemd-units\") pod \"ovnkube-node-4cthp\" (UID: \"5a8ea9f0-8c47-4230-82cc-19ba4debe407\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.739285 master-2 kubenswrapper[4762]: I1014 13:07:48.737947 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5a8ea9f0-8c47-4230-82cc-19ba4debe407-host-run-netns\") pod \"ovnkube-node-4cthp\" (UID: \"5a8ea9f0-8c47-4230-82cc-19ba4debe407\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.739285 master-2 kubenswrapper[4762]: I1014 13:07:48.737964 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5a8ea9f0-8c47-4230-82cc-19ba4debe407-run-ovn\") pod \"ovnkube-node-4cthp\" (UID: \"5a8ea9f0-8c47-4230-82cc-19ba4debe407\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.739285 master-2 kubenswrapper[4762]: I1014 13:07:48.737979 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5a8ea9f0-8c47-4230-82cc-19ba4debe407-ovnkube-config\") pod \"ovnkube-node-4cthp\" (UID: \"5a8ea9f0-8c47-4230-82cc-19ba4debe407\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.739285 master-2 kubenswrapper[4762]: I1014 13:07:48.738024 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5a8ea9f0-8c47-4230-82cc-19ba4debe407-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4cthp\" (UID: \"5a8ea9f0-8c47-4230-82cc-19ba4debe407\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.739285 master-2 kubenswrapper[4762]: I1014 13:07:48.738050 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5a8ea9f0-8c47-4230-82cc-19ba4debe407-host-run-netns\") pod \"ovnkube-node-4cthp\" (UID: \"5a8ea9f0-8c47-4230-82cc-19ba4debe407\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.739285 master-2 kubenswrapper[4762]: I1014 13:07:48.739016 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/5a8ea9f0-8c47-4230-82cc-19ba4debe407-env-overrides\") pod \"ovnkube-node-4cthp\" (UID: \"5a8ea9f0-8c47-4230-82cc-19ba4debe407\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.739285 master-2 kubenswrapper[4762]: I1014 13:07:48.739076 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5a8ea9f0-8c47-4230-82cc-19ba4debe407-ovnkube-script-lib\") pod \"ovnkube-node-4cthp\" (UID: \"5a8ea9f0-8c47-4230-82cc-19ba4debe407\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.740140 master-2 kubenswrapper[4762]: I1014 13:07:48.739377 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5a8ea9f0-8c47-4230-82cc-19ba4debe407-ovnkube-config\") pod \"ovnkube-node-4cthp\" (UID: \"5a8ea9f0-8c47-4230-82cc-19ba4debe407\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.742111 master-2 kubenswrapper[4762]: I1014 13:07:48.742052 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5a8ea9f0-8c47-4230-82cc-19ba4debe407-ovn-node-metrics-cert\") pod \"ovnkube-node-4cthp\" (UID: \"5a8ea9f0-8c47-4230-82cc-19ba4debe407\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.760728 master-2 kubenswrapper[4762]: I1014 13:07:48.760630 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzp5c\" (UniqueName: \"kubernetes.io/projected/5a8ea9f0-8c47-4230-82cc-19ba4debe407-kube-api-access-dzp5c\") pod \"ovnkube-node-4cthp\" (UID: \"5a8ea9f0-8c47-4230-82cc-19ba4debe407\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.815712 master-2 kubenswrapper[4762]: I1014 13:07:48.815651 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:48.832415 master-2 kubenswrapper[4762]: W1014 13:07:48.832349 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a8ea9f0_8c47_4230_82cc_19ba4debe407.slice/crio-de71d451a643a1935853e51e072b1a0090869709ed9d7f7aff783551d3d12644 WatchSource:0}: Error finding container de71d451a643a1935853e51e072b1a0090869709ed9d7f7aff783551d3d12644: Status 404 returned error can't find the container with id de71d451a643a1935853e51e072b1a0090869709ed9d7f7aff783551d3d12644 Oct 14 13:07:48.944778 master-2 kubenswrapper[4762]: I1014 13:07:48.944712 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" event={"ID":"5a8ea9f0-8c47-4230-82cc-19ba4debe407","Type":"ContainerStarted","Data":"8a36ec8922ed65ff7893e21c623cbabeaeb91298bdbac53ce911a00db43bf79e"} Oct 14 13:07:48.944778 master-2 kubenswrapper[4762]: I1014 13:07:48.944783 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" event={"ID":"5a8ea9f0-8c47-4230-82cc-19ba4debe407","Type":"ContainerStarted","Data":"de71d451a643a1935853e51e072b1a0090869709ed9d7f7aff783551d3d12644"} Oct 14 13:07:48.947363 master-2 kubenswrapper[4762]: I1014 13:07:48.947297 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ssgb2_fc445c69-944a-42d6-bb2e-53b0a745f970/ovnkube-controller/0.log" Oct 14 13:07:48.949846 master-2 kubenswrapper[4762]: I1014 13:07:48.949797 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ssgb2_fc445c69-944a-42d6-bb2e-53b0a745f970/kube-rbac-proxy-ovn-metrics/0.log" Oct 14 13:07:48.950684 master-2 kubenswrapper[4762]: I1014 13:07:48.950636 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ssgb2_fc445c69-944a-42d6-bb2e-53b0a745f970/kube-rbac-proxy-node/0.log" Oct 14 13:07:48.951451 master-2 kubenswrapper[4762]: I1014 13:07:48.951396 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ssgb2_fc445c69-944a-42d6-bb2e-53b0a745f970/ovn-acl-logging/0.log" Oct 14 13:07:48.952333 master-2 kubenswrapper[4762]: I1014 13:07:48.952280 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ssgb2_fc445c69-944a-42d6-bb2e-53b0a745f970/ovn-controller/0.log" Oct 14 13:07:48.952981 master-2 kubenswrapper[4762]: I1014 13:07:48.952927 4762 generic.go:334] "Generic (PLEG): container finished" podID="fc445c69-944a-42d6-bb2e-53b0a745f970" containerID="19bf49b88afafea73feb8682c0fb99839af8e541f1c651b1f0d6224489eeaf99" exitCode=1 Oct 14 13:07:48.952981 master-2 kubenswrapper[4762]: I1014 13:07:48.952968 4762 generic.go:334] "Generic (PLEG): container finished" podID="fc445c69-944a-42d6-bb2e-53b0a745f970" containerID="0d78d6095afabe0438eca485582d3a5e5643e0961cac715e7355bf90d498fd15" exitCode=0 Oct 14 13:07:48.953263 master-2 kubenswrapper[4762]: I1014 13:07:48.952985 4762 generic.go:334] "Generic (PLEG): container finished" podID="fc445c69-944a-42d6-bb2e-53b0a745f970" containerID="b9ae4c339e6d1505c7bff17d665c0ca9dfc9a584e38051299b439c39e3d425cb" exitCode=0 Oct 14 13:07:48.953263 master-2 kubenswrapper[4762]: I1014 13:07:48.953000 4762 generic.go:334] "Generic (PLEG): container finished" podID="fc445c69-944a-42d6-bb2e-53b0a745f970" 
containerID="8e369df7e7794fdb51cf6375c3ca8887e6897a156455d88b64cc25eab02dff7e" exitCode=0 Oct 14 13:07:48.953263 master-2 kubenswrapper[4762]: I1014 13:07:48.953015 4762 generic.go:334] "Generic (PLEG): container finished" podID="fc445c69-944a-42d6-bb2e-53b0a745f970" containerID="717d972782394b130c70b16d840d826c8f3a1040ed136832c20d832786f95608" exitCode=143 Oct 14 13:07:48.953263 master-2 kubenswrapper[4762]: I1014 13:07:48.953034 4762 generic.go:334] "Generic (PLEG): container finished" podID="fc445c69-944a-42d6-bb2e-53b0a745f970" containerID="8aec5d6fee6b63deedf57e35507390f40ef5c3c9f49c67be73dbebe3b090e9fd" exitCode=143 Oct 14 13:07:48.953263 master-2 kubenswrapper[4762]: I1014 13:07:48.953047 4762 generic.go:334] "Generic (PLEG): container finished" podID="fc445c69-944a-42d6-bb2e-53b0a745f970" containerID="1f31afa10aebb19b765ab8b9786df7e425e6f6bd93533589919f088518d31965" exitCode=143 Oct 14 13:07:48.953263 master-2 kubenswrapper[4762]: I1014 13:07:48.953048 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" event={"ID":"fc445c69-944a-42d6-bb2e-53b0a745f970","Type":"ContainerDied","Data":"19bf49b88afafea73feb8682c0fb99839af8e541f1c651b1f0d6224489eeaf99"} Oct 14 13:07:48.953263 master-2 kubenswrapper[4762]: I1014 13:07:48.953117 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" event={"ID":"fc445c69-944a-42d6-bb2e-53b0a745f970","Type":"ContainerDied","Data":"0d78d6095afabe0438eca485582d3a5e5643e0961cac715e7355bf90d498fd15"} Oct 14 13:07:48.953263 master-2 kubenswrapper[4762]: I1014 13:07:48.953064 4762 generic.go:334] "Generic (PLEG): container finished" podID="fc445c69-944a-42d6-bb2e-53b0a745f970" containerID="ce2c76c5640bb76d593c7df356e2b23b40878a3d7b2b519b45f950a43efa105e" exitCode=143 Oct 14 13:07:48.953263 master-2 kubenswrapper[4762]: I1014 13:07:48.953216 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" event={"ID":"fc445c69-944a-42d6-bb2e-53b0a745f970","Type":"ContainerDied","Data":"b9ae4c339e6d1505c7bff17d665c0ca9dfc9a584e38051299b439c39e3d425cb"} Oct 14 13:07:48.953263 master-2 kubenswrapper[4762]: I1014 13:07:48.953248 4762 scope.go:117] "RemoveContainer" containerID="19bf49b88afafea73feb8682c0fb99839af8e541f1c651b1f0d6224489eeaf99" Oct 14 13:07:48.953263 master-2 kubenswrapper[4762]: I1014 13:07:48.953255 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" event={"ID":"fc445c69-944a-42d6-bb2e-53b0a745f970","Type":"ContainerDied","Data":"8e369df7e7794fdb51cf6375c3ca8887e6897a156455d88b64cc25eab02dff7e"} Oct 14 13:07:48.953933 master-2 kubenswrapper[4762]: I1014 13:07:48.953285 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" event={"ID":"fc445c69-944a-42d6-bb2e-53b0a745f970","Type":"ContainerDied","Data":"717d972782394b130c70b16d840d826c8f3a1040ed136832c20d832786f95608"} Oct 14 13:07:48.953933 master-2 kubenswrapper[4762]: I1014 13:07:48.953047 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" Oct 14 13:07:48.953933 master-2 kubenswrapper[4762]: I1014 13:07:48.953314 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" event={"ID":"fc445c69-944a-42d6-bb2e-53b0a745f970","Type":"ContainerDied","Data":"8aec5d6fee6b63deedf57e35507390f40ef5c3c9f49c67be73dbebe3b090e9fd"} Oct 14 13:07:48.953933 master-2 kubenswrapper[4762]: I1014 13:07:48.953449 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1f31afa10aebb19b765ab8b9786df7e425e6f6bd93533589919f088518d31965"} Oct 14 13:07:48.953933 master-2 kubenswrapper[4762]: I1014 13:07:48.953472 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ce2c76c5640bb76d593c7df356e2b23b40878a3d7b2b519b45f950a43efa105e"} Oct 14 13:07:48.953933 master-2 kubenswrapper[4762]: I1014 13:07:48.953484 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6e7af2c749207c088256f7e750c8e09235c4eff12577bc43829f252b95f2884b"} Oct 14 13:07:48.953933 master-2 kubenswrapper[4762]: I1014 13:07:48.953508 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" event={"ID":"fc445c69-944a-42d6-bb2e-53b0a745f970","Type":"ContainerDied","Data":"1f31afa10aebb19b765ab8b9786df7e425e6f6bd93533589919f088518d31965"} Oct 14 13:07:48.953933 master-2 kubenswrapper[4762]: I1014 13:07:48.953529 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"19bf49b88afafea73feb8682c0fb99839af8e541f1c651b1f0d6224489eeaf99"} Oct 14 13:07:48.953933 master-2 kubenswrapper[4762]: I1014 13:07:48.953543 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0d78d6095afabe0438eca485582d3a5e5643e0961cac715e7355bf90d498fd15"} Oct 14 13:07:48.953933 master-2 kubenswrapper[4762]: I1014 13:07:48.953553 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b9ae4c339e6d1505c7bff17d665c0ca9dfc9a584e38051299b439c39e3d425cb"} Oct 14 13:07:48.953933 master-2 kubenswrapper[4762]: I1014 13:07:48.953564 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8e369df7e7794fdb51cf6375c3ca8887e6897a156455d88b64cc25eab02dff7e"} Oct 14 13:07:48.953933 master-2 kubenswrapper[4762]: I1014 13:07:48.953575 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"717d972782394b130c70b16d840d826c8f3a1040ed136832c20d832786f95608"} Oct 14 13:07:48.953933 master-2 kubenswrapper[4762]: I1014 13:07:48.953586 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8aec5d6fee6b63deedf57e35507390f40ef5c3c9f49c67be73dbebe3b090e9fd"} Oct 14 13:07:48.953933 master-2 kubenswrapper[4762]: I1014 13:07:48.953596 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1f31afa10aebb19b765ab8b9786df7e425e6f6bd93533589919f088518d31965"} Oct 14 13:07:48.953933 master-2 kubenswrapper[4762]: I1014 13:07:48.953607 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"ce2c76c5640bb76d593c7df356e2b23b40878a3d7b2b519b45f950a43efa105e"} Oct 14 13:07:48.953933 master-2 kubenswrapper[4762]: I1014 13:07:48.953617 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6e7af2c749207c088256f7e750c8e09235c4eff12577bc43829f252b95f2884b"} Oct 14 13:07:48.953933 master-2 kubenswrapper[4762]: I1014 13:07:48.953632 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" event={"ID":"fc445c69-944a-42d6-bb2e-53b0a745f970","Type":"ContainerDied","Data":"ce2c76c5640bb76d593c7df356e2b23b40878a3d7b2b519b45f950a43efa105e"} Oct 14 13:07:48.953933 master-2 kubenswrapper[4762]: I1014 13:07:48.953648 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"19bf49b88afafea73feb8682c0fb99839af8e541f1c651b1f0d6224489eeaf99"} Oct 14 13:07:48.953933 master-2 kubenswrapper[4762]: I1014 13:07:48.953662 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0d78d6095afabe0438eca485582d3a5e5643e0961cac715e7355bf90d498fd15"} Oct 14 13:07:48.953933 master-2 kubenswrapper[4762]: I1014 13:07:48.953673 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b9ae4c339e6d1505c7bff17d665c0ca9dfc9a584e38051299b439c39e3d425cb"} Oct 14 13:07:48.953933 master-2 kubenswrapper[4762]: I1014 13:07:48.953684 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8e369df7e7794fdb51cf6375c3ca8887e6897a156455d88b64cc25eab02dff7e"} Oct 14 13:07:48.953933 master-2 kubenswrapper[4762]: I1014 13:07:48.953695 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"717d972782394b130c70b16d840d826c8f3a1040ed136832c20d832786f95608"} Oct 14 13:07:48.953933 master-2 kubenswrapper[4762]: I1014 13:07:48.953709 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8aec5d6fee6b63deedf57e35507390f40ef5c3c9f49c67be73dbebe3b090e9fd"} Oct 14 13:07:48.953933 master-2 kubenswrapper[4762]: I1014 13:07:48.953721 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1f31afa10aebb19b765ab8b9786df7e425e6f6bd93533589919f088518d31965"} Oct 14 13:07:48.953933 master-2 kubenswrapper[4762]: I1014 13:07:48.953733 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ce2c76c5640bb76d593c7df356e2b23b40878a3d7b2b519b45f950a43efa105e"} Oct 14 13:07:48.953933 master-2 kubenswrapper[4762]: I1014 13:07:48.953743 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6e7af2c749207c088256f7e750c8e09235c4eff12577bc43829f252b95f2884b"} Oct 14 13:07:48.953933 master-2 kubenswrapper[4762]: I1014 13:07:48.953757 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ssgb2" event={"ID":"fc445c69-944a-42d6-bb2e-53b0a745f970","Type":"ContainerDied","Data":"eb986386b50de8cb09548ae4a8ec006d6181a01b015ba299f247a2ccd99a271a"} Oct 14 13:07:48.955628 master-2 kubenswrapper[4762]: I1014 13:07:48.953772 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"19bf49b88afafea73feb8682c0fb99839af8e541f1c651b1f0d6224489eeaf99"} Oct 14 13:07:48.955628 master-2 kubenswrapper[4762]: I1014 13:07:48.953785 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0d78d6095afabe0438eca485582d3a5e5643e0961cac715e7355bf90d498fd15"} Oct 14 13:07:48.955628 master-2 kubenswrapper[4762]: I1014 13:07:48.953796 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b9ae4c339e6d1505c7bff17d665c0ca9dfc9a584e38051299b439c39e3d425cb"} Oct 14 13:07:48.955628 master-2 kubenswrapper[4762]: I1014 13:07:48.953809 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8e369df7e7794fdb51cf6375c3ca8887e6897a156455d88b64cc25eab02dff7e"} Oct 14 13:07:48.955628 master-2 kubenswrapper[4762]: I1014 13:07:48.953820 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"717d972782394b130c70b16d840d826c8f3a1040ed136832c20d832786f95608"} Oct 14 13:07:48.955628 master-2 kubenswrapper[4762]: I1014 13:07:48.953831 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8aec5d6fee6b63deedf57e35507390f40ef5c3c9f49c67be73dbebe3b090e9fd"} Oct 14 13:07:48.955628 master-2 kubenswrapper[4762]: I1014 13:07:48.953842 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1f31afa10aebb19b765ab8b9786df7e425e6f6bd93533589919f088518d31965"} Oct 14 13:07:48.955628 master-2 kubenswrapper[4762]: I1014 13:07:48.953852 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ce2c76c5640bb76d593c7df356e2b23b40878a3d7b2b519b45f950a43efa105e"} Oct 14 13:07:48.955628 master-2 kubenswrapper[4762]: I1014 13:07:48.953863 4762 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6e7af2c749207c088256f7e750c8e09235c4eff12577bc43829f252b95f2884b"} Oct 14 13:07:49.032674 master-2 kubenswrapper[4762]: I1014 13:07:49.032618 4762 scope.go:117] "RemoveContainer" containerID="0d78d6095afabe0438eca485582d3a5e5643e0961cac715e7355bf90d498fd15" Oct 14 13:07:49.047474 master-2 kubenswrapper[4762]: I1014 13:07:49.047410 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-ssgb2"] Oct 14 13:07:49.050500 master-2 kubenswrapper[4762]: I1014 13:07:49.050446 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-ssgb2"] Oct 14 13:07:49.053299 master-2 kubenswrapper[4762]: I1014 13:07:49.053211 4762 scope.go:117] "RemoveContainer" containerID="b9ae4c339e6d1505c7bff17d665c0ca9dfc9a584e38051299b439c39e3d425cb" Oct 14 13:07:49.082415 master-2 kubenswrapper[4762]: I1014 13:07:49.082289 4762 scope.go:117] "RemoveContainer" containerID="8e369df7e7794fdb51cf6375c3ca8887e6897a156455d88b64cc25eab02dff7e" Oct 14 13:07:49.103659 master-2 kubenswrapper[4762]: I1014 13:07:49.103595 4762 scope.go:117] "RemoveContainer" containerID="717d972782394b130c70b16d840d826c8f3a1040ed136832c20d832786f95608" Oct 14 13:07:49.118352 master-2 kubenswrapper[4762]: I1014 13:07:49.118294 4762 scope.go:117] "RemoveContainer" containerID="8aec5d6fee6b63deedf57e35507390f40ef5c3c9f49c67be73dbebe3b090e9fd" Oct 14 13:07:49.129942 master-2 
kubenswrapper[4762]: I1014 13:07:49.129904 4762 scope.go:117] "RemoveContainer" containerID="1f31afa10aebb19b765ab8b9786df7e425e6f6bd93533589919f088518d31965" Oct 14 13:07:49.142356 master-2 kubenswrapper[4762]: I1014 13:07:49.142332 4762 scope.go:117] "RemoveContainer" containerID="ce2c76c5640bb76d593c7df356e2b23b40878a3d7b2b519b45f950a43efa105e" Oct 14 13:07:49.157612 master-2 kubenswrapper[4762]: I1014 13:07:49.157574 4762 scope.go:117] "RemoveContainer" containerID="6e7af2c749207c088256f7e750c8e09235c4eff12577bc43829f252b95f2884b" Oct 14 13:07:49.172291 master-2 kubenswrapper[4762]: I1014 13:07:49.172246 4762 scope.go:117] "RemoveContainer" containerID="19bf49b88afafea73feb8682c0fb99839af8e541f1c651b1f0d6224489eeaf99" Oct 14 13:07:49.172858 master-2 kubenswrapper[4762]: E1014 13:07:49.172811 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19bf49b88afafea73feb8682c0fb99839af8e541f1c651b1f0d6224489eeaf99\": container with ID starting with 19bf49b88afafea73feb8682c0fb99839af8e541f1c651b1f0d6224489eeaf99 not found: ID does not exist" containerID="19bf49b88afafea73feb8682c0fb99839af8e541f1c651b1f0d6224489eeaf99" Oct 14 13:07:49.172903 master-2 kubenswrapper[4762]: I1014 13:07:49.172866 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19bf49b88afafea73feb8682c0fb99839af8e541f1c651b1f0d6224489eeaf99"} err="failed to get container status \"19bf49b88afafea73feb8682c0fb99839af8e541f1c651b1f0d6224489eeaf99\": rpc error: code = NotFound desc = could not find container \"19bf49b88afafea73feb8682c0fb99839af8e541f1c651b1f0d6224489eeaf99\": container with ID starting with 19bf49b88afafea73feb8682c0fb99839af8e541f1c651b1f0d6224489eeaf99 not found: ID does not exist" Oct 14 13:07:49.172935 master-2 kubenswrapper[4762]: I1014 13:07:49.172902 4762 scope.go:117] "RemoveContainer" containerID="0d78d6095afabe0438eca485582d3a5e5643e0961cac715e7355bf90d498fd15" Oct 14 13:07:49.173641 master-2 kubenswrapper[4762]: E1014 13:07:49.173617 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d78d6095afabe0438eca485582d3a5e5643e0961cac715e7355bf90d498fd15\": container with ID starting with 0d78d6095afabe0438eca485582d3a5e5643e0961cac715e7355bf90d498fd15 not found: ID does not exist" containerID="0d78d6095afabe0438eca485582d3a5e5643e0961cac715e7355bf90d498fd15" Oct 14 13:07:49.173671 master-2 kubenswrapper[4762]: I1014 13:07:49.173642 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d78d6095afabe0438eca485582d3a5e5643e0961cac715e7355bf90d498fd15"} err="failed to get container status \"0d78d6095afabe0438eca485582d3a5e5643e0961cac715e7355bf90d498fd15\": rpc error: code = NotFound desc = could not find container \"0d78d6095afabe0438eca485582d3a5e5643e0961cac715e7355bf90d498fd15\": container with ID starting with 0d78d6095afabe0438eca485582d3a5e5643e0961cac715e7355bf90d498fd15 not found: ID does not exist" Oct 14 13:07:49.173671 master-2 kubenswrapper[4762]: I1014 13:07:49.173660 4762 scope.go:117] "RemoveContainer" containerID="b9ae4c339e6d1505c7bff17d665c0ca9dfc9a584e38051299b439c39e3d425cb" Oct 14 13:07:49.174173 master-2 kubenswrapper[4762]: E1014 13:07:49.174131 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9ae4c339e6d1505c7bff17d665c0ca9dfc9a584e38051299b439c39e3d425cb\": 
container with ID starting with b9ae4c339e6d1505c7bff17d665c0ca9dfc9a584e38051299b439c39e3d425cb not found: ID does not exist" containerID="b9ae4c339e6d1505c7bff17d665c0ca9dfc9a584e38051299b439c39e3d425cb" Oct 14 13:07:49.174213 master-2 kubenswrapper[4762]: I1014 13:07:49.174172 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9ae4c339e6d1505c7bff17d665c0ca9dfc9a584e38051299b439c39e3d425cb"} err="failed to get container status \"b9ae4c339e6d1505c7bff17d665c0ca9dfc9a584e38051299b439c39e3d425cb\": rpc error: code = NotFound desc = could not find container \"b9ae4c339e6d1505c7bff17d665c0ca9dfc9a584e38051299b439c39e3d425cb\": container with ID starting with b9ae4c339e6d1505c7bff17d665c0ca9dfc9a584e38051299b439c39e3d425cb not found: ID does not exist" Oct 14 13:07:49.174213 master-2 kubenswrapper[4762]: I1014 13:07:49.174190 4762 scope.go:117] "RemoveContainer" containerID="8e369df7e7794fdb51cf6375c3ca8887e6897a156455d88b64cc25eab02dff7e" Oct 14 13:07:49.174609 master-2 kubenswrapper[4762]: E1014 13:07:49.174555 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e369df7e7794fdb51cf6375c3ca8887e6897a156455d88b64cc25eab02dff7e\": container with ID starting with 8e369df7e7794fdb51cf6375c3ca8887e6897a156455d88b64cc25eab02dff7e not found: ID does not exist" containerID="8e369df7e7794fdb51cf6375c3ca8887e6897a156455d88b64cc25eab02dff7e" Oct 14 13:07:49.174660 master-2 kubenswrapper[4762]: I1014 13:07:49.174616 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e369df7e7794fdb51cf6375c3ca8887e6897a156455d88b64cc25eab02dff7e"} err="failed to get container status \"8e369df7e7794fdb51cf6375c3ca8887e6897a156455d88b64cc25eab02dff7e\": rpc error: code = NotFound desc = could not find container \"8e369df7e7794fdb51cf6375c3ca8887e6897a156455d88b64cc25eab02dff7e\": container with ID starting with 8e369df7e7794fdb51cf6375c3ca8887e6897a156455d88b64cc25eab02dff7e not found: ID does not exist" Oct 14 13:07:49.174660 master-2 kubenswrapper[4762]: I1014 13:07:49.174655 4762 scope.go:117] "RemoveContainer" containerID="717d972782394b130c70b16d840d826c8f3a1040ed136832c20d832786f95608" Oct 14 13:07:49.175101 master-2 kubenswrapper[4762]: E1014 13:07:49.175069 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"717d972782394b130c70b16d840d826c8f3a1040ed136832c20d832786f95608\": container with ID starting with 717d972782394b130c70b16d840d826c8f3a1040ed136832c20d832786f95608 not found: ID does not exist" containerID="717d972782394b130c70b16d840d826c8f3a1040ed136832c20d832786f95608" Oct 14 13:07:49.175130 master-2 kubenswrapper[4762]: I1014 13:07:49.175101 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"717d972782394b130c70b16d840d826c8f3a1040ed136832c20d832786f95608"} err="failed to get container status \"717d972782394b130c70b16d840d826c8f3a1040ed136832c20d832786f95608\": rpc error: code = NotFound desc = could not find container \"717d972782394b130c70b16d840d826c8f3a1040ed136832c20d832786f95608\": container with ID starting with 717d972782394b130c70b16d840d826c8f3a1040ed136832c20d832786f95608 not found: ID does not exist" Oct 14 13:07:49.175130 master-2 kubenswrapper[4762]: I1014 13:07:49.175118 4762 scope.go:117] "RemoveContainer" containerID="8aec5d6fee6b63deedf57e35507390f40ef5c3c9f49c67be73dbebe3b090e9fd" Oct 14 
13:07:49.175507 master-2 kubenswrapper[4762]: E1014 13:07:49.175460 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8aec5d6fee6b63deedf57e35507390f40ef5c3c9f49c67be73dbebe3b090e9fd\": container with ID starting with 8aec5d6fee6b63deedf57e35507390f40ef5c3c9f49c67be73dbebe3b090e9fd not found: ID does not exist" containerID="8aec5d6fee6b63deedf57e35507390f40ef5c3c9f49c67be73dbebe3b090e9fd" Oct 14 13:07:49.175563 master-2 kubenswrapper[4762]: I1014 13:07:49.175506 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8aec5d6fee6b63deedf57e35507390f40ef5c3c9f49c67be73dbebe3b090e9fd"} err="failed to get container status \"8aec5d6fee6b63deedf57e35507390f40ef5c3c9f49c67be73dbebe3b090e9fd\": rpc error: code = NotFound desc = could not find container \"8aec5d6fee6b63deedf57e35507390f40ef5c3c9f49c67be73dbebe3b090e9fd\": container with ID starting with 8aec5d6fee6b63deedf57e35507390f40ef5c3c9f49c67be73dbebe3b090e9fd not found: ID does not exist" Oct 14 13:07:49.175563 master-2 kubenswrapper[4762]: I1014 13:07:49.175537 4762 scope.go:117] "RemoveContainer" containerID="1f31afa10aebb19b765ab8b9786df7e425e6f6bd93533589919f088518d31965" Oct 14 13:07:49.175958 master-2 kubenswrapper[4762]: E1014 13:07:49.175914 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f31afa10aebb19b765ab8b9786df7e425e6f6bd93533589919f088518d31965\": container with ID starting with 1f31afa10aebb19b765ab8b9786df7e425e6f6bd93533589919f088518d31965 not found: ID does not exist" containerID="1f31afa10aebb19b765ab8b9786df7e425e6f6bd93533589919f088518d31965" Oct 14 13:07:49.175958 master-2 kubenswrapper[4762]: I1014 13:07:49.175942 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f31afa10aebb19b765ab8b9786df7e425e6f6bd93533589919f088518d31965"} err="failed to get container status \"1f31afa10aebb19b765ab8b9786df7e425e6f6bd93533589919f088518d31965\": rpc error: code = NotFound desc = could not find container \"1f31afa10aebb19b765ab8b9786df7e425e6f6bd93533589919f088518d31965\": container with ID starting with 1f31afa10aebb19b765ab8b9786df7e425e6f6bd93533589919f088518d31965 not found: ID does not exist" Oct 14 13:07:49.176097 master-2 kubenswrapper[4762]: I1014 13:07:49.175964 4762 scope.go:117] "RemoveContainer" containerID="ce2c76c5640bb76d593c7df356e2b23b40878a3d7b2b519b45f950a43efa105e" Oct 14 13:07:49.176512 master-2 kubenswrapper[4762]: E1014 13:07:49.176446 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce2c76c5640bb76d593c7df356e2b23b40878a3d7b2b519b45f950a43efa105e\": container with ID starting with ce2c76c5640bb76d593c7df356e2b23b40878a3d7b2b519b45f950a43efa105e not found: ID does not exist" containerID="ce2c76c5640bb76d593c7df356e2b23b40878a3d7b2b519b45f950a43efa105e" Oct 14 13:07:49.176512 master-2 kubenswrapper[4762]: I1014 13:07:49.176474 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce2c76c5640bb76d593c7df356e2b23b40878a3d7b2b519b45f950a43efa105e"} err="failed to get container status \"ce2c76c5640bb76d593c7df356e2b23b40878a3d7b2b519b45f950a43efa105e\": rpc error: code = NotFound desc = could not find container \"ce2c76c5640bb76d593c7df356e2b23b40878a3d7b2b519b45f950a43efa105e\": container with ID starting with 
ce2c76c5640bb76d593c7df356e2b23b40878a3d7b2b519b45f950a43efa105e not found: ID does not exist" Oct 14 13:07:49.176512 master-2 kubenswrapper[4762]: I1014 13:07:49.176491 4762 scope.go:117] "RemoveContainer" containerID="6e7af2c749207c088256f7e750c8e09235c4eff12577bc43829f252b95f2884b" Oct 14 13:07:49.176871 master-2 kubenswrapper[4762]: E1014 13:07:49.176811 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e7af2c749207c088256f7e750c8e09235c4eff12577bc43829f252b95f2884b\": container with ID starting with 6e7af2c749207c088256f7e750c8e09235c4eff12577bc43829f252b95f2884b not found: ID does not exist" containerID="6e7af2c749207c088256f7e750c8e09235c4eff12577bc43829f252b95f2884b" Oct 14 13:07:49.176955 master-2 kubenswrapper[4762]: I1014 13:07:49.176869 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e7af2c749207c088256f7e750c8e09235c4eff12577bc43829f252b95f2884b"} err="failed to get container status \"6e7af2c749207c088256f7e750c8e09235c4eff12577bc43829f252b95f2884b\": rpc error: code = NotFound desc = could not find container \"6e7af2c749207c088256f7e750c8e09235c4eff12577bc43829f252b95f2884b\": container with ID starting with 6e7af2c749207c088256f7e750c8e09235c4eff12577bc43829f252b95f2884b not found: ID does not exist" Oct 14 13:07:49.176955 master-2 kubenswrapper[4762]: I1014 13:07:49.176904 4762 scope.go:117] "RemoveContainer" containerID="19bf49b88afafea73feb8682c0fb99839af8e541f1c651b1f0d6224489eeaf99" Oct 14 13:07:49.177320 master-2 kubenswrapper[4762]: I1014 13:07:49.177271 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19bf49b88afafea73feb8682c0fb99839af8e541f1c651b1f0d6224489eeaf99"} err="failed to get container status \"19bf49b88afafea73feb8682c0fb99839af8e541f1c651b1f0d6224489eeaf99\": rpc error: code = NotFound desc = could not find container \"19bf49b88afafea73feb8682c0fb99839af8e541f1c651b1f0d6224489eeaf99\": container with ID starting with 19bf49b88afafea73feb8682c0fb99839af8e541f1c651b1f0d6224489eeaf99 not found: ID does not exist" Oct 14 13:07:49.177320 master-2 kubenswrapper[4762]: I1014 13:07:49.177298 4762 scope.go:117] "RemoveContainer" containerID="0d78d6095afabe0438eca485582d3a5e5643e0961cac715e7355bf90d498fd15" Oct 14 13:07:49.177757 master-2 kubenswrapper[4762]: I1014 13:07:49.177715 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d78d6095afabe0438eca485582d3a5e5643e0961cac715e7355bf90d498fd15"} err="failed to get container status \"0d78d6095afabe0438eca485582d3a5e5643e0961cac715e7355bf90d498fd15\": rpc error: code = NotFound desc = could not find container \"0d78d6095afabe0438eca485582d3a5e5643e0961cac715e7355bf90d498fd15\": container with ID starting with 0d78d6095afabe0438eca485582d3a5e5643e0961cac715e7355bf90d498fd15 not found: ID does not exist" Oct 14 13:07:49.177757 master-2 kubenswrapper[4762]: I1014 13:07:49.177738 4762 scope.go:117] "RemoveContainer" containerID="b9ae4c339e6d1505c7bff17d665c0ca9dfc9a584e38051299b439c39e3d425cb" Oct 14 13:07:49.178068 master-2 kubenswrapper[4762]: I1014 13:07:49.178011 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9ae4c339e6d1505c7bff17d665c0ca9dfc9a584e38051299b439c39e3d425cb"} err="failed to get container status \"b9ae4c339e6d1505c7bff17d665c0ca9dfc9a584e38051299b439c39e3d425cb\": rpc error: code = NotFound desc = could not find 
container \"b9ae4c339e6d1505c7bff17d665c0ca9dfc9a584e38051299b439c39e3d425cb\": container with ID starting with b9ae4c339e6d1505c7bff17d665c0ca9dfc9a584e38051299b439c39e3d425cb not found: ID does not exist" Oct 14 13:07:49.178068 master-2 kubenswrapper[4762]: I1014 13:07:49.178054 4762 scope.go:117] "RemoveContainer" containerID="8e369df7e7794fdb51cf6375c3ca8887e6897a156455d88b64cc25eab02dff7e" Oct 14 13:07:49.178537 master-2 kubenswrapper[4762]: I1014 13:07:49.178482 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e369df7e7794fdb51cf6375c3ca8887e6897a156455d88b64cc25eab02dff7e"} err="failed to get container status \"8e369df7e7794fdb51cf6375c3ca8887e6897a156455d88b64cc25eab02dff7e\": rpc error: code = NotFound desc = could not find container \"8e369df7e7794fdb51cf6375c3ca8887e6897a156455d88b64cc25eab02dff7e\": container with ID starting with 8e369df7e7794fdb51cf6375c3ca8887e6897a156455d88b64cc25eab02dff7e not found: ID does not exist" Oct 14 13:07:49.178537 master-2 kubenswrapper[4762]: I1014 13:07:49.178508 4762 scope.go:117] "RemoveContainer" containerID="717d972782394b130c70b16d840d826c8f3a1040ed136832c20d832786f95608" Oct 14 13:07:49.178858 master-2 kubenswrapper[4762]: I1014 13:07:49.178787 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"717d972782394b130c70b16d840d826c8f3a1040ed136832c20d832786f95608"} err="failed to get container status \"717d972782394b130c70b16d840d826c8f3a1040ed136832c20d832786f95608\": rpc error: code = NotFound desc = could not find container \"717d972782394b130c70b16d840d826c8f3a1040ed136832c20d832786f95608\": container with ID starting with 717d972782394b130c70b16d840d826c8f3a1040ed136832c20d832786f95608 not found: ID does not exist" Oct 14 13:07:49.178858 master-2 kubenswrapper[4762]: I1014 13:07:49.178834 4762 scope.go:117] "RemoveContainer" containerID="8aec5d6fee6b63deedf57e35507390f40ef5c3c9f49c67be73dbebe3b090e9fd" Oct 14 13:07:49.179275 master-2 kubenswrapper[4762]: I1014 13:07:49.179234 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8aec5d6fee6b63deedf57e35507390f40ef5c3c9f49c67be73dbebe3b090e9fd"} err="failed to get container status \"8aec5d6fee6b63deedf57e35507390f40ef5c3c9f49c67be73dbebe3b090e9fd\": rpc error: code = NotFound desc = could not find container \"8aec5d6fee6b63deedf57e35507390f40ef5c3c9f49c67be73dbebe3b090e9fd\": container with ID starting with 8aec5d6fee6b63deedf57e35507390f40ef5c3c9f49c67be73dbebe3b090e9fd not found: ID does not exist" Oct 14 13:07:49.179275 master-2 kubenswrapper[4762]: I1014 13:07:49.179260 4762 scope.go:117] "RemoveContainer" containerID="1f31afa10aebb19b765ab8b9786df7e425e6f6bd93533589919f088518d31965" Oct 14 13:07:49.179750 master-2 kubenswrapper[4762]: I1014 13:07:49.179684 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f31afa10aebb19b765ab8b9786df7e425e6f6bd93533589919f088518d31965"} err="failed to get container status \"1f31afa10aebb19b765ab8b9786df7e425e6f6bd93533589919f088518d31965\": rpc error: code = NotFound desc = could not find container \"1f31afa10aebb19b765ab8b9786df7e425e6f6bd93533589919f088518d31965\": container with ID starting with 1f31afa10aebb19b765ab8b9786df7e425e6f6bd93533589919f088518d31965 not found: ID does not exist" Oct 14 13:07:49.179750 master-2 kubenswrapper[4762]: I1014 13:07:49.179725 4762 scope.go:117] "RemoveContainer" 
containerID="ce2c76c5640bb76d593c7df356e2b23b40878a3d7b2b519b45f950a43efa105e" Oct 14 13:07:49.180083 master-2 kubenswrapper[4762]: I1014 13:07:49.180039 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce2c76c5640bb76d593c7df356e2b23b40878a3d7b2b519b45f950a43efa105e"} err="failed to get container status \"ce2c76c5640bb76d593c7df356e2b23b40878a3d7b2b519b45f950a43efa105e\": rpc error: code = NotFound desc = could not find container \"ce2c76c5640bb76d593c7df356e2b23b40878a3d7b2b519b45f950a43efa105e\": container with ID starting with ce2c76c5640bb76d593c7df356e2b23b40878a3d7b2b519b45f950a43efa105e not found: ID does not exist" Oct 14 13:07:49.180083 master-2 kubenswrapper[4762]: I1014 13:07:49.180065 4762 scope.go:117] "RemoveContainer" containerID="6e7af2c749207c088256f7e750c8e09235c4eff12577bc43829f252b95f2884b" Oct 14 13:07:49.180493 master-2 kubenswrapper[4762]: I1014 13:07:49.180421 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e7af2c749207c088256f7e750c8e09235c4eff12577bc43829f252b95f2884b"} err="failed to get container status \"6e7af2c749207c088256f7e750c8e09235c4eff12577bc43829f252b95f2884b\": rpc error: code = NotFound desc = could not find container \"6e7af2c749207c088256f7e750c8e09235c4eff12577bc43829f252b95f2884b\": container with ID starting with 6e7af2c749207c088256f7e750c8e09235c4eff12577bc43829f252b95f2884b not found: ID does not exist" Oct 14 13:07:49.180493 master-2 kubenswrapper[4762]: I1014 13:07:49.180457 4762 scope.go:117] "RemoveContainer" containerID="19bf49b88afafea73feb8682c0fb99839af8e541f1c651b1f0d6224489eeaf99" Oct 14 13:07:49.180841 master-2 kubenswrapper[4762]: I1014 13:07:49.180771 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19bf49b88afafea73feb8682c0fb99839af8e541f1c651b1f0d6224489eeaf99"} err="failed to get container status \"19bf49b88afafea73feb8682c0fb99839af8e541f1c651b1f0d6224489eeaf99\": rpc error: code = NotFound desc = could not find container \"19bf49b88afafea73feb8682c0fb99839af8e541f1c651b1f0d6224489eeaf99\": container with ID starting with 19bf49b88afafea73feb8682c0fb99839af8e541f1c651b1f0d6224489eeaf99 not found: ID does not exist" Oct 14 13:07:49.180841 master-2 kubenswrapper[4762]: I1014 13:07:49.180819 4762 scope.go:117] "RemoveContainer" containerID="0d78d6095afabe0438eca485582d3a5e5643e0961cac715e7355bf90d498fd15" Oct 14 13:07:49.181218 master-2 kubenswrapper[4762]: I1014 13:07:49.181185 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d78d6095afabe0438eca485582d3a5e5643e0961cac715e7355bf90d498fd15"} err="failed to get container status \"0d78d6095afabe0438eca485582d3a5e5643e0961cac715e7355bf90d498fd15\": rpc error: code = NotFound desc = could not find container \"0d78d6095afabe0438eca485582d3a5e5643e0961cac715e7355bf90d498fd15\": container with ID starting with 0d78d6095afabe0438eca485582d3a5e5643e0961cac715e7355bf90d498fd15 not found: ID does not exist" Oct 14 13:07:49.181218 master-2 kubenswrapper[4762]: I1014 13:07:49.181213 4762 scope.go:117] "RemoveContainer" containerID="b9ae4c339e6d1505c7bff17d665c0ca9dfc9a584e38051299b439c39e3d425cb" Oct 14 13:07:49.181648 master-2 kubenswrapper[4762]: I1014 13:07:49.181581 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9ae4c339e6d1505c7bff17d665c0ca9dfc9a584e38051299b439c39e3d425cb"} err="failed to get container status 
\"b9ae4c339e6d1505c7bff17d665c0ca9dfc9a584e38051299b439c39e3d425cb\": rpc error: code = NotFound desc = could not find container \"b9ae4c339e6d1505c7bff17d665c0ca9dfc9a584e38051299b439c39e3d425cb\": container with ID starting with b9ae4c339e6d1505c7bff17d665c0ca9dfc9a584e38051299b439c39e3d425cb not found: ID does not exist" Oct 14 13:07:49.181648 master-2 kubenswrapper[4762]: I1014 13:07:49.181625 4762 scope.go:117] "RemoveContainer" containerID="8e369df7e7794fdb51cf6375c3ca8887e6897a156455d88b64cc25eab02dff7e" Oct 14 13:07:49.181985 master-2 kubenswrapper[4762]: I1014 13:07:49.181943 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e369df7e7794fdb51cf6375c3ca8887e6897a156455d88b64cc25eab02dff7e"} err="failed to get container status \"8e369df7e7794fdb51cf6375c3ca8887e6897a156455d88b64cc25eab02dff7e\": rpc error: code = NotFound desc = could not find container \"8e369df7e7794fdb51cf6375c3ca8887e6897a156455d88b64cc25eab02dff7e\": container with ID starting with 8e369df7e7794fdb51cf6375c3ca8887e6897a156455d88b64cc25eab02dff7e not found: ID does not exist" Oct 14 13:07:49.181985 master-2 kubenswrapper[4762]: I1014 13:07:49.181969 4762 scope.go:117] "RemoveContainer" containerID="717d972782394b130c70b16d840d826c8f3a1040ed136832c20d832786f95608" Oct 14 13:07:49.182546 master-2 kubenswrapper[4762]: I1014 13:07:49.182471 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"717d972782394b130c70b16d840d826c8f3a1040ed136832c20d832786f95608"} err="failed to get container status \"717d972782394b130c70b16d840d826c8f3a1040ed136832c20d832786f95608\": rpc error: code = NotFound desc = could not find container \"717d972782394b130c70b16d840d826c8f3a1040ed136832c20d832786f95608\": container with ID starting with 717d972782394b130c70b16d840d826c8f3a1040ed136832c20d832786f95608 not found: ID does not exist" Oct 14 13:07:49.182546 master-2 kubenswrapper[4762]: I1014 13:07:49.182514 4762 scope.go:117] "RemoveContainer" containerID="8aec5d6fee6b63deedf57e35507390f40ef5c3c9f49c67be73dbebe3b090e9fd" Oct 14 13:07:49.182927 master-2 kubenswrapper[4762]: I1014 13:07:49.182861 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8aec5d6fee6b63deedf57e35507390f40ef5c3c9f49c67be73dbebe3b090e9fd"} err="failed to get container status \"8aec5d6fee6b63deedf57e35507390f40ef5c3c9f49c67be73dbebe3b090e9fd\": rpc error: code = NotFound desc = could not find container \"8aec5d6fee6b63deedf57e35507390f40ef5c3c9f49c67be73dbebe3b090e9fd\": container with ID starting with 8aec5d6fee6b63deedf57e35507390f40ef5c3c9f49c67be73dbebe3b090e9fd not found: ID does not exist" Oct 14 13:07:49.182927 master-2 kubenswrapper[4762]: I1014 13:07:49.182898 4762 scope.go:117] "RemoveContainer" containerID="1f31afa10aebb19b765ab8b9786df7e425e6f6bd93533589919f088518d31965" Oct 14 13:07:49.183316 master-2 kubenswrapper[4762]: I1014 13:07:49.183232 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f31afa10aebb19b765ab8b9786df7e425e6f6bd93533589919f088518d31965"} err="failed to get container status \"1f31afa10aebb19b765ab8b9786df7e425e6f6bd93533589919f088518d31965\": rpc error: code = NotFound desc = could not find container \"1f31afa10aebb19b765ab8b9786df7e425e6f6bd93533589919f088518d31965\": container with ID starting with 1f31afa10aebb19b765ab8b9786df7e425e6f6bd93533589919f088518d31965 not found: ID does not exist" Oct 14 13:07:49.183316 master-2 
kubenswrapper[4762]: I1014 13:07:49.183293 4762 scope.go:117] "RemoveContainer" containerID="ce2c76c5640bb76d593c7df356e2b23b40878a3d7b2b519b45f950a43efa105e" Oct 14 13:07:49.183758 master-2 kubenswrapper[4762]: I1014 13:07:49.183683 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce2c76c5640bb76d593c7df356e2b23b40878a3d7b2b519b45f950a43efa105e"} err="failed to get container status \"ce2c76c5640bb76d593c7df356e2b23b40878a3d7b2b519b45f950a43efa105e\": rpc error: code = NotFound desc = could not find container \"ce2c76c5640bb76d593c7df356e2b23b40878a3d7b2b519b45f950a43efa105e\": container with ID starting with ce2c76c5640bb76d593c7df356e2b23b40878a3d7b2b519b45f950a43efa105e not found: ID does not exist" Oct 14 13:07:49.183758 master-2 kubenswrapper[4762]: I1014 13:07:49.183727 4762 scope.go:117] "RemoveContainer" containerID="6e7af2c749207c088256f7e750c8e09235c4eff12577bc43829f252b95f2884b" Oct 14 13:07:49.184111 master-2 kubenswrapper[4762]: I1014 13:07:49.184063 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e7af2c749207c088256f7e750c8e09235c4eff12577bc43829f252b95f2884b"} err="failed to get container status \"6e7af2c749207c088256f7e750c8e09235c4eff12577bc43829f252b95f2884b\": rpc error: code = NotFound desc = could not find container \"6e7af2c749207c088256f7e750c8e09235c4eff12577bc43829f252b95f2884b\": container with ID starting with 6e7af2c749207c088256f7e750c8e09235c4eff12577bc43829f252b95f2884b not found: ID does not exist" Oct 14 13:07:49.184111 master-2 kubenswrapper[4762]: I1014 13:07:49.184094 4762 scope.go:117] "RemoveContainer" containerID="19bf49b88afafea73feb8682c0fb99839af8e541f1c651b1f0d6224489eeaf99" Oct 14 13:07:49.184662 master-2 kubenswrapper[4762]: I1014 13:07:49.184590 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19bf49b88afafea73feb8682c0fb99839af8e541f1c651b1f0d6224489eeaf99"} err="failed to get container status \"19bf49b88afafea73feb8682c0fb99839af8e541f1c651b1f0d6224489eeaf99\": rpc error: code = NotFound desc = could not find container \"19bf49b88afafea73feb8682c0fb99839af8e541f1c651b1f0d6224489eeaf99\": container with ID starting with 19bf49b88afafea73feb8682c0fb99839af8e541f1c651b1f0d6224489eeaf99 not found: ID does not exist" Oct 14 13:07:49.184662 master-2 kubenswrapper[4762]: I1014 13:07:49.184647 4762 scope.go:117] "RemoveContainer" containerID="0d78d6095afabe0438eca485582d3a5e5643e0961cac715e7355bf90d498fd15" Oct 14 13:07:49.185118 master-2 kubenswrapper[4762]: I1014 13:07:49.185043 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d78d6095afabe0438eca485582d3a5e5643e0961cac715e7355bf90d498fd15"} err="failed to get container status \"0d78d6095afabe0438eca485582d3a5e5643e0961cac715e7355bf90d498fd15\": rpc error: code = NotFound desc = could not find container \"0d78d6095afabe0438eca485582d3a5e5643e0961cac715e7355bf90d498fd15\": container with ID starting with 0d78d6095afabe0438eca485582d3a5e5643e0961cac715e7355bf90d498fd15 not found: ID does not exist" Oct 14 13:07:49.185118 master-2 kubenswrapper[4762]: I1014 13:07:49.185102 4762 scope.go:117] "RemoveContainer" containerID="b9ae4c339e6d1505c7bff17d665c0ca9dfc9a584e38051299b439c39e3d425cb" Oct 14 13:07:49.185561 master-2 kubenswrapper[4762]: I1014 13:07:49.185489 4762 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b9ae4c339e6d1505c7bff17d665c0ca9dfc9a584e38051299b439c39e3d425cb"} err="failed to get container status \"b9ae4c339e6d1505c7bff17d665c0ca9dfc9a584e38051299b439c39e3d425cb\": rpc error: code = NotFound desc = could not find container \"b9ae4c339e6d1505c7bff17d665c0ca9dfc9a584e38051299b439c39e3d425cb\": container with ID starting with b9ae4c339e6d1505c7bff17d665c0ca9dfc9a584e38051299b439c39e3d425cb not found: ID does not exist" Oct 14 13:07:49.185561 master-2 kubenswrapper[4762]: I1014 13:07:49.185538 4762 scope.go:117] "RemoveContainer" containerID="8e369df7e7794fdb51cf6375c3ca8887e6897a156455d88b64cc25eab02dff7e" Oct 14 13:07:49.185992 master-2 kubenswrapper[4762]: I1014 13:07:49.185928 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e369df7e7794fdb51cf6375c3ca8887e6897a156455d88b64cc25eab02dff7e"} err="failed to get container status \"8e369df7e7794fdb51cf6375c3ca8887e6897a156455d88b64cc25eab02dff7e\": rpc error: code = NotFound desc = could not find container \"8e369df7e7794fdb51cf6375c3ca8887e6897a156455d88b64cc25eab02dff7e\": container with ID starting with 8e369df7e7794fdb51cf6375c3ca8887e6897a156455d88b64cc25eab02dff7e not found: ID does not exist" Oct 14 13:07:49.185992 master-2 kubenswrapper[4762]: I1014 13:07:49.185977 4762 scope.go:117] "RemoveContainer" containerID="717d972782394b130c70b16d840d826c8f3a1040ed136832c20d832786f95608" Oct 14 13:07:49.186397 master-2 kubenswrapper[4762]: I1014 13:07:49.186328 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"717d972782394b130c70b16d840d826c8f3a1040ed136832c20d832786f95608"} err="failed to get container status \"717d972782394b130c70b16d840d826c8f3a1040ed136832c20d832786f95608\": rpc error: code = NotFound desc = could not find container \"717d972782394b130c70b16d840d826c8f3a1040ed136832c20d832786f95608\": container with ID starting with 717d972782394b130c70b16d840d826c8f3a1040ed136832c20d832786f95608 not found: ID does not exist" Oct 14 13:07:49.186397 master-2 kubenswrapper[4762]: I1014 13:07:49.186382 4762 scope.go:117] "RemoveContainer" containerID="8aec5d6fee6b63deedf57e35507390f40ef5c3c9f49c67be73dbebe3b090e9fd" Oct 14 13:07:49.186896 master-2 kubenswrapper[4762]: I1014 13:07:49.186834 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8aec5d6fee6b63deedf57e35507390f40ef5c3c9f49c67be73dbebe3b090e9fd"} err="failed to get container status \"8aec5d6fee6b63deedf57e35507390f40ef5c3c9f49c67be73dbebe3b090e9fd\": rpc error: code = NotFound desc = could not find container \"8aec5d6fee6b63deedf57e35507390f40ef5c3c9f49c67be73dbebe3b090e9fd\": container with ID starting with 8aec5d6fee6b63deedf57e35507390f40ef5c3c9f49c67be73dbebe3b090e9fd not found: ID does not exist" Oct 14 13:07:49.186896 master-2 kubenswrapper[4762]: I1014 13:07:49.186860 4762 scope.go:117] "RemoveContainer" containerID="1f31afa10aebb19b765ab8b9786df7e425e6f6bd93533589919f088518d31965" Oct 14 13:07:49.187247 master-2 kubenswrapper[4762]: I1014 13:07:49.187190 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f31afa10aebb19b765ab8b9786df7e425e6f6bd93533589919f088518d31965"} err="failed to get container status \"1f31afa10aebb19b765ab8b9786df7e425e6f6bd93533589919f088518d31965\": rpc error: code = NotFound desc = could not find container \"1f31afa10aebb19b765ab8b9786df7e425e6f6bd93533589919f088518d31965\": container with ID starting with 
1f31afa10aebb19b765ab8b9786df7e425e6f6bd93533589919f088518d31965 not found: ID does not exist" Oct 14 13:07:49.187247 master-2 kubenswrapper[4762]: I1014 13:07:49.187236 4762 scope.go:117] "RemoveContainer" containerID="ce2c76c5640bb76d593c7df356e2b23b40878a3d7b2b519b45f950a43efa105e" Oct 14 13:07:49.187635 master-2 kubenswrapper[4762]: I1014 13:07:49.187595 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce2c76c5640bb76d593c7df356e2b23b40878a3d7b2b519b45f950a43efa105e"} err="failed to get container status \"ce2c76c5640bb76d593c7df356e2b23b40878a3d7b2b519b45f950a43efa105e\": rpc error: code = NotFound desc = could not find container \"ce2c76c5640bb76d593c7df356e2b23b40878a3d7b2b519b45f950a43efa105e\": container with ID starting with ce2c76c5640bb76d593c7df356e2b23b40878a3d7b2b519b45f950a43efa105e not found: ID does not exist" Oct 14 13:07:49.187635 master-2 kubenswrapper[4762]: I1014 13:07:49.187620 4762 scope.go:117] "RemoveContainer" containerID="6e7af2c749207c088256f7e750c8e09235c4eff12577bc43829f252b95f2884b" Oct 14 13:07:49.187994 master-2 kubenswrapper[4762]: I1014 13:07:49.187916 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e7af2c749207c088256f7e750c8e09235c4eff12577bc43829f252b95f2884b"} err="failed to get container status \"6e7af2c749207c088256f7e750c8e09235c4eff12577bc43829f252b95f2884b\": rpc error: code = NotFound desc = could not find container \"6e7af2c749207c088256f7e750c8e09235c4eff12577bc43829f252b95f2884b\": container with ID starting with 6e7af2c749207c088256f7e750c8e09235c4eff12577bc43829f252b95f2884b not found: ID does not exist" Oct 14 13:07:49.187994 master-2 kubenswrapper[4762]: I1014 13:07:49.187977 4762 scope.go:117] "RemoveContainer" containerID="19bf49b88afafea73feb8682c0fb99839af8e541f1c651b1f0d6224489eeaf99" Oct 14 13:07:49.188355 master-2 kubenswrapper[4762]: I1014 13:07:49.188315 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19bf49b88afafea73feb8682c0fb99839af8e541f1c651b1f0d6224489eeaf99"} err="failed to get container status \"19bf49b88afafea73feb8682c0fb99839af8e541f1c651b1f0d6224489eeaf99\": rpc error: code = NotFound desc = could not find container \"19bf49b88afafea73feb8682c0fb99839af8e541f1c651b1f0d6224489eeaf99\": container with ID starting with 19bf49b88afafea73feb8682c0fb99839af8e541f1c651b1f0d6224489eeaf99 not found: ID does not exist" Oct 14 13:07:49.188355 master-2 kubenswrapper[4762]: I1014 13:07:49.188342 4762 scope.go:117] "RemoveContainer" containerID="0d78d6095afabe0438eca485582d3a5e5643e0961cac715e7355bf90d498fd15" Oct 14 13:07:49.188748 master-2 kubenswrapper[4762]: I1014 13:07:49.188675 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d78d6095afabe0438eca485582d3a5e5643e0961cac715e7355bf90d498fd15"} err="failed to get container status \"0d78d6095afabe0438eca485582d3a5e5643e0961cac715e7355bf90d498fd15\": rpc error: code = NotFound desc = could not find container \"0d78d6095afabe0438eca485582d3a5e5643e0961cac715e7355bf90d498fd15\": container with ID starting with 0d78d6095afabe0438eca485582d3a5e5643e0961cac715e7355bf90d498fd15 not found: ID does not exist" Oct 14 13:07:49.188748 master-2 kubenswrapper[4762]: I1014 13:07:49.188726 4762 scope.go:117] "RemoveContainer" containerID="b9ae4c339e6d1505c7bff17d665c0ca9dfc9a584e38051299b439c39e3d425cb" Oct 14 13:07:49.189109 master-2 kubenswrapper[4762]: I1014 13:07:49.189035 4762 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9ae4c339e6d1505c7bff17d665c0ca9dfc9a584e38051299b439c39e3d425cb"} err="failed to get container status \"b9ae4c339e6d1505c7bff17d665c0ca9dfc9a584e38051299b439c39e3d425cb\": rpc error: code = NotFound desc = could not find container \"b9ae4c339e6d1505c7bff17d665c0ca9dfc9a584e38051299b439c39e3d425cb\": container with ID starting with b9ae4c339e6d1505c7bff17d665c0ca9dfc9a584e38051299b439c39e3d425cb not found: ID does not exist" Oct 14 13:07:49.189109 master-2 kubenswrapper[4762]: I1014 13:07:49.189087 4762 scope.go:117] "RemoveContainer" containerID="8e369df7e7794fdb51cf6375c3ca8887e6897a156455d88b64cc25eab02dff7e" Oct 14 13:07:49.189600 master-2 kubenswrapper[4762]: I1014 13:07:49.189540 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e369df7e7794fdb51cf6375c3ca8887e6897a156455d88b64cc25eab02dff7e"} err="failed to get container status \"8e369df7e7794fdb51cf6375c3ca8887e6897a156455d88b64cc25eab02dff7e\": rpc error: code = NotFound desc = could not find container \"8e369df7e7794fdb51cf6375c3ca8887e6897a156455d88b64cc25eab02dff7e\": container with ID starting with 8e369df7e7794fdb51cf6375c3ca8887e6897a156455d88b64cc25eab02dff7e not found: ID does not exist" Oct 14 13:07:49.189600 master-2 kubenswrapper[4762]: I1014 13:07:49.189583 4762 scope.go:117] "RemoveContainer" containerID="717d972782394b130c70b16d840d826c8f3a1040ed136832c20d832786f95608" Oct 14 13:07:49.189977 master-2 kubenswrapper[4762]: I1014 13:07:49.189917 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"717d972782394b130c70b16d840d826c8f3a1040ed136832c20d832786f95608"} err="failed to get container status \"717d972782394b130c70b16d840d826c8f3a1040ed136832c20d832786f95608\": rpc error: code = NotFound desc = could not find container \"717d972782394b130c70b16d840d826c8f3a1040ed136832c20d832786f95608\": container with ID starting with 717d972782394b130c70b16d840d826c8f3a1040ed136832c20d832786f95608 not found: ID does not exist" Oct 14 13:07:49.189977 master-2 kubenswrapper[4762]: I1014 13:07:49.189962 4762 scope.go:117] "RemoveContainer" containerID="8aec5d6fee6b63deedf57e35507390f40ef5c3c9f49c67be73dbebe3b090e9fd" Oct 14 13:07:49.190478 master-2 kubenswrapper[4762]: I1014 13:07:49.190437 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8aec5d6fee6b63deedf57e35507390f40ef5c3c9f49c67be73dbebe3b090e9fd"} err="failed to get container status \"8aec5d6fee6b63deedf57e35507390f40ef5c3c9f49c67be73dbebe3b090e9fd\": rpc error: code = NotFound desc = could not find container \"8aec5d6fee6b63deedf57e35507390f40ef5c3c9f49c67be73dbebe3b090e9fd\": container with ID starting with 8aec5d6fee6b63deedf57e35507390f40ef5c3c9f49c67be73dbebe3b090e9fd not found: ID does not exist" Oct 14 13:07:49.548075 master-2 kubenswrapper[4762]: I1014 13:07:49.547971 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b84p7" Oct 14 13:07:49.548425 master-2 kubenswrapper[4762]: I1014 13:07:49.548080 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cb5bh" Oct 14 13:07:49.548425 master-2 kubenswrapper[4762]: E1014 13:07:49.548223 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b84p7" podUID="c5e8cdcd-bc1f-4b38-a834-809b79de4fd9" Oct 14 13:07:49.548425 master-2 kubenswrapper[4762]: E1014 13:07:49.548316 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cb5bh" podUID="f841b8dd-c459-4e20-b11a-9169905ad069" Oct 14 13:07:49.554263 master-2 kubenswrapper[4762]: I1014 13:07:49.554208 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc445c69-944a-42d6-bb2e-53b0a745f970" path="/var/lib/kubelet/pods/fc445c69-944a-42d6-bb2e-53b0a745f970/volumes" Oct 14 13:07:49.959524 master-2 kubenswrapper[4762]: I1014 13:07:49.959414 4762 generic.go:334] "Generic (PLEG): container finished" podID="5a8ea9f0-8c47-4230-82cc-19ba4debe407" containerID="8a36ec8922ed65ff7893e21c623cbabeaeb91298bdbac53ce911a00db43bf79e" exitCode=0 Oct 14 13:07:49.959524 master-2 kubenswrapper[4762]: I1014 13:07:49.959488 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" event={"ID":"5a8ea9f0-8c47-4230-82cc-19ba4debe407","Type":"ContainerDied","Data":"8a36ec8922ed65ff7893e21c623cbabeaeb91298bdbac53ce911a00db43bf79e"} Oct 14 13:07:50.972333 master-2 kubenswrapper[4762]: I1014 13:07:50.971891 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" event={"ID":"5a8ea9f0-8c47-4230-82cc-19ba4debe407","Type":"ContainerStarted","Data":"e85110fb5a1f7fa8d158eb39217dd32f8e8b74a042bc7e7119d92e08d24bcd30"} Oct 14 13:07:50.972333 master-2 kubenswrapper[4762]: I1014 13:07:50.972301 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" event={"ID":"5a8ea9f0-8c47-4230-82cc-19ba4debe407","Type":"ContainerStarted","Data":"fd927bc71581c60c38a4c18ca7ab0b045c31f6a5345d29caae7a65efeff1bcf3"} Oct 14 13:07:50.972333 master-2 kubenswrapper[4762]: I1014 13:07:50.972324 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" event={"ID":"5a8ea9f0-8c47-4230-82cc-19ba4debe407","Type":"ContainerStarted","Data":"51f4c65d2e20b29d0c1b01b7ad5288aff512aacba24577d73d950851ed2e8c32"} Oct 14 13:07:50.972333 master-2 kubenswrapper[4762]: I1014 13:07:50.972341 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" event={"ID":"5a8ea9f0-8c47-4230-82cc-19ba4debe407","Type":"ContainerStarted","Data":"7e012188ef185c6f52bcea916cbfa6edea2c81015563b9c67ffb30b1d8918356"} Oct 14 13:07:50.973581 master-2 kubenswrapper[4762]: I1014 13:07:50.972358 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" 
event={"ID":"5a8ea9f0-8c47-4230-82cc-19ba4debe407","Type":"ContainerStarted","Data":"80a1592e5f950a5be8591f775b540a1797997267deffa5664c89d94f660fa962"} Oct 14 13:07:50.973581 master-2 kubenswrapper[4762]: I1014 13:07:50.972376 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" event={"ID":"5a8ea9f0-8c47-4230-82cc-19ba4debe407","Type":"ContainerStarted","Data":"62bd911c96d92eb232e058255595236f5ed93464c45015632c4bd34273ec38cd"} Oct 14 13:07:51.547806 master-2 kubenswrapper[4762]: I1014 13:07:51.547719 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b84p7" Oct 14 13:07:51.549314 master-2 kubenswrapper[4762]: E1014 13:07:51.549241 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b84p7" podUID="c5e8cdcd-bc1f-4b38-a834-809b79de4fd9" Oct 14 13:07:51.549432 master-2 kubenswrapper[4762]: I1014 13:07:51.549336 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cb5bh" Oct 14 13:07:51.549526 master-2 kubenswrapper[4762]: E1014 13:07:51.549481 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cb5bh" podUID="f841b8dd-c459-4e20-b11a-9169905ad069" Oct 14 13:07:52.548668 master-2 kubenswrapper[4762]: I1014 13:07:52.548556 4762 scope.go:117] "RemoveContainer" containerID="24baf780158d33f49e8bc55431e7447f00a45df3ba8830080f7795c97b34c03b" Oct 14 13:07:52.549567 master-2 kubenswrapper[4762]: E1014 13:07:52.548965 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-rbac-proxy pod=cluster-cloud-controller-manager-operator-779749f859-bscv5_openshift-cloud-controller-manager-operator(18346e46-a062-4e0d-b90a-c05646a46c7e)\"" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-779749f859-bscv5" podUID="18346e46-a062-4e0d-b90a-c05646a46c7e" Oct 14 13:07:52.987532 master-2 kubenswrapper[4762]: I1014 13:07:52.987428 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" event={"ID":"5a8ea9f0-8c47-4230-82cc-19ba4debe407","Type":"ContainerStarted","Data":"3e201fc8244d6a6ba573956b6cabeda57a7e08a24ce1d10e829a60b95e4483a5"} Oct 14 13:07:53.548432 master-2 kubenswrapper[4762]: I1014 13:07:53.548223 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cb5bh" Oct 14 13:07:53.548432 master-2 kubenswrapper[4762]: E1014 13:07:53.548335 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-cb5bh" podUID="f841b8dd-c459-4e20-b11a-9169905ad069" Oct 14 13:07:53.548721 master-2 kubenswrapper[4762]: I1014 13:07:53.548470 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b84p7" Oct 14 13:07:53.548721 master-2 kubenswrapper[4762]: E1014 13:07:53.548679 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b84p7" podUID="c5e8cdcd-bc1f-4b38-a834-809b79de4fd9" Oct 14 13:07:55.547802 master-2 kubenswrapper[4762]: I1014 13:07:55.547738 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cb5bh" Oct 14 13:07:55.547802 master-2 kubenswrapper[4762]: I1014 13:07:55.547769 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b84p7" Oct 14 13:07:55.548344 master-2 kubenswrapper[4762]: E1014 13:07:55.547946 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cb5bh" podUID="f841b8dd-c459-4e20-b11a-9169905ad069" Oct 14 13:07:55.548344 master-2 kubenswrapper[4762]: E1014 13:07:55.548182 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-b84p7" podUID="c5e8cdcd-bc1f-4b38-a834-809b79de4fd9" Oct 14 13:07:56.006428 master-2 kubenswrapper[4762]: I1014 13:07:56.006095 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" event={"ID":"5a8ea9f0-8c47-4230-82cc-19ba4debe407","Type":"ContainerStarted","Data":"180f1d8444996cb7a7543424d77423512c5acebcc2110a221a1ff3dbc5fed09f"} Oct 14 13:07:56.006693 master-2 kubenswrapper[4762]: I1014 13:07:56.006616 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:56.006693 master-2 kubenswrapper[4762]: I1014 13:07:56.006672 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:56.006693 master-2 kubenswrapper[4762]: I1014 13:07:56.006695 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:07:56.040876 master-2 kubenswrapper[4762]: I1014 13:07:56.040748 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" podStartSLOduration=8.040715556 podStartE2EDuration="8.040715556s" podCreationTimestamp="2025-10-14 13:07:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:07:56.037917197 +0000 UTC m=+105.282076386" watchObservedRunningTime="2025-10-14 13:07:56.040715556 +0000 UTC m=+105.284874755" Oct 14 13:07:56.511101 master-2 kubenswrapper[4762]: I1014 13:07:56.510683 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spkpp\" (UniqueName: \"kubernetes.io/projected/f841b8dd-c459-4e20-b11a-9169905ad069-kube-api-access-spkpp\") pod \"network-check-target-cb5bh\" (UID: \"f841b8dd-c459-4e20-b11a-9169905ad069\") " pod="openshift-network-diagnostics/network-check-target-cb5bh" Oct 14 13:07:56.511392 master-2 kubenswrapper[4762]: E1014 13:07:56.510975 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 14 13:07:56.511392 master-2 kubenswrapper[4762]: E1014 13:07:56.511207 4762 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 14 13:07:56.511392 master-2 kubenswrapper[4762]: E1014 13:07:56.511233 4762 projected.go:194] Error preparing data for projected volume kube-api-access-spkpp for pod openshift-network-diagnostics/network-check-target-cb5bh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 14 13:07:56.511392 master-2 kubenswrapper[4762]: E1014 13:07:56.511314 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f841b8dd-c459-4e20-b11a-9169905ad069-kube-api-access-spkpp podName:f841b8dd-c459-4e20-b11a-9169905ad069 nodeName:}" failed. No retries permitted until 2025-10-14 13:08:28.511291969 +0000 UTC m=+137.755451168 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-spkpp" (UniqueName: "kubernetes.io/projected/f841b8dd-c459-4e20-b11a-9169905ad069-kube-api-access-spkpp") pod "network-check-target-cb5bh" (UID: "f841b8dd-c459-4e20-b11a-9169905ad069") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 14 13:07:57.548593 master-2 kubenswrapper[4762]: I1014 13:07:57.548493 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cb5bh" Oct 14 13:07:57.548593 master-2 kubenswrapper[4762]: I1014 13:07:57.548556 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b84p7" Oct 14 13:07:57.549545 master-2 kubenswrapper[4762]: E1014 13:07:57.548715 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cb5bh" podUID="f841b8dd-c459-4e20-b11a-9169905ad069" Oct 14 13:07:57.549545 master-2 kubenswrapper[4762]: E1014 13:07:57.548854 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b84p7" podUID="c5e8cdcd-bc1f-4b38-a834-809b79de4fd9" Oct 14 13:07:59.548126 master-2 kubenswrapper[4762]: I1014 13:07:59.548027 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cb5bh" Oct 14 13:07:59.548996 master-2 kubenswrapper[4762]: I1014 13:07:59.548026 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b84p7" Oct 14 13:07:59.548996 master-2 kubenswrapper[4762]: E1014 13:07:59.548246 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cb5bh" podUID="f841b8dd-c459-4e20-b11a-9169905ad069" Oct 14 13:07:59.548996 master-2 kubenswrapper[4762]: E1014 13:07:59.548498 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-b84p7" podUID="c5e8cdcd-bc1f-4b38-a834-809b79de4fd9" Oct 14 13:08:00.153674 master-2 kubenswrapper[4762]: I1014 13:08:00.153451 4762 kubelet_node_status.go:724] "Recording event message for node" node="master-2" event="NodeReady" Oct 14 13:08:00.153674 master-2 kubenswrapper[4762]: I1014 13:08:00.153623 4762 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Oct 14 13:08:00.182418 master-2 kubenswrapper[4762]: I1014 13:08:00.182329 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-m7hdw"] Oct 14 13:08:00.182802 master-2 kubenswrapper[4762]: I1014 13:08:00.182758 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-m7hdw" Oct 14 13:08:00.185603 master-2 kubenswrapper[4762]: I1014 13:08:00.185538 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Oct 14 13:08:00.185793 master-2 kubenswrapper[4762]: I1014 13:08:00.185675 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Oct 14 13:08:00.186433 master-2 kubenswrapper[4762]: I1014 13:08:00.186394 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Oct 14 13:08:00.244669 master-2 kubenswrapper[4762]: I1014 13:08:00.244511 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z5h5\" (UniqueName: \"kubernetes.io/projected/8ebdc7cc-3fe5-43b0-806c-62cca5fde537-kube-api-access-2z5h5\") pod \"iptables-alerter-m7hdw\" (UID: \"8ebdc7cc-3fe5-43b0-806c-62cca5fde537\") " pod="openshift-network-operator/iptables-alerter-m7hdw" Oct 14 13:08:00.244669 master-2 kubenswrapper[4762]: I1014 13:08:00.244624 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8ebdc7cc-3fe5-43b0-806c-62cca5fde537-host-slash\") pod \"iptables-alerter-m7hdw\" (UID: \"8ebdc7cc-3fe5-43b0-806c-62cca5fde537\") " pod="openshift-network-operator/iptables-alerter-m7hdw" Oct 14 13:08:00.245056 master-2 kubenswrapper[4762]: I1014 13:08:00.244727 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8ebdc7cc-3fe5-43b0-806c-62cca5fde537-iptables-alerter-script\") pod \"iptables-alerter-m7hdw\" (UID: \"8ebdc7cc-3fe5-43b0-806c-62cca5fde537\") " pod="openshift-network-operator/iptables-alerter-m7hdw" Oct 14 13:08:00.346079 master-2 kubenswrapper[4762]: I1014 13:08:00.345952 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8ebdc7cc-3fe5-43b0-806c-62cca5fde537-host-slash\") pod \"iptables-alerter-m7hdw\" (UID: \"8ebdc7cc-3fe5-43b0-806c-62cca5fde537\") " pod="openshift-network-operator/iptables-alerter-m7hdw" Oct 14 13:08:00.346434 master-2 kubenswrapper[4762]: I1014 13:08:00.346109 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8ebdc7cc-3fe5-43b0-806c-62cca5fde537-iptables-alerter-script\") pod \"iptables-alerter-m7hdw\" (UID: \"8ebdc7cc-3fe5-43b0-806c-62cca5fde537\") " pod="openshift-network-operator/iptables-alerter-m7hdw" Oct 14 
13:08:00.346434 master-2 kubenswrapper[4762]: I1014 13:08:00.346123 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8ebdc7cc-3fe5-43b0-806c-62cca5fde537-host-slash\") pod \"iptables-alerter-m7hdw\" (UID: \"8ebdc7cc-3fe5-43b0-806c-62cca5fde537\") " pod="openshift-network-operator/iptables-alerter-m7hdw" Oct 14 13:08:00.346434 master-2 kubenswrapper[4762]: I1014 13:08:00.346223 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2z5h5\" (UniqueName: \"kubernetes.io/projected/8ebdc7cc-3fe5-43b0-806c-62cca5fde537-kube-api-access-2z5h5\") pod \"iptables-alerter-m7hdw\" (UID: \"8ebdc7cc-3fe5-43b0-806c-62cca5fde537\") " pod="openshift-network-operator/iptables-alerter-m7hdw" Oct 14 13:08:00.347704 master-2 kubenswrapper[4762]: I1014 13:08:00.347626 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8ebdc7cc-3fe5-43b0-806c-62cca5fde537-iptables-alerter-script\") pod \"iptables-alerter-m7hdw\" (UID: \"8ebdc7cc-3fe5-43b0-806c-62cca5fde537\") " pod="openshift-network-operator/iptables-alerter-m7hdw" Oct 14 13:08:00.378848 master-2 kubenswrapper[4762]: I1014 13:08:00.378752 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z5h5\" (UniqueName: \"kubernetes.io/projected/8ebdc7cc-3fe5-43b0-806c-62cca5fde537-kube-api-access-2z5h5\") pod \"iptables-alerter-m7hdw\" (UID: \"8ebdc7cc-3fe5-43b0-806c-62cca5fde537\") " pod="openshift-network-operator/iptables-alerter-m7hdw" Oct 14 13:08:00.512269 master-2 kubenswrapper[4762]: I1014 13:08:00.512128 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-m7hdw" Oct 14 13:08:00.532898 master-2 kubenswrapper[4762]: W1014 13:08:00.532844 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ebdc7cc_3fe5_43b0_806c_62cca5fde537.slice/crio-6a00cfa5e7b708c3e4755c6e6a31bf11335057cfbf80b82ef5c5b9e4b441a87c WatchSource:0}: Error finding container 6a00cfa5e7b708c3e4755c6e6a31bf11335057cfbf80b82ef5c5b9e4b441a87c: Status 404 returned error can't find the container with id 6a00cfa5e7b708c3e4755c6e6a31bf11335057cfbf80b82ef5c5b9e4b441a87c Oct 14 13:08:01.023348 master-2 kubenswrapper[4762]: I1014 13:08:01.023279 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-m7hdw" event={"ID":"8ebdc7cc-3fe5-43b0-806c-62cca5fde537","Type":"ContainerStarted","Data":"6a00cfa5e7b708c3e4755c6e6a31bf11335057cfbf80b82ef5c5b9e4b441a87c"} Oct 14 13:08:01.548174 master-2 kubenswrapper[4762]: I1014 13:08:01.548110 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b84p7" Oct 14 13:08:01.548174 master-2 kubenswrapper[4762]: I1014 13:08:01.548110 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cb5bh" Oct 14 13:08:01.553825 master-2 kubenswrapper[4762]: I1014 13:08:01.553356 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 14 13:08:01.553825 master-2 kubenswrapper[4762]: I1014 13:08:01.553359 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 14 13:08:01.553825 master-2 kubenswrapper[4762]: I1014 13:08:01.553536 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 14 13:08:05.549309 master-2 kubenswrapper[4762]: I1014 13:08:05.548871 4762 scope.go:117] "RemoveContainer" containerID="24baf780158d33f49e8bc55431e7447f00a45df3ba8830080f7795c97b34c03b" Oct 14 13:08:05.550655 master-2 kubenswrapper[4762]: E1014 13:08:05.549418 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-rbac-proxy pod=cluster-cloud-controller-manager-operator-779749f859-bscv5_openshift-cloud-controller-manager-operator(18346e46-a062-4e0d-b90a-c05646a46c7e)\"" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-779749f859-bscv5" podUID="18346e46-a062-4e0d-b90a-c05646a46c7e" Oct 14 13:08:08.048562 master-2 kubenswrapper[4762]: I1014 13:08:08.048464 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-m7hdw" event={"ID":"8ebdc7cc-3fe5-43b0-806c-62cca5fde537","Type":"ContainerStarted","Data":"41b9cdc937353c77a8ad673b66d376c4b1e451a2d0eaf4c2e8f0c7058650875f"} Oct 14 13:08:08.065014 master-2 kubenswrapper[4762]: I1014 13:08:08.064880 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-m7hdw" podStartSLOduration=4.873015393 podStartE2EDuration="8.064850818s" podCreationTimestamp="2025-10-14 13:08:00 +0000 UTC" firstStartedPulling="2025-10-14 13:08:00.535804575 +0000 UTC m=+109.779963734" lastFinishedPulling="2025-10-14 13:08:03.72764 +0000 UTC m=+112.971799159" observedRunningTime="2025-10-14 13:08:08.06336505 +0000 UTC m=+117.307524249" watchObservedRunningTime="2025-10-14 13:08:08.064850818 +0000 UTC m=+117.309010007" Oct 14 13:08:08.851840 master-2 kubenswrapper[4762]: I1014 13:08:08.851789 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:08:08.852875 master-2 kubenswrapper[4762]: I1014 13:08:08.852812 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:08:08.873294 master-2 kubenswrapper[4762]: I1014 13:08:08.873238 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4cthp" Oct 14 13:08:10.417305 master-2 kubenswrapper[4762]: I1014 13:08:10.417208 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["assisted-installer/master-2-debug-xl9pk"] Oct 14 13:08:10.418094 master-2 kubenswrapper[4762]: I1014 13:08:10.417586 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="assisted-installer/master-2-debug-xl9pk" Oct 14 13:08:10.421259 master-2 kubenswrapper[4762]: I1014 13:08:10.421197 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"assisted-installer"/"kube-root-ca.crt" Oct 14 13:08:10.421460 master-2 kubenswrapper[4762]: I1014 13:08:10.421402 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"assisted-installer"/"openshift-service-ca.crt" Oct 14 13:08:10.446996 master-2 kubenswrapper[4762]: I1014 13:08:10.446935 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/52e27da8-fd29-4cd8-ab27-7ae79aa112c3-host\") pod \"master-2-debug-xl9pk\" (UID: \"52e27da8-fd29-4cd8-ab27-7ae79aa112c3\") " pod="assisted-installer/master-2-debug-xl9pk" Oct 14 13:08:10.447241 master-2 kubenswrapper[4762]: I1014 13:08:10.446998 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27zgn\" (UniqueName: \"kubernetes.io/projected/52e27da8-fd29-4cd8-ab27-7ae79aa112c3-kube-api-access-27zgn\") pod \"master-2-debug-xl9pk\" (UID: \"52e27da8-fd29-4cd8-ab27-7ae79aa112c3\") " pod="assisted-installer/master-2-debug-xl9pk" Oct 14 13:08:10.547657 master-2 kubenswrapper[4762]: I1014 13:08:10.547589 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/52e27da8-fd29-4cd8-ab27-7ae79aa112c3-host\") pod \"master-2-debug-xl9pk\" (UID: \"52e27da8-fd29-4cd8-ab27-7ae79aa112c3\") " pod="assisted-installer/master-2-debug-xl9pk" Oct 14 13:08:10.547970 master-2 kubenswrapper[4762]: I1014 13:08:10.547675 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27zgn\" (UniqueName: \"kubernetes.io/projected/52e27da8-fd29-4cd8-ab27-7ae79aa112c3-kube-api-access-27zgn\") pod \"master-2-debug-xl9pk\" (UID: \"52e27da8-fd29-4cd8-ab27-7ae79aa112c3\") " pod="assisted-installer/master-2-debug-xl9pk" Oct 14 13:08:10.548349 master-2 kubenswrapper[4762]: I1014 13:08:10.548292 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/52e27da8-fd29-4cd8-ab27-7ae79aa112c3-host\") pod \"master-2-debug-xl9pk\" (UID: \"52e27da8-fd29-4cd8-ab27-7ae79aa112c3\") " pod="assisted-installer/master-2-debug-xl9pk" Oct 14 13:08:10.576120 master-2 kubenswrapper[4762]: I1014 13:08:10.576011 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27zgn\" (UniqueName: \"kubernetes.io/projected/52e27da8-fd29-4cd8-ab27-7ae79aa112c3-kube-api-access-27zgn\") pod \"master-2-debug-xl9pk\" (UID: \"52e27da8-fd29-4cd8-ab27-7ae79aa112c3\") " pod="assisted-installer/master-2-debug-xl9pk" Oct 14 13:08:10.747543 master-2 kubenswrapper[4762]: I1014 13:08:10.747328 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="assisted-installer/master-2-debug-xl9pk" Oct 14 13:08:11.059389 master-2 kubenswrapper[4762]: I1014 13:08:11.059230 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/master-2-debug-xl9pk" event={"ID":"52e27da8-fd29-4cd8-ab27-7ae79aa112c3","Type":"ContainerStarted","Data":"cc5c3b228e373fd0cc007459299eb033686f080bf2e05f6b1bf82d1ab2bbf4be"} Oct 14 13:08:11.110912 master-2 kubenswrapper[4762]: I1014 13:08:11.110847 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-ddd7d64cd-hph6v"] Oct 14 13:08:11.111352 master-2 kubenswrapper[4762]: I1014 13:08:11.111317 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-ddd7d64cd-hph6v" Oct 14 13:08:11.114372 master-2 kubenswrapper[4762]: I1014 13:08:11.113801 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt" Oct 14 13:08:11.114372 master-2 kubenswrapper[4762]: I1014 13:08:11.114219 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt" Oct 14 13:08:11.118580 master-2 kubenswrapper[4762]: I1014 13:08:11.118521 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-ddd7d64cd-hph6v"] Oct 14 13:08:11.253342 master-2 kubenswrapper[4762]: I1014 13:08:11.253021 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nj97\" (UniqueName: \"kubernetes.io/projected/0500c75a-3460-4279-a8d8-cebf242e6089-kube-api-access-2nj97\") pod \"csi-snapshot-controller-ddd7d64cd-hph6v\" (UID: \"0500c75a-3460-4279-a8d8-cebf242e6089\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-ddd7d64cd-hph6v" Oct 14 13:08:11.353880 master-2 kubenswrapper[4762]: I1014 13:08:11.353612 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nj97\" (UniqueName: \"kubernetes.io/projected/0500c75a-3460-4279-a8d8-cebf242e6089-kube-api-access-2nj97\") pod \"csi-snapshot-controller-ddd7d64cd-hph6v\" (UID: \"0500c75a-3460-4279-a8d8-cebf242e6089\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-ddd7d64cd-hph6v" Oct 14 13:08:11.370323 master-2 kubenswrapper[4762]: I1014 13:08:11.370236 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt" Oct 14 13:08:11.381121 master-2 kubenswrapper[4762]: I1014 13:08:11.381035 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt" Oct 14 13:08:11.395132 master-2 kubenswrapper[4762]: I1014 13:08:11.395006 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nj97\" (UniqueName: \"kubernetes.io/projected/0500c75a-3460-4279-a8d8-cebf242e6089-kube-api-access-2nj97\") pod \"csi-snapshot-controller-ddd7d64cd-hph6v\" (UID: \"0500c75a-3460-4279-a8d8-cebf242e6089\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-ddd7d64cd-hph6v" Oct 14 13:08:11.437104 master-2 kubenswrapper[4762]: I1014 13:08:11.436991 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-ddd7d64cd-hph6v" Oct 14 13:08:11.682206 master-2 kubenswrapper[4762]: I1014 13:08:11.681630 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-ddd7d64cd-hph6v"] Oct 14 13:08:11.692931 master-2 kubenswrapper[4762]: W1014 13:08:11.692871 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0500c75a_3460_4279_a8d8_cebf242e6089.slice/crio-200ffddf9bb9048b1d69d3c37250041cdb536551d09475796cab9044d2c98b23 WatchSource:0}: Error finding container 200ffddf9bb9048b1d69d3c37250041cdb536551d09475796cab9044d2c98b23: Status 404 returned error can't find the container with id 200ffddf9bb9048b1d69d3c37250041cdb536551d09475796cab9044d2c98b23 Oct 14 13:08:12.064664 master-2 kubenswrapper[4762]: I1014 13:08:12.064521 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-ddd7d64cd-hph6v" event={"ID":"0500c75a-3460-4279-a8d8-cebf242e6089","Type":"ContainerStarted","Data":"200ffddf9bb9048b1d69d3c37250041cdb536551d09475796cab9044d2c98b23"} Oct 14 13:08:12.113063 master-2 kubenswrapper[4762]: I1014 13:08:12.113018 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5d9b59775c-5nrm2"] Oct 14 13:08:12.114134 master-2 kubenswrapper[4762]: I1014 13:08:12.114039 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5d9b59775c-5nrm2" Oct 14 13:08:12.117686 master-2 kubenswrapper[4762]: I1014 13:08:12.117638 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 14 13:08:12.117861 master-2 kubenswrapper[4762]: I1014 13:08:12.117688 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 14 13:08:12.117861 master-2 kubenswrapper[4762]: I1014 13:08:12.117698 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 14 13:08:12.117861 master-2 kubenswrapper[4762]: I1014 13:08:12.117743 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 14 13:08:12.118775 master-2 kubenswrapper[4762]: I1014 13:08:12.118711 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 14 13:08:12.118962 master-2 kubenswrapper[4762]: I1014 13:08:12.118779 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 14 13:08:12.124373 master-2 kubenswrapper[4762]: I1014 13:08:12.124313 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5d9b59775c-5nrm2"] Oct 14 13:08:12.262510 master-2 kubenswrapper[4762]: I1014 13:08:12.262305 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ecce0f57-69a3-448a-8ed8-55822aa21231-client-ca\") pod \"controller-manager-5d9b59775c-5nrm2\" (UID: \"ecce0f57-69a3-448a-8ed8-55822aa21231\") " pod="openshift-controller-manager/controller-manager-5d9b59775c-5nrm2" Oct 14 13:08:12.262510 master-2 kubenswrapper[4762]: I1014 13:08:12.262385 4762 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecce0f57-69a3-448a-8ed8-55822aa21231-config\") pod \"controller-manager-5d9b59775c-5nrm2\" (UID: \"ecce0f57-69a3-448a-8ed8-55822aa21231\") " pod="openshift-controller-manager/controller-manager-5d9b59775c-5nrm2" Oct 14 13:08:12.262510 master-2 kubenswrapper[4762]: I1014 13:08:12.262476 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvmhw\" (UniqueName: \"kubernetes.io/projected/ecce0f57-69a3-448a-8ed8-55822aa21231-kube-api-access-nvmhw\") pod \"controller-manager-5d9b59775c-5nrm2\" (UID: \"ecce0f57-69a3-448a-8ed8-55822aa21231\") " pod="openshift-controller-manager/controller-manager-5d9b59775c-5nrm2" Oct 14 13:08:12.262743 master-2 kubenswrapper[4762]: I1014 13:08:12.262519 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ecce0f57-69a3-448a-8ed8-55822aa21231-serving-cert\") pod \"controller-manager-5d9b59775c-5nrm2\" (UID: \"ecce0f57-69a3-448a-8ed8-55822aa21231\") " pod="openshift-controller-manager/controller-manager-5d9b59775c-5nrm2" Oct 14 13:08:12.262743 master-2 kubenswrapper[4762]: I1014 13:08:12.262556 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ecce0f57-69a3-448a-8ed8-55822aa21231-proxy-ca-bundles\") pod \"controller-manager-5d9b59775c-5nrm2\" (UID: \"ecce0f57-69a3-448a-8ed8-55822aa21231\") " pod="openshift-controller-manager/controller-manager-5d9b59775c-5nrm2" Oct 14 13:08:12.363337 master-2 kubenswrapper[4762]: I1014 13:08:12.363079 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ecce0f57-69a3-448a-8ed8-55822aa21231-client-ca\") pod \"controller-manager-5d9b59775c-5nrm2\" (UID: \"ecce0f57-69a3-448a-8ed8-55822aa21231\") " pod="openshift-controller-manager/controller-manager-5d9b59775c-5nrm2" Oct 14 13:08:12.363337 master-2 kubenswrapper[4762]: I1014 13:08:12.363128 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecce0f57-69a3-448a-8ed8-55822aa21231-config\") pod \"controller-manager-5d9b59775c-5nrm2\" (UID: \"ecce0f57-69a3-448a-8ed8-55822aa21231\") " pod="openshift-controller-manager/controller-manager-5d9b59775c-5nrm2" Oct 14 13:08:12.363337 master-2 kubenswrapper[4762]: I1014 13:08:12.363160 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvmhw\" (UniqueName: \"kubernetes.io/projected/ecce0f57-69a3-448a-8ed8-55822aa21231-kube-api-access-nvmhw\") pod \"controller-manager-5d9b59775c-5nrm2\" (UID: \"ecce0f57-69a3-448a-8ed8-55822aa21231\") " pod="openshift-controller-manager/controller-manager-5d9b59775c-5nrm2" Oct 14 13:08:12.363337 master-2 kubenswrapper[4762]: I1014 13:08:12.363182 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ecce0f57-69a3-448a-8ed8-55822aa21231-serving-cert\") pod \"controller-manager-5d9b59775c-5nrm2\" (UID: \"ecce0f57-69a3-448a-8ed8-55822aa21231\") " pod="openshift-controller-manager/controller-manager-5d9b59775c-5nrm2" Oct 14 13:08:12.363337 master-2 kubenswrapper[4762]: I1014 13:08:12.363200 4762 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ecce0f57-69a3-448a-8ed8-55822aa21231-proxy-ca-bundles\") pod \"controller-manager-5d9b59775c-5nrm2\" (UID: \"ecce0f57-69a3-448a-8ed8-55822aa21231\") " pod="openshift-controller-manager/controller-manager-5d9b59775c-5nrm2" Oct 14 13:08:12.363337 master-2 kubenswrapper[4762]: E1014 13:08:12.363255 4762 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Oct 14 13:08:12.363337 master-2 kubenswrapper[4762]: E1014 13:08:12.363292 4762 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: configmap "openshift-global-ca" not found Oct 14 13:08:12.363337 master-2 kubenswrapper[4762]: E1014 13:08:12.363314 4762 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: configmap "config" not found Oct 14 13:08:12.363337 master-2 kubenswrapper[4762]: E1014 13:08:12.363356 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ecce0f57-69a3-448a-8ed8-55822aa21231-client-ca podName:ecce0f57-69a3-448a-8ed8-55822aa21231 nodeName:}" failed. No retries permitted until 2025-10-14 13:08:12.863332261 +0000 UTC m=+122.107491431 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/ecce0f57-69a3-448a-8ed8-55822aa21231-client-ca") pod "controller-manager-5d9b59775c-5nrm2" (UID: "ecce0f57-69a3-448a-8ed8-55822aa21231") : configmap "client-ca" not found Oct 14 13:08:12.363799 master-2 kubenswrapper[4762]: E1014 13:08:12.363378 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ecce0f57-69a3-448a-8ed8-55822aa21231-proxy-ca-bundles podName:ecce0f57-69a3-448a-8ed8-55822aa21231 nodeName:}" failed. No retries permitted until 2025-10-14 13:08:12.863370333 +0000 UTC m=+122.107529502 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/ecce0f57-69a3-448a-8ed8-55822aa21231-proxy-ca-bundles") pod "controller-manager-5d9b59775c-5nrm2" (UID: "ecce0f57-69a3-448a-8ed8-55822aa21231") : configmap "openshift-global-ca" not found Oct 14 13:08:12.363799 master-2 kubenswrapper[4762]: E1014 13:08:12.363418 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ecce0f57-69a3-448a-8ed8-55822aa21231-config podName:ecce0f57-69a3-448a-8ed8-55822aa21231 nodeName:}" failed. No retries permitted until 2025-10-14 13:08:12.863389033 +0000 UTC m=+122.107548232 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/ecce0f57-69a3-448a-8ed8-55822aa21231-config") pod "controller-manager-5d9b59775c-5nrm2" (UID: "ecce0f57-69a3-448a-8ed8-55822aa21231") : configmap "config" not found Oct 14 13:08:12.363799 master-2 kubenswrapper[4762]: E1014 13:08:12.363433 4762 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Oct 14 13:08:12.363799 master-2 kubenswrapper[4762]: E1014 13:08:12.363563 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ecce0f57-69a3-448a-8ed8-55822aa21231-serving-cert podName:ecce0f57-69a3-448a-8ed8-55822aa21231 nodeName:}" failed. No retries permitted until 2025-10-14 13:08:12.863533648 +0000 UTC m=+122.107692887 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/ecce0f57-69a3-448a-8ed8-55822aa21231-serving-cert") pod "controller-manager-5d9b59775c-5nrm2" (UID: "ecce0f57-69a3-448a-8ed8-55822aa21231") : secret "serving-cert" not found Oct 14 13:08:12.400220 master-2 kubenswrapper[4762]: I1014 13:08:12.400118 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvmhw\" (UniqueName: \"kubernetes.io/projected/ecce0f57-69a3-448a-8ed8-55822aa21231-kube-api-access-nvmhw\") pod \"controller-manager-5d9b59775c-5nrm2\" (UID: \"ecce0f57-69a3-448a-8ed8-55822aa21231\") " pod="openshift-controller-manager/controller-manager-5d9b59775c-5nrm2" Oct 14 13:08:12.867823 master-2 kubenswrapper[4762]: I1014 13:08:12.867378 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ecce0f57-69a3-448a-8ed8-55822aa21231-client-ca\") pod \"controller-manager-5d9b59775c-5nrm2\" (UID: \"ecce0f57-69a3-448a-8ed8-55822aa21231\") " pod="openshift-controller-manager/controller-manager-5d9b59775c-5nrm2" Oct 14 13:08:12.867823 master-2 kubenswrapper[4762]: I1014 13:08:12.867436 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecce0f57-69a3-448a-8ed8-55822aa21231-config\") pod \"controller-manager-5d9b59775c-5nrm2\" (UID: \"ecce0f57-69a3-448a-8ed8-55822aa21231\") " pod="openshift-controller-manager/controller-manager-5d9b59775c-5nrm2" Oct 14 13:08:12.867823 master-2 kubenswrapper[4762]: I1014 13:08:12.867466 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ecce0f57-69a3-448a-8ed8-55822aa21231-serving-cert\") pod \"controller-manager-5d9b59775c-5nrm2\" (UID: \"ecce0f57-69a3-448a-8ed8-55822aa21231\") " pod="openshift-controller-manager/controller-manager-5d9b59775c-5nrm2" Oct 14 13:08:12.867823 master-2 kubenswrapper[4762]: I1014 13:08:12.867483 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ecce0f57-69a3-448a-8ed8-55822aa21231-proxy-ca-bundles\") pod \"controller-manager-5d9b59775c-5nrm2\" (UID: \"ecce0f57-69a3-448a-8ed8-55822aa21231\") " pod="openshift-controller-manager/controller-manager-5d9b59775c-5nrm2" Oct 14 13:08:12.867823 master-2 kubenswrapper[4762]: E1014 13:08:12.867582 4762 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: configmap "openshift-global-ca" not found Oct 14 13:08:12.867823 master-2 kubenswrapper[4762]: E1014 13:08:12.867624 4762 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Oct 14 13:08:12.867823 master-2 kubenswrapper[4762]: E1014 13:08:12.867751 4762 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Oct 14 13:08:12.867823 master-2 kubenswrapper[4762]: E1014 13:08:12.867638 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ecce0f57-69a3-448a-8ed8-55822aa21231-proxy-ca-bundles podName:ecce0f57-69a3-448a-8ed8-55822aa21231 nodeName:}" failed. No retries permitted until 2025-10-14 13:08:13.867619549 +0000 UTC m=+123.111778708 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/ecce0f57-69a3-448a-8ed8-55822aa21231-proxy-ca-bundles") pod "controller-manager-5d9b59775c-5nrm2" (UID: "ecce0f57-69a3-448a-8ed8-55822aa21231") : configmap "openshift-global-ca" not found Oct 14 13:08:12.867823 master-2 kubenswrapper[4762]: E1014 13:08:12.867868 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ecce0f57-69a3-448a-8ed8-55822aa21231-client-ca podName:ecce0f57-69a3-448a-8ed8-55822aa21231 nodeName:}" failed. No retries permitted until 2025-10-14 13:08:13.867846186 +0000 UTC m=+123.112005345 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/ecce0f57-69a3-448a-8ed8-55822aa21231-client-ca") pod "controller-manager-5d9b59775c-5nrm2" (UID: "ecce0f57-69a3-448a-8ed8-55822aa21231") : configmap "client-ca" not found Oct 14 13:08:12.869148 master-2 kubenswrapper[4762]: E1014 13:08:12.867883 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ecce0f57-69a3-448a-8ed8-55822aa21231-serving-cert podName:ecce0f57-69a3-448a-8ed8-55822aa21231 nodeName:}" failed. No retries permitted until 2025-10-14 13:08:13.867875417 +0000 UTC m=+123.112034686 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/ecce0f57-69a3-448a-8ed8-55822aa21231-serving-cert") pod "controller-manager-5d9b59775c-5nrm2" (UID: "ecce0f57-69a3-448a-8ed8-55822aa21231") : secret "serving-cert" not found Oct 14 13:08:12.869148 master-2 kubenswrapper[4762]: E1014 13:08:12.867662 4762 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: configmap "config" not found Oct 14 13:08:12.869148 master-2 kubenswrapper[4762]: E1014 13:08:12.867915 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ecce0f57-69a3-448a-8ed8-55822aa21231-config podName:ecce0f57-69a3-448a-8ed8-55822aa21231 nodeName:}" failed. No retries permitted until 2025-10-14 13:08:13.867908518 +0000 UTC m=+123.112067757 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/ecce0f57-69a3-448a-8ed8-55822aa21231-config") pod "controller-manager-5d9b59775c-5nrm2" (UID: "ecce0f57-69a3-448a-8ed8-55822aa21231") : configmap "config" not found Oct 14 13:08:12.992612 master-2 kubenswrapper[4762]: I1014 13:08:12.992528 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5d9b59775c-5nrm2"] Oct 14 13:08:12.992820 master-2 kubenswrapper[4762]: E1014 13:08:12.992712 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca config proxy-ca-bundles serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-controller-manager/controller-manager-5d9b59775c-5nrm2" podUID="ecce0f57-69a3-448a-8ed8-55822aa21231" Oct 14 13:08:13.005290 master-2 kubenswrapper[4762]: I1014 13:08:13.005246 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f89f9db8c-dx7pm"] Oct 14 13:08:13.005600 master-2 kubenswrapper[4762]: I1014 13:08:13.005564 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f89f9db8c-dx7pm" Oct 14 13:08:13.008262 master-2 kubenswrapper[4762]: I1014 13:08:13.008232 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 14 13:08:13.008702 master-2 kubenswrapper[4762]: I1014 13:08:13.008677 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 14 13:08:13.008702 master-2 kubenswrapper[4762]: I1014 13:08:13.008689 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 14 13:08:13.008881 master-2 kubenswrapper[4762]: I1014 13:08:13.008706 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 14 13:08:13.008881 master-2 kubenswrapper[4762]: I1014 13:08:13.008801 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 14 13:08:13.019245 master-2 kubenswrapper[4762]: I1014 13:08:13.019117 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f89f9db8c-dx7pm"] Oct 14 13:08:13.066535 master-2 kubenswrapper[4762]: I1014 13:08:13.066497 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5d9b59775c-5nrm2" Oct 14 13:08:13.069594 master-2 kubenswrapper[4762]: I1014 13:08:13.069544 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c2148c9-9f2d-420c-9a2e-6433f4885d5d-config\") pod \"route-controller-manager-7f89f9db8c-dx7pm\" (UID: \"2c2148c9-9f2d-420c-9a2e-6433f4885d5d\") " pod="openshift-route-controller-manager/route-controller-manager-7f89f9db8c-dx7pm" Oct 14 13:08:13.074380 master-2 kubenswrapper[4762]: I1014 13:08:13.074208 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5d9b59775c-5nrm2" Oct 14 13:08:13.161855 master-2 kubenswrapper[4762]: I1014 13:08:13.160536 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-64446499c7-ghfpb"] Oct 14 13:08:13.161855 master-2 kubenswrapper[4762]: I1014 13:08:13.161012 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-64446499c7-ghfpb" Oct 14 13:08:13.163239 master-2 kubenswrapper[4762]: I1014 13:08:13.162886 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Oct 14 13:08:13.163239 master-2 kubenswrapper[4762]: I1014 13:08:13.163021 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Oct 14 13:08:13.164255 master-2 kubenswrapper[4762]: I1014 13:08:13.163946 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Oct 14 13:08:13.164255 master-2 kubenswrapper[4762]: I1014 13:08:13.164037 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Oct 14 13:08:13.169250 master-2 kubenswrapper[4762]: I1014 13:08:13.169202 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-64446499c7-ghfpb"] Oct 14 13:08:13.171057 master-2 kubenswrapper[4762]: I1014 13:08:13.169888 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvntw\" (UniqueName: \"kubernetes.io/projected/2c2148c9-9f2d-420c-9a2e-6433f4885d5d-kube-api-access-wvntw\") pod \"route-controller-manager-7f89f9db8c-dx7pm\" (UID: \"2c2148c9-9f2d-420c-9a2e-6433f4885d5d\") " pod="openshift-route-controller-manager/route-controller-manager-7f89f9db8c-dx7pm" Oct 14 13:08:13.171057 master-2 kubenswrapper[4762]: I1014 13:08:13.169908 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c2148c9-9f2d-420c-9a2e-6433f4885d5d-client-ca\") pod \"route-controller-manager-7f89f9db8c-dx7pm\" (UID: \"2c2148c9-9f2d-420c-9a2e-6433f4885d5d\") " pod="openshift-route-controller-manager/route-controller-manager-7f89f9db8c-dx7pm" Oct 14 13:08:13.171057 master-2 kubenswrapper[4762]: I1014 13:08:13.169926 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c2148c9-9f2d-420c-9a2e-6433f4885d5d-serving-cert\") pod \"route-controller-manager-7f89f9db8c-dx7pm\" (UID: \"2c2148c9-9f2d-420c-9a2e-6433f4885d5d\") " pod="openshift-route-controller-manager/route-controller-manager-7f89f9db8c-dx7pm" Oct 14 13:08:13.171057 master-2 kubenswrapper[4762]: I1014 13:08:13.169976 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c2148c9-9f2d-420c-9a2e-6433f4885d5d-config\") pod \"route-controller-manager-7f89f9db8c-dx7pm\" (UID: \"2c2148c9-9f2d-420c-9a2e-6433f4885d5d\") " pod="openshift-route-controller-manager/route-controller-manager-7f89f9db8c-dx7pm" Oct 14 13:08:13.171057 master-2 kubenswrapper[4762]: I1014 13:08:13.170733 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c2148c9-9f2d-420c-9a2e-6433f4885d5d-config\") pod \"route-controller-manager-7f89f9db8c-dx7pm\" (UID: \"2c2148c9-9f2d-420c-9a2e-6433f4885d5d\") " pod="openshift-route-controller-manager/route-controller-manager-7f89f9db8c-dx7pm" Oct 14 13:08:13.270353 master-2 kubenswrapper[4762]: I1014 13:08:13.270313 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvmhw\" (UniqueName: 
\"kubernetes.io/projected/ecce0f57-69a3-448a-8ed8-55822aa21231-kube-api-access-nvmhw\") pod \"ecce0f57-69a3-448a-8ed8-55822aa21231\" (UID: \"ecce0f57-69a3-448a-8ed8-55822aa21231\") " Oct 14 13:08:13.270938 master-2 kubenswrapper[4762]: I1014 13:08:13.270475 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a201bca6-8e62-4695-9164-1065fa1108a2-signing-key\") pod \"service-ca-64446499c7-ghfpb\" (UID: \"a201bca6-8e62-4695-9164-1065fa1108a2\") " pod="openshift-service-ca/service-ca-64446499c7-ghfpb" Oct 14 13:08:13.270938 master-2 kubenswrapper[4762]: I1014 13:08:13.270531 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvntw\" (UniqueName: \"kubernetes.io/projected/2c2148c9-9f2d-420c-9a2e-6433f4885d5d-kube-api-access-wvntw\") pod \"route-controller-manager-7f89f9db8c-dx7pm\" (UID: \"2c2148c9-9f2d-420c-9a2e-6433f4885d5d\") " pod="openshift-route-controller-manager/route-controller-manager-7f89f9db8c-dx7pm" Oct 14 13:08:13.270938 master-2 kubenswrapper[4762]: I1014 13:08:13.270556 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c2148c9-9f2d-420c-9a2e-6433f4885d5d-client-ca\") pod \"route-controller-manager-7f89f9db8c-dx7pm\" (UID: \"2c2148c9-9f2d-420c-9a2e-6433f4885d5d\") " pod="openshift-route-controller-manager/route-controller-manager-7f89f9db8c-dx7pm" Oct 14 13:08:13.270938 master-2 kubenswrapper[4762]: I1014 13:08:13.270662 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c2148c9-9f2d-420c-9a2e-6433f4885d5d-serving-cert\") pod \"route-controller-manager-7f89f9db8c-dx7pm\" (UID: \"2c2148c9-9f2d-420c-9a2e-6433f4885d5d\") " pod="openshift-route-controller-manager/route-controller-manager-7f89f9db8c-dx7pm" Oct 14 13:08:13.270938 master-2 kubenswrapper[4762]: I1014 13:08:13.270751 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dqwq\" (UniqueName: \"kubernetes.io/projected/a201bca6-8e62-4695-9164-1065fa1108a2-kube-api-access-5dqwq\") pod \"service-ca-64446499c7-ghfpb\" (UID: \"a201bca6-8e62-4695-9164-1065fa1108a2\") " pod="openshift-service-ca/service-ca-64446499c7-ghfpb" Oct 14 13:08:13.270938 master-2 kubenswrapper[4762]: I1014 13:08:13.270790 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a201bca6-8e62-4695-9164-1065fa1108a2-signing-cabundle\") pod \"service-ca-64446499c7-ghfpb\" (UID: \"a201bca6-8e62-4695-9164-1065fa1108a2\") " pod="openshift-service-ca/service-ca-64446499c7-ghfpb" Oct 14 13:08:13.271263 master-2 kubenswrapper[4762]: E1014 13:08:13.270951 4762 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Oct 14 13:08:13.271263 master-2 kubenswrapper[4762]: E1014 13:08:13.271048 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2c2148c9-9f2d-420c-9a2e-6433f4885d5d-client-ca podName:2c2148c9-9f2d-420c-9a2e-6433f4885d5d nodeName:}" failed. No retries permitted until 2025-10-14 13:08:13.771022031 +0000 UTC m=+123.015181190 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/2c2148c9-9f2d-420c-9a2e-6433f4885d5d-client-ca") pod "route-controller-manager-7f89f9db8c-dx7pm" (UID: "2c2148c9-9f2d-420c-9a2e-6433f4885d5d") : configmap "client-ca" not found Oct 14 13:08:13.271263 master-2 kubenswrapper[4762]: E1014 13:08:13.271192 4762 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Oct 14 13:08:13.271263 master-2 kubenswrapper[4762]: E1014 13:08:13.271229 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c2148c9-9f2d-420c-9a2e-6433f4885d5d-serving-cert podName:2c2148c9-9f2d-420c-9a2e-6433f4885d5d nodeName:}" failed. No retries permitted until 2025-10-14 13:08:13.771216548 +0000 UTC m=+123.015375917 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/2c2148c9-9f2d-420c-9a2e-6433f4885d5d-serving-cert") pod "route-controller-manager-7f89f9db8c-dx7pm" (UID: "2c2148c9-9f2d-420c-9a2e-6433f4885d5d") : secret "serving-cert" not found Oct 14 13:08:13.274543 master-2 kubenswrapper[4762]: I1014 13:08:13.274485 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecce0f57-69a3-448a-8ed8-55822aa21231-kube-api-access-nvmhw" (OuterVolumeSpecName: "kube-api-access-nvmhw") pod "ecce0f57-69a3-448a-8ed8-55822aa21231" (UID: "ecce0f57-69a3-448a-8ed8-55822aa21231"). InnerVolumeSpecName "kube-api-access-nvmhw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:08:13.294429 master-2 kubenswrapper[4762]: I1014 13:08:13.294080 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvntw\" (UniqueName: \"kubernetes.io/projected/2c2148c9-9f2d-420c-9a2e-6433f4885d5d-kube-api-access-wvntw\") pod \"route-controller-manager-7f89f9db8c-dx7pm\" (UID: \"2c2148c9-9f2d-420c-9a2e-6433f4885d5d\") " pod="openshift-route-controller-manager/route-controller-manager-7f89f9db8c-dx7pm" Oct 14 13:08:13.400070 master-2 kubenswrapper[4762]: I1014 13:08:13.399418 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a201bca6-8e62-4695-9164-1065fa1108a2-signing-key\") pod \"service-ca-64446499c7-ghfpb\" (UID: \"a201bca6-8e62-4695-9164-1065fa1108a2\") " pod="openshift-service-ca/service-ca-64446499c7-ghfpb" Oct 14 13:08:13.400070 master-2 kubenswrapper[4762]: I1014 13:08:13.399494 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dqwq\" (UniqueName: \"kubernetes.io/projected/a201bca6-8e62-4695-9164-1065fa1108a2-kube-api-access-5dqwq\") pod \"service-ca-64446499c7-ghfpb\" (UID: \"a201bca6-8e62-4695-9164-1065fa1108a2\") " pod="openshift-service-ca/service-ca-64446499c7-ghfpb" Oct 14 13:08:13.400070 master-2 kubenswrapper[4762]: I1014 13:08:13.399559 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a201bca6-8e62-4695-9164-1065fa1108a2-signing-cabundle\") pod \"service-ca-64446499c7-ghfpb\" (UID: \"a201bca6-8e62-4695-9164-1065fa1108a2\") " pod="openshift-service-ca/service-ca-64446499c7-ghfpb" Oct 14 13:08:13.400070 master-2 kubenswrapper[4762]: I1014 13:08:13.399604 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvmhw\" (UniqueName: 
\"kubernetes.io/projected/ecce0f57-69a3-448a-8ed8-55822aa21231-kube-api-access-nvmhw\") on node \"master-2\" DevicePath \"\"" Oct 14 13:08:13.400562 master-2 kubenswrapper[4762]: I1014 13:08:13.400535 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a201bca6-8e62-4695-9164-1065fa1108a2-signing-cabundle\") pod \"service-ca-64446499c7-ghfpb\" (UID: \"a201bca6-8e62-4695-9164-1065fa1108a2\") " pod="openshift-service-ca/service-ca-64446499c7-ghfpb" Oct 14 13:08:13.405549 master-2 kubenswrapper[4762]: I1014 13:08:13.405513 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a201bca6-8e62-4695-9164-1065fa1108a2-signing-key\") pod \"service-ca-64446499c7-ghfpb\" (UID: \"a201bca6-8e62-4695-9164-1065fa1108a2\") " pod="openshift-service-ca/service-ca-64446499c7-ghfpb" Oct 14 13:08:13.427473 master-2 kubenswrapper[4762]: I1014 13:08:13.427363 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dqwq\" (UniqueName: \"kubernetes.io/projected/a201bca6-8e62-4695-9164-1065fa1108a2-kube-api-access-5dqwq\") pod \"service-ca-64446499c7-ghfpb\" (UID: \"a201bca6-8e62-4695-9164-1065fa1108a2\") " pod="openshift-service-ca/service-ca-64446499c7-ghfpb" Oct 14 13:08:13.481232 master-2 kubenswrapper[4762]: I1014 13:08:13.481121 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-64446499c7-ghfpb" Oct 14 13:08:13.669139 master-2 kubenswrapper[4762]: I1014 13:08:13.669105 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-64446499c7-ghfpb"] Oct 14 13:08:13.677215 master-2 kubenswrapper[4762]: W1014 13:08:13.677187 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda201bca6_8e62_4695_9164_1065fa1108a2.slice/crio-d25ca59d4f139e8055e21e9142ca332f1f88056b82dabb37ec8fc814ed397a85 WatchSource:0}: Error finding container d25ca59d4f139e8055e21e9142ca332f1f88056b82dabb37ec8fc814ed397a85: Status 404 returned error can't find the container with id d25ca59d4f139e8055e21e9142ca332f1f88056b82dabb37ec8fc814ed397a85 Oct 14 13:08:13.803603 master-2 kubenswrapper[4762]: I1014 13:08:13.803548 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c5e8cdcd-bc1f-4b38-a834-809b79de4fd9-metrics-certs\") pod \"network-metrics-daemon-b84p7\" (UID: \"c5e8cdcd-bc1f-4b38-a834-809b79de4fd9\") " pod="openshift-multus/network-metrics-daemon-b84p7" Oct 14 13:08:13.803733 master-2 kubenswrapper[4762]: I1014 13:08:13.803623 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c2148c9-9f2d-420c-9a2e-6433f4885d5d-client-ca\") pod \"route-controller-manager-7f89f9db8c-dx7pm\" (UID: \"2c2148c9-9f2d-420c-9a2e-6433f4885d5d\") " pod="openshift-route-controller-manager/route-controller-manager-7f89f9db8c-dx7pm" Oct 14 13:08:13.803733 master-2 kubenswrapper[4762]: I1014 13:08:13.803659 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c2148c9-9f2d-420c-9a2e-6433f4885d5d-serving-cert\") pod \"route-controller-manager-7f89f9db8c-dx7pm\" (UID: \"2c2148c9-9f2d-420c-9a2e-6433f4885d5d\") " 
pod="openshift-route-controller-manager/route-controller-manager-7f89f9db8c-dx7pm" Oct 14 13:08:13.803861 master-2 kubenswrapper[4762]: E1014 13:08:13.803802 4762 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Oct 14 13:08:13.803937 master-2 kubenswrapper[4762]: E1014 13:08:13.803828 4762 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Oct 14 13:08:13.803937 master-2 kubenswrapper[4762]: E1014 13:08:13.803934 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2c2148c9-9f2d-420c-9a2e-6433f4885d5d-client-ca podName:2c2148c9-9f2d-420c-9a2e-6433f4885d5d nodeName:}" failed. No retries permitted until 2025-10-14 13:08:14.80390755 +0000 UTC m=+124.048066739 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/2c2148c9-9f2d-420c-9a2e-6433f4885d5d-client-ca") pod "route-controller-manager-7f89f9db8c-dx7pm" (UID: "2c2148c9-9f2d-420c-9a2e-6433f4885d5d") : configmap "client-ca" not found Oct 14 13:08:13.804031 master-2 kubenswrapper[4762]: E1014 13:08:13.803992 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c2148c9-9f2d-420c-9a2e-6433f4885d5d-serving-cert podName:2c2148c9-9f2d-420c-9a2e-6433f4885d5d nodeName:}" failed. No retries permitted until 2025-10-14 13:08:14.803967182 +0000 UTC m=+124.048126381 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/2c2148c9-9f2d-420c-9a2e-6433f4885d5d-serving-cert") pod "route-controller-manager-7f89f9db8c-dx7pm" (UID: "2c2148c9-9f2d-420c-9a2e-6433f4885d5d") : secret "serving-cert" not found Oct 14 13:08:13.807093 master-2 kubenswrapper[4762]: I1014 13:08:13.807050 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 14 13:08:13.814693 master-2 kubenswrapper[4762]: E1014 13:08:13.814634 4762 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Oct 14 13:08:13.814781 master-2 kubenswrapper[4762]: E1014 13:08:13.814716 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c5e8cdcd-bc1f-4b38-a834-809b79de4fd9-metrics-certs podName:c5e8cdcd-bc1f-4b38-a834-809b79de4fd9 nodeName:}" failed. No retries permitted until 2025-10-14 13:09:17.814689423 +0000 UTC m=+187.058848592 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c5e8cdcd-bc1f-4b38-a834-809b79de4fd9-metrics-certs") pod "network-metrics-daemon-b84p7" (UID: "c5e8cdcd-bc1f-4b38-a834-809b79de4fd9") : secret "metrics-daemon-secret" not found Oct 14 13:08:13.904563 master-2 kubenswrapper[4762]: I1014 13:08:13.904492 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ecce0f57-69a3-448a-8ed8-55822aa21231-client-ca\") pod \"controller-manager-5d9b59775c-5nrm2\" (UID: \"ecce0f57-69a3-448a-8ed8-55822aa21231\") " pod="openshift-controller-manager/controller-manager-5d9b59775c-5nrm2" Oct 14 13:08:13.905321 master-2 kubenswrapper[4762]: I1014 13:08:13.904567 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecce0f57-69a3-448a-8ed8-55822aa21231-config\") pod \"controller-manager-5d9b59775c-5nrm2\" (UID: \"ecce0f57-69a3-448a-8ed8-55822aa21231\") " pod="openshift-controller-manager/controller-manager-5d9b59775c-5nrm2" Oct 14 13:08:13.905321 master-2 kubenswrapper[4762]: I1014 13:08:13.904617 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ecce0f57-69a3-448a-8ed8-55822aa21231-serving-cert\") pod \"controller-manager-5d9b59775c-5nrm2\" (UID: \"ecce0f57-69a3-448a-8ed8-55822aa21231\") " pod="openshift-controller-manager/controller-manager-5d9b59775c-5nrm2" Oct 14 13:08:13.905321 master-2 kubenswrapper[4762]: I1014 13:08:13.904651 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ecce0f57-69a3-448a-8ed8-55822aa21231-proxy-ca-bundles\") pod \"controller-manager-5d9b59775c-5nrm2\" (UID: \"ecce0f57-69a3-448a-8ed8-55822aa21231\") " pod="openshift-controller-manager/controller-manager-5d9b59775c-5nrm2" Oct 14 13:08:13.905321 master-2 kubenswrapper[4762]: E1014 13:08:13.904716 4762 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Oct 14 13:08:13.905321 master-2 kubenswrapper[4762]: E1014 13:08:13.904826 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ecce0f57-69a3-448a-8ed8-55822aa21231-client-ca podName:ecce0f57-69a3-448a-8ed8-55822aa21231 nodeName:}" failed. No retries permitted until 2025-10-14 13:08:15.904798325 +0000 UTC m=+125.148957524 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/ecce0f57-69a3-448a-8ed8-55822aa21231-client-ca") pod "controller-manager-5d9b59775c-5nrm2" (UID: "ecce0f57-69a3-448a-8ed8-55822aa21231") : configmap "client-ca" not found Oct 14 13:08:13.905721 master-2 kubenswrapper[4762]: E1014 13:08:13.905623 4762 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Oct 14 13:08:13.905863 master-2 kubenswrapper[4762]: E1014 13:08:13.905834 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ecce0f57-69a3-448a-8ed8-55822aa21231-serving-cert podName:ecce0f57-69a3-448a-8ed8-55822aa21231 nodeName:}" failed. No retries permitted until 2025-10-14 13:08:15.905795826 +0000 UTC m=+125.149955015 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/ecce0f57-69a3-448a-8ed8-55822aa21231-serving-cert") pod "controller-manager-5d9b59775c-5nrm2" (UID: "ecce0f57-69a3-448a-8ed8-55822aa21231") : secret "serving-cert" not found Oct 14 13:08:13.906239 master-2 kubenswrapper[4762]: I1014 13:08:13.906195 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecce0f57-69a3-448a-8ed8-55822aa21231-config\") pod \"controller-manager-5d9b59775c-5nrm2\" (UID: \"ecce0f57-69a3-448a-8ed8-55822aa21231\") " pod="openshift-controller-manager/controller-manager-5d9b59775c-5nrm2" Oct 14 13:08:13.906479 master-2 kubenswrapper[4762]: I1014 13:08:13.906435 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ecce0f57-69a3-448a-8ed8-55822aa21231-proxy-ca-bundles\") pod \"controller-manager-5d9b59775c-5nrm2\" (UID: \"ecce0f57-69a3-448a-8ed8-55822aa21231\") " pod="openshift-controller-manager/controller-manager-5d9b59775c-5nrm2" Oct 14 13:08:14.005864 master-2 kubenswrapper[4762]: I1014 13:08:14.005760 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecce0f57-69a3-448a-8ed8-55822aa21231-config\") pod \"ecce0f57-69a3-448a-8ed8-55822aa21231\" (UID: \"ecce0f57-69a3-448a-8ed8-55822aa21231\") " Oct 14 13:08:14.006087 master-2 kubenswrapper[4762]: I1014 13:08:14.005887 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ecce0f57-69a3-448a-8ed8-55822aa21231-proxy-ca-bundles\") pod \"ecce0f57-69a3-448a-8ed8-55822aa21231\" (UID: \"ecce0f57-69a3-448a-8ed8-55822aa21231\") " Oct 14 13:08:14.006703 master-2 kubenswrapper[4762]: I1014 13:08:14.006627 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecce0f57-69a3-448a-8ed8-55822aa21231-config" (OuterVolumeSpecName: "config") pod "ecce0f57-69a3-448a-8ed8-55822aa21231" (UID: "ecce0f57-69a3-448a-8ed8-55822aa21231"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:08:14.007032 master-2 kubenswrapper[4762]: I1014 13:08:14.006964 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecce0f57-69a3-448a-8ed8-55822aa21231-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ecce0f57-69a3-448a-8ed8-55822aa21231" (UID: "ecce0f57-69a3-448a-8ed8-55822aa21231"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:08:14.072063 master-2 kubenswrapper[4762]: I1014 13:08:14.071995 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-ddd7d64cd-hph6v" event={"ID":"0500c75a-3460-4279-a8d8-cebf242e6089","Type":"ContainerStarted","Data":"f1826adc7eb4b42b9fe04d89bcd7c131c240322cb4e3c10c8dc7650224a29fba"} Oct 14 13:08:14.073423 master-2 kubenswrapper[4762]: I1014 13:08:14.073370 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-64446499c7-ghfpb" event={"ID":"a201bca6-8e62-4695-9164-1065fa1108a2","Type":"ContainerStarted","Data":"d25ca59d4f139e8055e21e9142ca332f1f88056b82dabb37ec8fc814ed397a85"} Oct 14 13:08:14.073509 master-2 kubenswrapper[4762]: I1014 13:08:14.073421 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5d9b59775c-5nrm2" Oct 14 13:08:14.086173 master-2 kubenswrapper[4762]: I1014 13:08:14.086080 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/csi-snapshot-controller-ddd7d64cd-hph6v" podStartSLOduration=1.617733958 podStartE2EDuration="3.086057069s" podCreationTimestamp="2025-10-14 13:08:11 +0000 UTC" firstStartedPulling="2025-10-14 13:08:11.695857936 +0000 UTC m=+120.940017095" lastFinishedPulling="2025-10-14 13:08:13.164181047 +0000 UTC m=+122.408340206" observedRunningTime="2025-10-14 13:08:14.083798638 +0000 UTC m=+123.327957877" watchObservedRunningTime="2025-10-14 13:08:14.086057069 +0000 UTC m=+123.330216268" Oct 14 13:08:14.105298 master-2 kubenswrapper[4762]: I1014 13:08:14.105119 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6687f866cc-2f4dq"] Oct 14 13:08:14.106794 master-2 kubenswrapper[4762]: I1014 13:08:14.105927 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6687f866cc-2f4dq" Oct 14 13:08:14.106794 master-2 kubenswrapper[4762]: I1014 13:08:14.106667 4762 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ecce0f57-69a3-448a-8ed8-55822aa21231-proxy-ca-bundles\") on node \"master-2\" DevicePath \"\"" Oct 14 13:08:14.106794 master-2 kubenswrapper[4762]: I1014 13:08:14.106713 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecce0f57-69a3-448a-8ed8-55822aa21231-config\") on node \"master-2\" DevicePath \"\"" Oct 14 13:08:14.109207 master-2 kubenswrapper[4762]: I1014 13:08:14.109131 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 14 13:08:14.109411 master-2 kubenswrapper[4762]: I1014 13:08:14.109383 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 14 13:08:14.109534 master-2 kubenswrapper[4762]: I1014 13:08:14.109385 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 14 13:08:14.110036 master-2 kubenswrapper[4762]: I1014 13:08:14.110006 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 14 13:08:14.110430 master-2 kubenswrapper[4762]: I1014 13:08:14.110396 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5d9b59775c-5nrm2"] Oct 14 13:08:14.112450 master-2 kubenswrapper[4762]: I1014 13:08:14.112414 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 14 13:08:14.113629 master-2 kubenswrapper[4762]: I1014 13:08:14.113580 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5d9b59775c-5nrm2"] Oct 14 13:08:14.123554 master-2 kubenswrapper[4762]: I1014 13:08:14.123430 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6687f866cc-2f4dq"] Oct 14 13:08:14.133926 master-2 kubenswrapper[4762]: I1014 13:08:14.133863 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 14 13:08:14.207371 master-2 
kubenswrapper[4762]: I1014 13:08:14.207316 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwqlf\" (UniqueName: \"kubernetes.io/projected/fbaa896c-d9b2-45a7-977d-586369aff053-kube-api-access-vwqlf\") pod \"controller-manager-6687f866cc-2f4dq\" (UID: \"fbaa896c-d9b2-45a7-977d-586369aff053\") " pod="openshift-controller-manager/controller-manager-6687f866cc-2f4dq" Oct 14 13:08:14.207557 master-2 kubenswrapper[4762]: I1014 13:08:14.207427 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fbaa896c-d9b2-45a7-977d-586369aff053-serving-cert\") pod \"controller-manager-6687f866cc-2f4dq\" (UID: \"fbaa896c-d9b2-45a7-977d-586369aff053\") " pod="openshift-controller-manager/controller-manager-6687f866cc-2f4dq" Oct 14 13:08:14.207557 master-2 kubenswrapper[4762]: I1014 13:08:14.207472 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fbaa896c-d9b2-45a7-977d-586369aff053-proxy-ca-bundles\") pod \"controller-manager-6687f866cc-2f4dq\" (UID: \"fbaa896c-d9b2-45a7-977d-586369aff053\") " pod="openshift-controller-manager/controller-manager-6687f866cc-2f4dq" Oct 14 13:08:14.207557 master-2 kubenswrapper[4762]: I1014 13:08:14.207522 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbaa896c-d9b2-45a7-977d-586369aff053-config\") pod \"controller-manager-6687f866cc-2f4dq\" (UID: \"fbaa896c-d9b2-45a7-977d-586369aff053\") " pod="openshift-controller-manager/controller-manager-6687f866cc-2f4dq" Oct 14 13:08:14.207557 master-2 kubenswrapper[4762]: I1014 13:08:14.207544 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fbaa896c-d9b2-45a7-977d-586369aff053-client-ca\") pod \"controller-manager-6687f866cc-2f4dq\" (UID: \"fbaa896c-d9b2-45a7-977d-586369aff053\") " pod="openshift-controller-manager/controller-manager-6687f866cc-2f4dq" Oct 14 13:08:14.207726 master-2 kubenswrapper[4762]: I1014 13:08:14.207574 4762 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ecce0f57-69a3-448a-8ed8-55822aa21231-serving-cert\") on node \"master-2\" DevicePath \"\"" Oct 14 13:08:14.207726 master-2 kubenswrapper[4762]: I1014 13:08:14.207587 4762 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ecce0f57-69a3-448a-8ed8-55822aa21231-client-ca\") on node \"master-2\" DevicePath \"\"" Oct 14 13:08:14.309035 master-2 kubenswrapper[4762]: I1014 13:08:14.308892 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwqlf\" (UniqueName: \"kubernetes.io/projected/fbaa896c-d9b2-45a7-977d-586369aff053-kube-api-access-vwqlf\") pod \"controller-manager-6687f866cc-2f4dq\" (UID: \"fbaa896c-d9b2-45a7-977d-586369aff053\") " pod="openshift-controller-manager/controller-manager-6687f866cc-2f4dq" Oct 14 13:08:14.309035 master-2 kubenswrapper[4762]: I1014 13:08:14.309012 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fbaa896c-d9b2-45a7-977d-586369aff053-serving-cert\") pod \"controller-manager-6687f866cc-2f4dq\" (UID: 
\"fbaa896c-d9b2-45a7-977d-586369aff053\") " pod="openshift-controller-manager/controller-manager-6687f866cc-2f4dq" Oct 14 13:08:14.309035 master-2 kubenswrapper[4762]: I1014 13:08:14.309038 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fbaa896c-d9b2-45a7-977d-586369aff053-proxy-ca-bundles\") pod \"controller-manager-6687f866cc-2f4dq\" (UID: \"fbaa896c-d9b2-45a7-977d-586369aff053\") " pod="openshift-controller-manager/controller-manager-6687f866cc-2f4dq" Oct 14 13:08:14.309337 master-2 kubenswrapper[4762]: I1014 13:08:14.309092 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbaa896c-d9b2-45a7-977d-586369aff053-config\") pod \"controller-manager-6687f866cc-2f4dq\" (UID: \"fbaa896c-d9b2-45a7-977d-586369aff053\") " pod="openshift-controller-manager/controller-manager-6687f866cc-2f4dq" Oct 14 13:08:14.309337 master-2 kubenswrapper[4762]: I1014 13:08:14.309116 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fbaa896c-d9b2-45a7-977d-586369aff053-client-ca\") pod \"controller-manager-6687f866cc-2f4dq\" (UID: \"fbaa896c-d9b2-45a7-977d-586369aff053\") " pod="openshift-controller-manager/controller-manager-6687f866cc-2f4dq" Oct 14 13:08:14.309337 master-2 kubenswrapper[4762]: E1014 13:08:14.309234 4762 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Oct 14 13:08:14.309337 master-2 kubenswrapper[4762]: E1014 13:08:14.309248 4762 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Oct 14 13:08:14.309337 master-2 kubenswrapper[4762]: E1014 13:08:14.309292 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fbaa896c-d9b2-45a7-977d-586369aff053-client-ca podName:fbaa896c-d9b2-45a7-977d-586369aff053 nodeName:}" failed. No retries permitted until 2025-10-14 13:08:14.809273731 +0000 UTC m=+124.053432910 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/fbaa896c-d9b2-45a7-977d-586369aff053-client-ca") pod "controller-manager-6687f866cc-2f4dq" (UID: "fbaa896c-d9b2-45a7-977d-586369aff053") : configmap "client-ca" not found Oct 14 13:08:14.309554 master-2 kubenswrapper[4762]: E1014 13:08:14.309362 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbaa896c-d9b2-45a7-977d-586369aff053-serving-cert podName:fbaa896c-d9b2-45a7-977d-586369aff053 nodeName:}" failed. No retries permitted until 2025-10-14 13:08:14.809333413 +0000 UTC m=+124.053492602 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/fbaa896c-d9b2-45a7-977d-586369aff053-serving-cert") pod "controller-manager-6687f866cc-2f4dq" (UID: "fbaa896c-d9b2-45a7-977d-586369aff053") : secret "serving-cert" not found Oct 14 13:08:14.310745 master-2 kubenswrapper[4762]: I1014 13:08:14.310695 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbaa896c-d9b2-45a7-977d-586369aff053-config\") pod \"controller-manager-6687f866cc-2f4dq\" (UID: \"fbaa896c-d9b2-45a7-977d-586369aff053\") " pod="openshift-controller-manager/controller-manager-6687f866cc-2f4dq" Oct 14 13:08:14.311019 master-2 kubenswrapper[4762]: I1014 13:08:14.310952 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fbaa896c-d9b2-45a7-977d-586369aff053-proxy-ca-bundles\") pod \"controller-manager-6687f866cc-2f4dq\" (UID: \"fbaa896c-d9b2-45a7-977d-586369aff053\") " pod="openshift-controller-manager/controller-manager-6687f866cc-2f4dq" Oct 14 13:08:14.332730 master-2 kubenswrapper[4762]: I1014 13:08:14.332661 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwqlf\" (UniqueName: \"kubernetes.io/projected/fbaa896c-d9b2-45a7-977d-586369aff053-kube-api-access-vwqlf\") pod \"controller-manager-6687f866cc-2f4dq\" (UID: \"fbaa896c-d9b2-45a7-977d-586369aff053\") " pod="openshift-controller-manager/controller-manager-6687f866cc-2f4dq" Oct 14 13:08:14.814671 master-2 kubenswrapper[4762]: I1014 13:08:14.813776 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c2148c9-9f2d-420c-9a2e-6433f4885d5d-client-ca\") pod \"route-controller-manager-7f89f9db8c-dx7pm\" (UID: \"2c2148c9-9f2d-420c-9a2e-6433f4885d5d\") " pod="openshift-route-controller-manager/route-controller-manager-7f89f9db8c-dx7pm" Oct 14 13:08:14.814671 master-2 kubenswrapper[4762]: I1014 13:08:14.813892 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c2148c9-9f2d-420c-9a2e-6433f4885d5d-serving-cert\") pod \"route-controller-manager-7f89f9db8c-dx7pm\" (UID: \"2c2148c9-9f2d-420c-9a2e-6433f4885d5d\") " pod="openshift-route-controller-manager/route-controller-manager-7f89f9db8c-dx7pm" Oct 14 13:08:14.814671 master-2 kubenswrapper[4762]: I1014 13:08:14.813951 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fbaa896c-d9b2-45a7-977d-586369aff053-client-ca\") pod \"controller-manager-6687f866cc-2f4dq\" (UID: \"fbaa896c-d9b2-45a7-977d-586369aff053\") " pod="openshift-controller-manager/controller-manager-6687f866cc-2f4dq" Oct 14 13:08:14.814671 master-2 kubenswrapper[4762]: I1014 13:08:14.814141 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fbaa896c-d9b2-45a7-977d-586369aff053-serving-cert\") pod \"controller-manager-6687f866cc-2f4dq\" (UID: \"fbaa896c-d9b2-45a7-977d-586369aff053\") " pod="openshift-controller-manager/controller-manager-6687f866cc-2f4dq" Oct 14 13:08:14.814671 master-2 kubenswrapper[4762]: E1014 13:08:14.814476 4762 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Oct 14 13:08:14.814671 master-2 kubenswrapper[4762]: E1014 13:08:14.814485 4762 
secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Oct 14 13:08:14.814671 master-2 kubenswrapper[4762]: E1014 13:08:14.814589 4762 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Oct 14 13:08:14.814671 master-2 kubenswrapper[4762]: E1014 13:08:14.814603 4762 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Oct 14 13:08:14.815255 master-2 kubenswrapper[4762]: E1014 13:08:14.814611 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbaa896c-d9b2-45a7-977d-586369aff053-serving-cert podName:fbaa896c-d9b2-45a7-977d-586369aff053 nodeName:}" failed. No retries permitted until 2025-10-14 13:08:15.814571301 +0000 UTC m=+125.058730510 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/fbaa896c-d9b2-45a7-977d-586369aff053-serving-cert") pod "controller-manager-6687f866cc-2f4dq" (UID: "fbaa896c-d9b2-45a7-977d-586369aff053") : secret "serving-cert" not found Oct 14 13:08:14.815255 master-2 kubenswrapper[4762]: E1014 13:08:14.814794 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c2148c9-9f2d-420c-9a2e-6433f4885d5d-serving-cert podName:2c2148c9-9f2d-420c-9a2e-6433f4885d5d nodeName:}" failed. No retries permitted until 2025-10-14 13:08:16.814765117 +0000 UTC m=+126.058924466 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/2c2148c9-9f2d-420c-9a2e-6433f4885d5d-serving-cert") pod "route-controller-manager-7f89f9db8c-dx7pm" (UID: "2c2148c9-9f2d-420c-9a2e-6433f4885d5d") : secret "serving-cert" not found Oct 14 13:08:14.815255 master-2 kubenswrapper[4762]: E1014 13:08:14.814810 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fbaa896c-d9b2-45a7-977d-586369aff053-client-ca podName:fbaa896c-d9b2-45a7-977d-586369aff053 nodeName:}" failed. No retries permitted until 2025-10-14 13:08:15.814803398 +0000 UTC m=+125.058962797 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/fbaa896c-d9b2-45a7-977d-586369aff053-client-ca") pod "controller-manager-6687f866cc-2f4dq" (UID: "fbaa896c-d9b2-45a7-977d-586369aff053") : configmap "client-ca" not found Oct 14 13:08:14.815255 master-2 kubenswrapper[4762]: E1014 13:08:14.814828 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2c2148c9-9f2d-420c-9a2e-6433f4885d5d-client-ca podName:2c2148c9-9f2d-420c-9a2e-6433f4885d5d nodeName:}" failed. No retries permitted until 2025-10-14 13:08:16.814818469 +0000 UTC m=+126.058977868 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/2c2148c9-9f2d-420c-9a2e-6433f4885d5d-client-ca") pod "route-controller-manager-7f89f9db8c-dx7pm" (UID: "2c2148c9-9f2d-420c-9a2e-6433f4885d5d") : configmap "client-ca" not found Oct 14 13:08:15.553047 master-2 kubenswrapper[4762]: I1014 13:08:15.552995 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecce0f57-69a3-448a-8ed8-55822aa21231" path="/var/lib/kubelet/pods/ecce0f57-69a3-448a-8ed8-55822aa21231/volumes" Oct 14 13:08:15.823000 master-2 kubenswrapper[4762]: I1014 13:08:15.822848 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fbaa896c-d9b2-45a7-977d-586369aff053-serving-cert\") pod \"controller-manager-6687f866cc-2f4dq\" (UID: \"fbaa896c-d9b2-45a7-977d-586369aff053\") " pod="openshift-controller-manager/controller-manager-6687f866cc-2f4dq" Oct 14 13:08:15.823000 master-2 kubenswrapper[4762]: I1014 13:08:15.822969 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fbaa896c-d9b2-45a7-977d-586369aff053-client-ca\") pod \"controller-manager-6687f866cc-2f4dq\" (UID: \"fbaa896c-d9b2-45a7-977d-586369aff053\") " pod="openshift-controller-manager/controller-manager-6687f866cc-2f4dq" Oct 14 13:08:15.823332 master-2 kubenswrapper[4762]: E1014 13:08:15.823031 4762 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Oct 14 13:08:15.823332 master-2 kubenswrapper[4762]: E1014 13:08:15.823080 4762 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Oct 14 13:08:15.823332 master-2 kubenswrapper[4762]: E1014 13:08:15.823087 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbaa896c-d9b2-45a7-977d-586369aff053-serving-cert podName:fbaa896c-d9b2-45a7-977d-586369aff053 nodeName:}" failed. No retries permitted until 2025-10-14 13:08:17.823072482 +0000 UTC m=+127.067231641 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/fbaa896c-d9b2-45a7-977d-586369aff053-serving-cert") pod "controller-manager-6687f866cc-2f4dq" (UID: "fbaa896c-d9b2-45a7-977d-586369aff053") : secret "serving-cert" not found Oct 14 13:08:15.823332 master-2 kubenswrapper[4762]: E1014 13:08:15.823147 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fbaa896c-d9b2-45a7-977d-586369aff053-client-ca podName:fbaa896c-d9b2-45a7-977d-586369aff053 nodeName:}" failed. No retries permitted until 2025-10-14 13:08:17.823130304 +0000 UTC m=+127.067289503 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/fbaa896c-d9b2-45a7-977d-586369aff053-client-ca") pod "controller-manager-6687f866cc-2f4dq" (UID: "fbaa896c-d9b2-45a7-977d-586369aff053") : configmap "client-ca" not found Oct 14 13:08:16.832880 master-2 kubenswrapper[4762]: I1014 13:08:16.832819 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c2148c9-9f2d-420c-9a2e-6433f4885d5d-client-ca\") pod \"route-controller-manager-7f89f9db8c-dx7pm\" (UID: \"2c2148c9-9f2d-420c-9a2e-6433f4885d5d\") " pod="openshift-route-controller-manager/route-controller-manager-7f89f9db8c-dx7pm" Oct 14 13:08:16.832880 master-2 kubenswrapper[4762]: I1014 13:08:16.832890 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c2148c9-9f2d-420c-9a2e-6433f4885d5d-serving-cert\") pod \"route-controller-manager-7f89f9db8c-dx7pm\" (UID: \"2c2148c9-9f2d-420c-9a2e-6433f4885d5d\") " pod="openshift-route-controller-manager/route-controller-manager-7f89f9db8c-dx7pm" Oct 14 13:08:16.833427 master-2 kubenswrapper[4762]: E1014 13:08:16.832977 4762 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Oct 14 13:08:16.833427 master-2 kubenswrapper[4762]: E1014 13:08:16.833024 4762 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Oct 14 13:08:16.833427 master-2 kubenswrapper[4762]: E1014 13:08:16.833079 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c2148c9-9f2d-420c-9a2e-6433f4885d5d-serving-cert podName:2c2148c9-9f2d-420c-9a2e-6433f4885d5d nodeName:}" failed. No retries permitted until 2025-10-14 13:08:20.833063212 +0000 UTC m=+130.077222371 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/2c2148c9-9f2d-420c-9a2e-6433f4885d5d-serving-cert") pod "route-controller-manager-7f89f9db8c-dx7pm" (UID: "2c2148c9-9f2d-420c-9a2e-6433f4885d5d") : secret "serving-cert" not found Oct 14 13:08:16.833427 master-2 kubenswrapper[4762]: E1014 13:08:16.833095 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2c2148c9-9f2d-420c-9a2e-6433f4885d5d-client-ca podName:2c2148c9-9f2d-420c-9a2e-6433f4885d5d nodeName:}" failed. No retries permitted until 2025-10-14 13:08:20.833087253 +0000 UTC m=+130.077246412 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/2c2148c9-9f2d-420c-9a2e-6433f4885d5d-client-ca") pod "route-controller-manager-7f89f9db8c-dx7pm" (UID: "2c2148c9-9f2d-420c-9a2e-6433f4885d5d") : configmap "client-ca" not found Oct 14 13:08:17.844602 master-2 kubenswrapper[4762]: I1014 13:08:17.843838 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fbaa896c-d9b2-45a7-977d-586369aff053-serving-cert\") pod \"controller-manager-6687f866cc-2f4dq\" (UID: \"fbaa896c-d9b2-45a7-977d-586369aff053\") " pod="openshift-controller-manager/controller-manager-6687f866cc-2f4dq" Oct 14 13:08:17.845482 master-2 kubenswrapper[4762]: I1014 13:08:17.844651 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fbaa896c-d9b2-45a7-977d-586369aff053-client-ca\") pod \"controller-manager-6687f866cc-2f4dq\" (UID: \"fbaa896c-d9b2-45a7-977d-586369aff053\") " pod="openshift-controller-manager/controller-manager-6687f866cc-2f4dq" Oct 14 13:08:17.845482 master-2 kubenswrapper[4762]: E1014 13:08:17.844733 4762 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Oct 14 13:08:17.845482 master-2 kubenswrapper[4762]: E1014 13:08:17.844809 4762 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Oct 14 13:08:17.845482 master-2 kubenswrapper[4762]: E1014 13:08:17.844847 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbaa896c-d9b2-45a7-977d-586369aff053-serving-cert podName:fbaa896c-d9b2-45a7-977d-586369aff053 nodeName:}" failed. No retries permitted until 2025-10-14 13:08:21.844826277 +0000 UTC m=+131.088985436 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/fbaa896c-d9b2-45a7-977d-586369aff053-serving-cert") pod "controller-manager-6687f866cc-2f4dq" (UID: "fbaa896c-d9b2-45a7-977d-586369aff053") : secret "serving-cert" not found Oct 14 13:08:17.845482 master-2 kubenswrapper[4762]: E1014 13:08:17.844891 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fbaa896c-d9b2-45a7-977d-586369aff053-client-ca podName:fbaa896c-d9b2-45a7-977d-586369aff053 nodeName:}" failed. No retries permitted until 2025-10-14 13:08:21.844866138 +0000 UTC m=+131.089025317 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/fbaa896c-d9b2-45a7-977d-586369aff053-client-ca") pod "controller-manager-6687f866cc-2f4dq" (UID: "fbaa896c-d9b2-45a7-977d-586369aff053") : configmap "client-ca" not found Oct 14 13:08:18.088779 master-2 kubenswrapper[4762]: I1014 13:08:18.085037 4762 generic.go:334] "Generic (PLEG): container finished" podID="52e27da8-fd29-4cd8-ab27-7ae79aa112c3" containerID="e964c379732f90e3c86519097e84ce0e7165b50a4cbac704a886446518e2938f" exitCode=0 Oct 14 13:08:18.088779 master-2 kubenswrapper[4762]: I1014 13:08:18.085188 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/master-2-debug-xl9pk" event={"ID":"52e27da8-fd29-4cd8-ab27-7ae79aa112c3","Type":"ContainerDied","Data":"e964c379732f90e3c86519097e84ce0e7165b50a4cbac704a886446518e2938f"} Oct 14 13:08:18.132445 master-2 kubenswrapper[4762]: I1014 13:08:18.132350 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["assisted-installer/master-2-debug-xl9pk"] Oct 14 13:08:18.136895 master-2 kubenswrapper[4762]: I1014 13:08:18.136830 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["assisted-installer/master-2-debug-xl9pk"] Oct 14 13:08:18.548306 master-2 kubenswrapper[4762]: I1014 13:08:18.547900 4762 scope.go:117] "RemoveContainer" containerID="24baf780158d33f49e8bc55431e7447f00a45df3ba8830080f7795c97b34c03b" Oct 14 13:08:19.092178 master-2 kubenswrapper[4762]: I1014 13:08:19.092094 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-779749f859-bscv5_18346e46-a062-4e0d-b90a-c05646a46c7e/kube-rbac-proxy/3.log" Oct 14 13:08:19.096001 master-2 kubenswrapper[4762]: I1014 13:08:19.095957 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-64446499c7-ghfpb" event={"ID":"a201bca6-8e62-4695-9164-1065fa1108a2","Type":"ContainerStarted","Data":"6395648d34513fd3d81e6d6d67569d80457d39d6ae569e126ebdd7a7c8ae0c4e"} Oct 14 13:08:19.134022 master-2 kubenswrapper[4762]: I1014 13:08:19.133970 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="assisted-installer/master-2-debug-xl9pk" Oct 14 13:08:19.157385 master-2 kubenswrapper[4762]: I1014 13:08:19.157332 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/52e27da8-fd29-4cd8-ab27-7ae79aa112c3-host\") pod \"52e27da8-fd29-4cd8-ab27-7ae79aa112c3\" (UID: \"52e27da8-fd29-4cd8-ab27-7ae79aa112c3\") " Oct 14 13:08:19.157650 master-2 kubenswrapper[4762]: I1014 13:08:19.157410 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27zgn\" (UniqueName: \"kubernetes.io/projected/52e27da8-fd29-4cd8-ab27-7ae79aa112c3-kube-api-access-27zgn\") pod \"52e27da8-fd29-4cd8-ab27-7ae79aa112c3\" (UID: \"52e27da8-fd29-4cd8-ab27-7ae79aa112c3\") " Oct 14 13:08:19.157844 master-2 kubenswrapper[4762]: I1014 13:08:19.157795 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/52e27da8-fd29-4cd8-ab27-7ae79aa112c3-host" (OuterVolumeSpecName: "host") pod "52e27da8-fd29-4cd8-ab27-7ae79aa112c3" (UID: "52e27da8-fd29-4cd8-ab27-7ae79aa112c3"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:08:19.164602 master-2 kubenswrapper[4762]: I1014 13:08:19.164545 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52e27da8-fd29-4cd8-ab27-7ae79aa112c3-kube-api-access-27zgn" (OuterVolumeSpecName: "kube-api-access-27zgn") pod "52e27da8-fd29-4cd8-ab27-7ae79aa112c3" (UID: "52e27da8-fd29-4cd8-ab27-7ae79aa112c3"). InnerVolumeSpecName "kube-api-access-27zgn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:08:19.259009 master-2 kubenswrapper[4762]: I1014 13:08:19.258886 4762 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/52e27da8-fd29-4cd8-ab27-7ae79aa112c3-host\") on node \"master-2\" DevicePath \"\"" Oct 14 13:08:19.259009 master-2 kubenswrapper[4762]: I1014 13:08:19.258931 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27zgn\" (UniqueName: \"kubernetes.io/projected/52e27da8-fd29-4cd8-ab27-7ae79aa112c3-kube-api-access-27zgn\") on node \"master-2\" DevicePath \"\"" Oct 14 13:08:19.555933 master-2 kubenswrapper[4762]: I1014 13:08:19.555839 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52e27da8-fd29-4cd8-ab27-7ae79aa112c3" path="/var/lib/kubelet/pods/52e27da8-fd29-4cd8-ab27-7ae79aa112c3/volumes" Oct 14 13:08:20.102403 master-2 kubenswrapper[4762]: I1014 13:08:20.102348 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-779749f859-bscv5_18346e46-a062-4e0d-b90a-c05646a46c7e/kube-rbac-proxy/4.log" Oct 14 13:08:20.103968 master-2 kubenswrapper[4762]: I1014 13:08:20.103910 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-779749f859-bscv5_18346e46-a062-4e0d-b90a-c05646a46c7e/kube-rbac-proxy/3.log" Oct 14 13:08:20.105095 master-2 kubenswrapper[4762]: I1014 13:08:20.105035 4762 generic.go:334] "Generic (PLEG): container finished" podID="18346e46-a062-4e0d-b90a-c05646a46c7e" containerID="0a39b10e2fb121ce0536d44cd9b7911391264299af43b6661cf503ece1eed26a" exitCode=1 Oct 14 13:08:20.105256 master-2 kubenswrapper[4762]: I1014 13:08:20.105115 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-779749f859-bscv5" event={"ID":"18346e46-a062-4e0d-b90a-c05646a46c7e","Type":"ContainerDied","Data":"0a39b10e2fb121ce0536d44cd9b7911391264299af43b6661cf503ece1eed26a"} Oct 14 13:08:20.105256 master-2 kubenswrapper[4762]: I1014 13:08:20.105176 4762 scope.go:117] "RemoveContainer" containerID="24baf780158d33f49e8bc55431e7447f00a45df3ba8830080f7795c97b34c03b" Oct 14 13:08:20.105907 master-2 kubenswrapper[4762]: I1014 13:08:20.105865 4762 scope.go:117] "RemoveContainer" containerID="0a39b10e2fb121ce0536d44cd9b7911391264299af43b6661cf503ece1eed26a" Oct 14 13:08:20.108130 master-2 kubenswrapper[4762]: E1014 13:08:20.106269 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-rbac-proxy pod=cluster-cloud-controller-manager-operator-779749f859-bscv5_openshift-cloud-controller-manager-operator(18346e46-a062-4e0d-b90a-c05646a46c7e)\"" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-779749f859-bscv5" 
podUID="18346e46-a062-4e0d-b90a-c05646a46c7e" Oct 14 13:08:20.113304 master-2 kubenswrapper[4762]: I1014 13:08:20.111591 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="assisted-installer/master-2-debug-xl9pk" Oct 14 13:08:20.144809 master-2 kubenswrapper[4762]: I1014 13:08:20.144712 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-64446499c7-ghfpb" podStartSLOduration=1.991096744 podStartE2EDuration="7.144689073s" podCreationTimestamp="2025-10-14 13:08:13 +0000 UTC" firstStartedPulling="2025-10-14 13:08:13.680201298 +0000 UTC m=+122.924360457" lastFinishedPulling="2025-10-14 13:08:18.833793617 +0000 UTC m=+128.077952786" observedRunningTime="2025-10-14 13:08:20.144437845 +0000 UTC m=+129.388597034" watchObservedRunningTime="2025-10-14 13:08:20.144689073 +0000 UTC m=+129.388848262" Oct 14 13:08:20.151577 master-2 kubenswrapper[4762]: I1014 13:08:20.151482 4762 scope.go:117] "RemoveContainer" containerID="e964c379732f90e3c86519097e84ce0e7165b50a4cbac704a886446518e2938f" Oct 14 13:08:20.876481 master-2 kubenswrapper[4762]: I1014 13:08:20.876346 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c2148c9-9f2d-420c-9a2e-6433f4885d5d-client-ca\") pod \"route-controller-manager-7f89f9db8c-dx7pm\" (UID: \"2c2148c9-9f2d-420c-9a2e-6433f4885d5d\") " pod="openshift-route-controller-manager/route-controller-manager-7f89f9db8c-dx7pm" Oct 14 13:08:20.876481 master-2 kubenswrapper[4762]: I1014 13:08:20.876432 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c2148c9-9f2d-420c-9a2e-6433f4885d5d-serving-cert\") pod \"route-controller-manager-7f89f9db8c-dx7pm\" (UID: \"2c2148c9-9f2d-420c-9a2e-6433f4885d5d\") " pod="openshift-route-controller-manager/route-controller-manager-7f89f9db8c-dx7pm" Oct 14 13:08:20.876918 master-2 kubenswrapper[4762]: E1014 13:08:20.876508 4762 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Oct 14 13:08:20.876918 master-2 kubenswrapper[4762]: E1014 13:08:20.876613 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2c2148c9-9f2d-420c-9a2e-6433f4885d5d-client-ca podName:2c2148c9-9f2d-420c-9a2e-6433f4885d5d nodeName:}" failed. No retries permitted until 2025-10-14 13:08:28.876585332 +0000 UTC m=+138.120744521 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/2c2148c9-9f2d-420c-9a2e-6433f4885d5d-client-ca") pod "route-controller-manager-7f89f9db8c-dx7pm" (UID: "2c2148c9-9f2d-420c-9a2e-6433f4885d5d") : configmap "client-ca" not found Oct 14 13:08:20.876918 master-2 kubenswrapper[4762]: E1014 13:08:20.876663 4762 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Oct 14 13:08:20.876918 master-2 kubenswrapper[4762]: E1014 13:08:20.876741 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c2148c9-9f2d-420c-9a2e-6433f4885d5d-serving-cert podName:2c2148c9-9f2d-420c-9a2e-6433f4885d5d nodeName:}" failed. No retries permitted until 2025-10-14 13:08:28.876717246 +0000 UTC m=+138.120876445 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/2c2148c9-9f2d-420c-9a2e-6433f4885d5d-serving-cert") pod "route-controller-manager-7f89f9db8c-dx7pm" (UID: "2c2148c9-9f2d-420c-9a2e-6433f4885d5d") : secret "serving-cert" not found Oct 14 13:08:21.117721 master-2 kubenswrapper[4762]: I1014 13:08:21.117645 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-779749f859-bscv5_18346e46-a062-4e0d-b90a-c05646a46c7e/kube-rbac-proxy/4.log" Oct 14 13:08:21.885259 master-2 kubenswrapper[4762]: I1014 13:08:21.885204 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fbaa896c-d9b2-45a7-977d-586369aff053-client-ca\") pod \"controller-manager-6687f866cc-2f4dq\" (UID: \"fbaa896c-d9b2-45a7-977d-586369aff053\") " pod="openshift-controller-manager/controller-manager-6687f866cc-2f4dq" Oct 14 13:08:21.885849 master-2 kubenswrapper[4762]: I1014 13:08:21.885818 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fbaa896c-d9b2-45a7-977d-586369aff053-serving-cert\") pod \"controller-manager-6687f866cc-2f4dq\" (UID: \"fbaa896c-d9b2-45a7-977d-586369aff053\") " pod="openshift-controller-manager/controller-manager-6687f866cc-2f4dq" Oct 14 13:08:21.886056 master-2 kubenswrapper[4762]: E1014 13:08:21.885403 4762 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Oct 14 13:08:21.886267 master-2 kubenswrapper[4762]: E1014 13:08:21.885991 4762 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Oct 14 13:08:21.886445 master-2 kubenswrapper[4762]: E1014 13:08:21.886410 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fbaa896c-d9b2-45a7-977d-586369aff053-client-ca podName:fbaa896c-d9b2-45a7-977d-586369aff053 nodeName:}" failed. No retries permitted until 2025-10-14 13:08:29.8862295 +0000 UTC m=+139.130388699 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/fbaa896c-d9b2-45a7-977d-586369aff053-client-ca") pod "controller-manager-6687f866cc-2f4dq" (UID: "fbaa896c-d9b2-45a7-977d-586369aff053") : configmap "client-ca" not found Oct 14 13:08:21.886783 master-2 kubenswrapper[4762]: E1014 13:08:21.886760 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbaa896c-d9b2-45a7-977d-586369aff053-serving-cert podName:fbaa896c-d9b2-45a7-977d-586369aff053 nodeName:}" failed. No retries permitted until 2025-10-14 13:08:29.886734756 +0000 UTC m=+139.130893945 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/fbaa896c-d9b2-45a7-977d-586369aff053-serving-cert") pod "controller-manager-6687f866cc-2f4dq" (UID: "fbaa896c-d9b2-45a7-977d-586369aff053") : secret "serving-cert" not found Oct 14 13:08:22.477235 master-2 kubenswrapper[4762]: I1014 13:08:22.477136 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-5c6d48559d-44pcq"] Oct 14 13:08:22.478076 master-2 kubenswrapper[4762]: E1014 13:08:22.477351 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52e27da8-fd29-4cd8-ab27-7ae79aa112c3" containerName="container-00" Oct 14 13:08:22.478076 master-2 kubenswrapper[4762]: I1014 13:08:22.477377 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="52e27da8-fd29-4cd8-ab27-7ae79aa112c3" containerName="container-00" Oct 14 13:08:22.478076 master-2 kubenswrapper[4762]: I1014 13:08:22.477474 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="52e27da8-fd29-4cd8-ab27-7ae79aa112c3" containerName="container-00" Oct 14 13:08:22.478423 master-2 kubenswrapper[4762]: I1014 13:08:22.478151 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-5c6d48559d-44pcq" Oct 14 13:08:22.480635 master-2 kubenswrapper[4762]: I1014 13:08:22.480563 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 14 13:08:22.482300 master-2 kubenswrapper[4762]: I1014 13:08:22.482247 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 14 13:08:22.482558 master-2 kubenswrapper[4762]: I1014 13:08:22.482517 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 14 13:08:22.483061 master-2 kubenswrapper[4762]: I1014 13:08:22.483013 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 14 13:08:22.483378 master-2 kubenswrapper[4762]: I1014 13:08:22.483347 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-0" Oct 14 13:08:22.483589 master-2 kubenswrapper[4762]: I1014 13:08:22.483563 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-0" Oct 14 13:08:22.483821 master-2 kubenswrapper[4762]: I1014 13:08:22.483786 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 14 13:08:22.484444 master-2 kubenswrapper[4762]: I1014 13:08:22.484410 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 14 13:08:22.484712 master-2 kubenswrapper[4762]: I1014 13:08:22.484669 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 14 13:08:22.493402 master-2 kubenswrapper[4762]: I1014 13:08:22.493351 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Oct 14 13:08:22.532208 master-2 kubenswrapper[4762]: I1014 13:08:22.532097 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-5c6d48559d-44pcq"] Oct 14 13:08:22.564273 master-2 kubenswrapper[4762]: I1014 13:08:22.564176 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-d8c4d9469-hbqzs"] Oct 14 13:08:22.564954 master-2 kubenswrapper[4762]: I1014 
13:08:22.564908 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-d8c4d9469-hbqzs" Oct 14 13:08:22.569030 master-2 kubenswrapper[4762]: I1014 13:08:22.568961 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Oct 14 13:08:22.569390 master-2 kubenswrapper[4762]: I1014 13:08:22.569082 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Oct 14 13:08:22.575926 master-2 kubenswrapper[4762]: I1014 13:08:22.575858 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-d8c4d9469-hbqzs"] Oct 14 13:08:22.592414 master-2 kubenswrapper[4762]: I1014 13:08:22.592312 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/be28f050-844b-4865-b27b-d724a630773d-audit-dir\") pod \"apiserver-5c6d48559d-44pcq\" (UID: \"be28f050-844b-4865-b27b-d724a630773d\") " pod="openshift-apiserver/apiserver-5c6d48559d-44pcq" Oct 14 13:08:22.592414 master-2 kubenswrapper[4762]: I1014 13:08:22.592409 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/be28f050-844b-4865-b27b-d724a630773d-audit\") pod \"apiserver-5c6d48559d-44pcq\" (UID: \"be28f050-844b-4865-b27b-d724a630773d\") " pod="openshift-apiserver/apiserver-5c6d48559d-44pcq" Oct 14 13:08:22.592809 master-2 kubenswrapper[4762]: I1014 13:08:22.592605 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/be28f050-844b-4865-b27b-d724a630773d-node-pullsecrets\") pod \"apiserver-5c6d48559d-44pcq\" (UID: \"be28f050-844b-4865-b27b-d724a630773d\") " pod="openshift-apiserver/apiserver-5c6d48559d-44pcq" Oct 14 13:08:22.592809 master-2 kubenswrapper[4762]: I1014 13:08:22.592663 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m96gv\" (UniqueName: \"kubernetes.io/projected/2c2367f1-c2dc-4400-9168-f06d3e321081-kube-api-access-m96gv\") pod \"migrator-d8c4d9469-hbqzs\" (UID: \"2c2367f1-c2dc-4400-9168-f06d3e321081\") " pod="openshift-kube-storage-version-migrator/migrator-d8c4d9469-hbqzs" Oct 14 13:08:22.592809 master-2 kubenswrapper[4762]: I1014 13:08:22.592715 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/be28f050-844b-4865-b27b-d724a630773d-etcd-client\") pod \"apiserver-5c6d48559d-44pcq\" (UID: \"be28f050-844b-4865-b27b-d724a630773d\") " pod="openshift-apiserver/apiserver-5c6d48559d-44pcq" Oct 14 13:08:22.592809 master-2 kubenswrapper[4762]: I1014 13:08:22.592763 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/be28f050-844b-4865-b27b-d724a630773d-image-import-ca\") pod \"apiserver-5c6d48559d-44pcq\" (UID: \"be28f050-844b-4865-b27b-d724a630773d\") " pod="openshift-apiserver/apiserver-5c6d48559d-44pcq" Oct 14 13:08:22.592967 master-2 kubenswrapper[4762]: I1014 13:08:22.592816 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/be28f050-844b-4865-b27b-d724a630773d-encryption-config\") pod \"apiserver-5c6d48559d-44pcq\" (UID: \"be28f050-844b-4865-b27b-d724a630773d\") " pod="openshift-apiserver/apiserver-5c6d48559d-44pcq" Oct 14 13:08:22.592967 master-2 kubenswrapper[4762]: I1014 13:08:22.592871 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be28f050-844b-4865-b27b-d724a630773d-serving-cert\") pod \"apiserver-5c6d48559d-44pcq\" (UID: \"be28f050-844b-4865-b27b-d724a630773d\") " pod="openshift-apiserver/apiserver-5c6d48559d-44pcq" Oct 14 13:08:22.593040 master-2 kubenswrapper[4762]: I1014 13:08:22.592967 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fpks\" (UniqueName: \"kubernetes.io/projected/be28f050-844b-4865-b27b-d724a630773d-kube-api-access-7fpks\") pod \"apiserver-5c6d48559d-44pcq\" (UID: \"be28f050-844b-4865-b27b-d724a630773d\") " pod="openshift-apiserver/apiserver-5c6d48559d-44pcq" Oct 14 13:08:22.593123 master-2 kubenswrapper[4762]: I1014 13:08:22.593074 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be28f050-844b-4865-b27b-d724a630773d-config\") pod \"apiserver-5c6d48559d-44pcq\" (UID: \"be28f050-844b-4865-b27b-d724a630773d\") " pod="openshift-apiserver/apiserver-5c6d48559d-44pcq" Oct 14 13:08:22.593200 master-2 kubenswrapper[4762]: I1014 13:08:22.593132 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/be28f050-844b-4865-b27b-d724a630773d-etcd-serving-ca\") pod \"apiserver-5c6d48559d-44pcq\" (UID: \"be28f050-844b-4865-b27b-d724a630773d\") " pod="openshift-apiserver/apiserver-5c6d48559d-44pcq" Oct 14 13:08:22.593296 master-2 kubenswrapper[4762]: I1014 13:08:22.593255 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be28f050-844b-4865-b27b-d724a630773d-trusted-ca-bundle\") pod \"apiserver-5c6d48559d-44pcq\" (UID: \"be28f050-844b-4865-b27b-d724a630773d\") " pod="openshift-apiserver/apiserver-5c6d48559d-44pcq" Oct 14 13:08:22.694260 master-2 kubenswrapper[4762]: I1014 13:08:22.694129 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fpks\" (UniqueName: \"kubernetes.io/projected/be28f050-844b-4865-b27b-d724a630773d-kube-api-access-7fpks\") pod \"apiserver-5c6d48559d-44pcq\" (UID: \"be28f050-844b-4865-b27b-d724a630773d\") " pod="openshift-apiserver/apiserver-5c6d48559d-44pcq" Oct 14 13:08:22.694260 master-2 kubenswrapper[4762]: I1014 13:08:22.694270 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be28f050-844b-4865-b27b-d724a630773d-config\") pod \"apiserver-5c6d48559d-44pcq\" (UID: \"be28f050-844b-4865-b27b-d724a630773d\") " pod="openshift-apiserver/apiserver-5c6d48559d-44pcq" Oct 14 13:08:22.694864 master-2 kubenswrapper[4762]: I1014 13:08:22.694364 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/be28f050-844b-4865-b27b-d724a630773d-etcd-serving-ca\") pod \"apiserver-5c6d48559d-44pcq\" (UID: \"be28f050-844b-4865-b27b-d724a630773d\") " 
pod="openshift-apiserver/apiserver-5c6d48559d-44pcq" Oct 14 13:08:22.695887 master-2 kubenswrapper[4762]: I1014 13:08:22.695822 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/be28f050-844b-4865-b27b-d724a630773d-etcd-serving-ca\") pod \"apiserver-5c6d48559d-44pcq\" (UID: \"be28f050-844b-4865-b27b-d724a630773d\") " pod="openshift-apiserver/apiserver-5c6d48559d-44pcq" Oct 14 13:08:22.695887 master-2 kubenswrapper[4762]: I1014 13:08:22.695858 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be28f050-844b-4865-b27b-d724a630773d-config\") pod \"apiserver-5c6d48559d-44pcq\" (UID: \"be28f050-844b-4865-b27b-d724a630773d\") " pod="openshift-apiserver/apiserver-5c6d48559d-44pcq" Oct 14 13:08:22.696079 master-2 kubenswrapper[4762]: I1014 13:08:22.694978 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be28f050-844b-4865-b27b-d724a630773d-trusted-ca-bundle\") pod \"apiserver-5c6d48559d-44pcq\" (UID: \"be28f050-844b-4865-b27b-d724a630773d\") " pod="openshift-apiserver/apiserver-5c6d48559d-44pcq" Oct 14 13:08:22.696079 master-2 kubenswrapper[4762]: I1014 13:08:22.696019 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/be28f050-844b-4865-b27b-d724a630773d-audit-dir\") pod \"apiserver-5c6d48559d-44pcq\" (UID: \"be28f050-844b-4865-b27b-d724a630773d\") " pod="openshift-apiserver/apiserver-5c6d48559d-44pcq" Oct 14 13:08:22.696465 master-2 kubenswrapper[4762]: I1014 13:08:22.696116 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/be28f050-844b-4865-b27b-d724a630773d-audit\") pod \"apiserver-5c6d48559d-44pcq\" (UID: \"be28f050-844b-4865-b27b-d724a630773d\") " pod="openshift-apiserver/apiserver-5c6d48559d-44pcq" Oct 14 13:08:22.696465 master-2 kubenswrapper[4762]: I1014 13:08:22.696415 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/be28f050-844b-4865-b27b-d724a630773d-node-pullsecrets\") pod \"apiserver-5c6d48559d-44pcq\" (UID: \"be28f050-844b-4865-b27b-d724a630773d\") " pod="openshift-apiserver/apiserver-5c6d48559d-44pcq" Oct 14 13:08:22.696465 master-2 kubenswrapper[4762]: E1014 13:08:22.696434 4762 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found Oct 14 13:08:22.696465 master-2 kubenswrapper[4762]: I1014 13:08:22.696453 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m96gv\" (UniqueName: \"kubernetes.io/projected/2c2367f1-c2dc-4400-9168-f06d3e321081-kube-api-access-m96gv\") pod \"migrator-d8c4d9469-hbqzs\" (UID: \"2c2367f1-c2dc-4400-9168-f06d3e321081\") " pod="openshift-kube-storage-version-migrator/migrator-d8c4d9469-hbqzs" Oct 14 13:08:22.696779 master-2 kubenswrapper[4762]: I1014 13:08:22.696488 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/be28f050-844b-4865-b27b-d724a630773d-etcd-client\") pod \"apiserver-5c6d48559d-44pcq\" (UID: \"be28f050-844b-4865-b27b-d724a630773d\") " pod="openshift-apiserver/apiserver-5c6d48559d-44pcq" Oct 14 13:08:22.696779 master-2 kubenswrapper[4762]: E1014 13:08:22.696506 4762 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/be28f050-844b-4865-b27b-d724a630773d-audit podName:be28f050-844b-4865-b27b-d724a630773d nodeName:}" failed. No retries permitted until 2025-10-14 13:08:23.196483535 +0000 UTC m=+132.440642724 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/be28f050-844b-4865-b27b-d724a630773d-audit") pod "apiserver-5c6d48559d-44pcq" (UID: "be28f050-844b-4865-b27b-d724a630773d") : configmap "audit-0" not found Oct 14 13:08:22.696779 master-2 kubenswrapper[4762]: I1014 13:08:22.696521 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be28f050-844b-4865-b27b-d724a630773d-trusted-ca-bundle\") pod \"apiserver-5c6d48559d-44pcq\" (UID: \"be28f050-844b-4865-b27b-d724a630773d\") " pod="openshift-apiserver/apiserver-5c6d48559d-44pcq" Oct 14 13:08:22.696779 master-2 kubenswrapper[4762]: I1014 13:08:22.696537 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/be28f050-844b-4865-b27b-d724a630773d-image-import-ca\") pod \"apiserver-5c6d48559d-44pcq\" (UID: \"be28f050-844b-4865-b27b-d724a630773d\") " pod="openshift-apiserver/apiserver-5c6d48559d-44pcq" Oct 14 13:08:22.696779 master-2 kubenswrapper[4762]: I1014 13:08:22.696598 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/be28f050-844b-4865-b27b-d724a630773d-encryption-config\") pod \"apiserver-5c6d48559d-44pcq\" (UID: \"be28f050-844b-4865-b27b-d724a630773d\") " pod="openshift-apiserver/apiserver-5c6d48559d-44pcq" Oct 14 13:08:22.696779 master-2 kubenswrapper[4762]: I1014 13:08:22.696618 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be28f050-844b-4865-b27b-d724a630773d-serving-cert\") pod \"apiserver-5c6d48559d-44pcq\" (UID: \"be28f050-844b-4865-b27b-d724a630773d\") " pod="openshift-apiserver/apiserver-5c6d48559d-44pcq" Oct 14 13:08:22.697235 master-2 kubenswrapper[4762]: I1014 13:08:22.697140 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/be28f050-844b-4865-b27b-d724a630773d-image-import-ca\") pod \"apiserver-5c6d48559d-44pcq\" (UID: \"be28f050-844b-4865-b27b-d724a630773d\") " pod="openshift-apiserver/apiserver-5c6d48559d-44pcq" Oct 14 13:08:22.697372 master-2 kubenswrapper[4762]: I1014 13:08:22.697333 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/be28f050-844b-4865-b27b-d724a630773d-audit-dir\") pod \"apiserver-5c6d48559d-44pcq\" (UID: \"be28f050-844b-4865-b27b-d724a630773d\") " pod="openshift-apiserver/apiserver-5c6d48559d-44pcq" Oct 14 13:08:22.697537 master-2 kubenswrapper[4762]: I1014 13:08:22.697461 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/be28f050-844b-4865-b27b-d724a630773d-node-pullsecrets\") pod \"apiserver-5c6d48559d-44pcq\" (UID: \"be28f050-844b-4865-b27b-d724a630773d\") " pod="openshift-apiserver/apiserver-5c6d48559d-44pcq" Oct 14 13:08:22.701587 master-2 kubenswrapper[4762]: I1014 13:08:22.701535 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/be28f050-844b-4865-b27b-d724a630773d-serving-cert\") pod \"apiserver-5c6d48559d-44pcq\" (UID: \"be28f050-844b-4865-b27b-d724a630773d\") " pod="openshift-apiserver/apiserver-5c6d48559d-44pcq" Oct 14 13:08:22.702450 master-2 kubenswrapper[4762]: I1014 13:08:22.702395 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/be28f050-844b-4865-b27b-d724a630773d-etcd-client\") pod \"apiserver-5c6d48559d-44pcq\" (UID: \"be28f050-844b-4865-b27b-d724a630773d\") " pod="openshift-apiserver/apiserver-5c6d48559d-44pcq" Oct 14 13:08:22.702684 master-2 kubenswrapper[4762]: I1014 13:08:22.702637 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/be28f050-844b-4865-b27b-d724a630773d-encryption-config\") pod \"apiserver-5c6d48559d-44pcq\" (UID: \"be28f050-844b-4865-b27b-d724a630773d\") " pod="openshift-apiserver/apiserver-5c6d48559d-44pcq" Oct 14 13:08:22.714113 master-2 kubenswrapper[4762]: I1014 13:08:22.714047 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m96gv\" (UniqueName: \"kubernetes.io/projected/2c2367f1-c2dc-4400-9168-f06d3e321081-kube-api-access-m96gv\") pod \"migrator-d8c4d9469-hbqzs\" (UID: \"2c2367f1-c2dc-4400-9168-f06d3e321081\") " pod="openshift-kube-storage-version-migrator/migrator-d8c4d9469-hbqzs" Oct 14 13:08:22.716143 master-2 kubenswrapper[4762]: I1014 13:08:22.716095 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fpks\" (UniqueName: \"kubernetes.io/projected/be28f050-844b-4865-b27b-d724a630773d-kube-api-access-7fpks\") pod \"apiserver-5c6d48559d-44pcq\" (UID: \"be28f050-844b-4865-b27b-d724a630773d\") " pod="openshift-apiserver/apiserver-5c6d48559d-44pcq" Oct 14 13:08:22.885526 master-2 kubenswrapper[4762]: I1014 13:08:22.885273 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-d8c4d9469-hbqzs" Oct 14 13:08:23.119058 master-2 kubenswrapper[4762]: I1014 13:08:23.118594 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-d8c4d9469-hbqzs"] Oct 14 13:08:23.128957 master-2 kubenswrapper[4762]: W1014 13:08:23.128846 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c2367f1_c2dc_4400_9168_f06d3e321081.slice/crio-d469776815967df8182b8b27cf9fe41ff034625a4914d1d30dd51ce423ec9f98 WatchSource:0}: Error finding container d469776815967df8182b8b27cf9fe41ff034625a4914d1d30dd51ce423ec9f98: Status 404 returned error can't find the container with id d469776815967df8182b8b27cf9fe41ff034625a4914d1d30dd51ce423ec9f98 Oct 14 13:08:23.203238 master-2 kubenswrapper[4762]: I1014 13:08:23.203060 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/be28f050-844b-4865-b27b-d724a630773d-audit\") pod \"apiserver-5c6d48559d-44pcq\" (UID: \"be28f050-844b-4865-b27b-d724a630773d\") " pod="openshift-apiserver/apiserver-5c6d48559d-44pcq" Oct 14 13:08:23.203472 master-2 kubenswrapper[4762]: E1014 13:08:23.203303 4762 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found Oct 14 13:08:23.203472 master-2 kubenswrapper[4762]: E1014 13:08:23.203399 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/be28f050-844b-4865-b27b-d724a630773d-audit podName:be28f050-844b-4865-b27b-d724a630773d nodeName:}" failed. No retries permitted until 2025-10-14 13:08:24.203372645 +0000 UTC m=+133.447531844 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/be28f050-844b-4865-b27b-d724a630773d-audit") pod "apiserver-5c6d48559d-44pcq" (UID: "be28f050-844b-4865-b27b-d724a630773d") : configmap "audit-0" not found Oct 14 13:08:24.134211 master-2 kubenswrapper[4762]: I1014 13:08:24.133872 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-d8c4d9469-hbqzs" event={"ID":"2c2367f1-c2dc-4400-9168-f06d3e321081","Type":"ContainerStarted","Data":"d469776815967df8182b8b27cf9fe41ff034625a4914d1d30dd51ce423ec9f98"} Oct 14 13:08:24.214490 master-2 kubenswrapper[4762]: I1014 13:08:24.214427 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/be28f050-844b-4865-b27b-d724a630773d-audit\") pod \"apiserver-5c6d48559d-44pcq\" (UID: \"be28f050-844b-4865-b27b-d724a630773d\") " pod="openshift-apiserver/apiserver-5c6d48559d-44pcq" Oct 14 13:08:24.214707 master-2 kubenswrapper[4762]: E1014 13:08:24.214564 4762 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found Oct 14 13:08:24.214707 master-2 kubenswrapper[4762]: E1014 13:08:24.214627 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/be28f050-844b-4865-b27b-d724a630773d-audit podName:be28f050-844b-4865-b27b-d724a630773d nodeName:}" failed. No retries permitted until 2025-10-14 13:08:26.214609684 +0000 UTC m=+135.458768833 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/be28f050-844b-4865-b27b-d724a630773d-audit") pod "apiserver-5c6d48559d-44pcq" (UID: "be28f050-844b-4865-b27b-d724a630773d") : configmap "audit-0" not found Oct 14 13:08:25.140112 master-2 kubenswrapper[4762]: I1014 13:08:25.139946 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-d8c4d9469-hbqzs" event={"ID":"2c2367f1-c2dc-4400-9168-f06d3e321081","Type":"ContainerStarted","Data":"693acc9f5aaf4061784c21c02df2b94d6c3d24bec3d1ed82f2db31002a2bcde2"} Oct 14 13:08:25.140112 master-2 kubenswrapper[4762]: I1014 13:08:25.140009 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-d8c4d9469-hbqzs" event={"ID":"2c2367f1-c2dc-4400-9168-f06d3e321081","Type":"ContainerStarted","Data":"9168af5b10037c5c63d8ae1179493a352bec21546ae5d6ed28baec3be57722ca"} Oct 14 13:08:25.154868 master-2 kubenswrapper[4762]: I1014 13:08:25.154794 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-d8c4d9469-hbqzs" podStartSLOduration=1.552733306 podStartE2EDuration="3.154772569s" podCreationTimestamp="2025-10-14 13:08:22 +0000 UTC" firstStartedPulling="2025-10-14 13:08:23.132796496 +0000 UTC m=+132.376955695" lastFinishedPulling="2025-10-14 13:08:24.734835799 +0000 UTC m=+133.978994958" observedRunningTime="2025-10-14 13:08:25.153672994 +0000 UTC m=+134.397832183" watchObservedRunningTime="2025-10-14 13:08:25.154772569 +0000 UTC m=+134.398931758" Oct 14 13:08:26.061867 master-2 kubenswrapper[4762]: I1014 13:08:26.061768 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-apiserver/apiserver-5c6d48559d-44pcq"] Oct 14 13:08:26.062212 master-2 kubenswrapper[4762]: E1014 13:08:26.062147 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[audit], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-apiserver/apiserver-5c6d48559d-44pcq" podUID="be28f050-844b-4865-b27b-d724a630773d" Oct 14 13:08:26.143377 master-2 kubenswrapper[4762]: I1014 13:08:26.143318 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-5c6d48559d-44pcq" Oct 14 13:08:26.150855 master-2 kubenswrapper[4762]: I1014 13:08:26.150789 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-5c6d48559d-44pcq" Oct 14 13:08:26.235365 master-2 kubenswrapper[4762]: I1014 13:08:26.235233 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be28f050-844b-4865-b27b-d724a630773d-serving-cert\") pod \"be28f050-844b-4865-b27b-d724a630773d\" (UID: \"be28f050-844b-4865-b27b-d724a630773d\") " Oct 14 13:08:26.235365 master-2 kubenswrapper[4762]: I1014 13:08:26.235334 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/be28f050-844b-4865-b27b-d724a630773d-etcd-serving-ca\") pod \"be28f050-844b-4865-b27b-d724a630773d\" (UID: \"be28f050-844b-4865-b27b-d724a630773d\") " Oct 14 13:08:26.235691 master-2 kubenswrapper[4762]: I1014 13:08:26.235399 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fpks\" (UniqueName: \"kubernetes.io/projected/be28f050-844b-4865-b27b-d724a630773d-kube-api-access-7fpks\") pod \"be28f050-844b-4865-b27b-d724a630773d\" (UID: \"be28f050-844b-4865-b27b-d724a630773d\") " Oct 14 13:08:26.235691 master-2 kubenswrapper[4762]: I1014 13:08:26.235458 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/be28f050-844b-4865-b27b-d724a630773d-audit-dir\") pod \"be28f050-844b-4865-b27b-d724a630773d\" (UID: \"be28f050-844b-4865-b27b-d724a630773d\") " Oct 14 13:08:26.235691 master-2 kubenswrapper[4762]: I1014 13:08:26.235513 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/be28f050-844b-4865-b27b-d724a630773d-image-import-ca\") pod \"be28f050-844b-4865-b27b-d724a630773d\" (UID: \"be28f050-844b-4865-b27b-d724a630773d\") " Oct 14 13:08:26.235691 master-2 kubenswrapper[4762]: I1014 13:08:26.235591 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be28f050-844b-4865-b27b-d724a630773d-trusted-ca-bundle\") pod \"be28f050-844b-4865-b27b-d724a630773d\" (UID: \"be28f050-844b-4865-b27b-d724a630773d\") " Oct 14 13:08:26.235691 master-2 kubenswrapper[4762]: I1014 13:08:26.235637 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/be28f050-844b-4865-b27b-d724a630773d-node-pullsecrets\") pod \"be28f050-844b-4865-b27b-d724a630773d\" (UID: \"be28f050-844b-4865-b27b-d724a630773d\") " Oct 14 13:08:26.235691 master-2 kubenswrapper[4762]: I1014 13:08:26.235653 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/be28f050-844b-4865-b27b-d724a630773d-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "be28f050-844b-4865-b27b-d724a630773d" (UID: "be28f050-844b-4865-b27b-d724a630773d"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:08:26.236012 master-2 kubenswrapper[4762]: I1014 13:08:26.235686 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/be28f050-844b-4865-b27b-d724a630773d-encryption-config\") pod \"be28f050-844b-4865-b27b-d724a630773d\" (UID: \"be28f050-844b-4865-b27b-d724a630773d\") " Oct 14 13:08:26.236012 master-2 kubenswrapper[4762]: I1014 13:08:26.235795 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be28f050-844b-4865-b27b-d724a630773d-config\") pod \"be28f050-844b-4865-b27b-d724a630773d\" (UID: \"be28f050-844b-4865-b27b-d724a630773d\") " Oct 14 13:08:26.236012 master-2 kubenswrapper[4762]: I1014 13:08:26.235843 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/be28f050-844b-4865-b27b-d724a630773d-etcd-client\") pod \"be28f050-844b-4865-b27b-d724a630773d\" (UID: \"be28f050-844b-4865-b27b-d724a630773d\") " Oct 14 13:08:26.236012 master-2 kubenswrapper[4762]: I1014 13:08:26.235912 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/be28f050-844b-4865-b27b-d724a630773d-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "be28f050-844b-4865-b27b-d724a630773d" (UID: "be28f050-844b-4865-b27b-d724a630773d"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:08:26.237083 master-2 kubenswrapper[4762]: I1014 13:08:26.236092 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/be28f050-844b-4865-b27b-d724a630773d-audit\") pod \"apiserver-5c6d48559d-44pcq\" (UID: \"be28f050-844b-4865-b27b-d724a630773d\") " pod="openshift-apiserver/apiserver-5c6d48559d-44pcq" Oct 14 13:08:26.237083 master-2 kubenswrapper[4762]: I1014 13:08:26.236139 4762 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/be28f050-844b-4865-b27b-d724a630773d-audit-dir\") on node \"master-2\" DevicePath \"\"" Oct 14 13:08:26.237083 master-2 kubenswrapper[4762]: I1014 13:08:26.236189 4762 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/be28f050-844b-4865-b27b-d724a630773d-node-pullsecrets\") on node \"master-2\" DevicePath \"\"" Oct 14 13:08:26.237083 master-2 kubenswrapper[4762]: E1014 13:08:26.236389 4762 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found Oct 14 13:08:26.237083 master-2 kubenswrapper[4762]: E1014 13:08:26.236514 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/be28f050-844b-4865-b27b-d724a630773d-audit podName:be28f050-844b-4865-b27b-d724a630773d nodeName:}" failed. No retries permitted until 2025-10-14 13:08:30.236481144 +0000 UTC m=+139.480640333 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/be28f050-844b-4865-b27b-d724a630773d-audit") pod "apiserver-5c6d48559d-44pcq" (UID: "be28f050-844b-4865-b27b-d724a630773d") : configmap "audit-0" not found Oct 14 13:08:26.237083 master-2 kubenswrapper[4762]: I1014 13:08:26.236931 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be28f050-844b-4865-b27b-d724a630773d-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "be28f050-844b-4865-b27b-d724a630773d" (UID: "be28f050-844b-4865-b27b-d724a630773d"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:08:26.237083 master-2 kubenswrapper[4762]: I1014 13:08:26.236968 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be28f050-844b-4865-b27b-d724a630773d-config" (OuterVolumeSpecName: "config") pod "be28f050-844b-4865-b27b-d724a630773d" (UID: "be28f050-844b-4865-b27b-d724a630773d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:08:26.238242 master-2 kubenswrapper[4762]: I1014 13:08:26.238135 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be28f050-844b-4865-b27b-d724a630773d-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "be28f050-844b-4865-b27b-d724a630773d" (UID: "be28f050-844b-4865-b27b-d724a630773d"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:08:26.239315 master-2 kubenswrapper[4762]: I1014 13:08:26.238835 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be28f050-844b-4865-b27b-d724a630773d-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "be28f050-844b-4865-b27b-d724a630773d" (UID: "be28f050-844b-4865-b27b-d724a630773d"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:08:26.243347 master-2 kubenswrapper[4762]: I1014 13:08:26.243268 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be28f050-844b-4865-b27b-d724a630773d-kube-api-access-7fpks" (OuterVolumeSpecName: "kube-api-access-7fpks") pod "be28f050-844b-4865-b27b-d724a630773d" (UID: "be28f050-844b-4865-b27b-d724a630773d"). InnerVolumeSpecName "kube-api-access-7fpks". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:08:26.243571 master-2 kubenswrapper[4762]: I1014 13:08:26.243506 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be28f050-844b-4865-b27b-d724a630773d-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "be28f050-844b-4865-b27b-d724a630773d" (UID: "be28f050-844b-4865-b27b-d724a630773d"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:08:26.243658 master-2 kubenswrapper[4762]: I1014 13:08:26.243618 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be28f050-844b-4865-b27b-d724a630773d-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "be28f050-844b-4865-b27b-d724a630773d" (UID: "be28f050-844b-4865-b27b-d724a630773d"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:08:26.243827 master-2 kubenswrapper[4762]: I1014 13:08:26.243767 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be28f050-844b-4865-b27b-d724a630773d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "be28f050-844b-4865-b27b-d724a630773d" (UID: "be28f050-844b-4865-b27b-d724a630773d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:08:26.337223 master-2 kubenswrapper[4762]: I1014 13:08:26.336989 4762 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/be28f050-844b-4865-b27b-d724a630773d-image-import-ca\") on node \"master-2\" DevicePath \"\"" Oct 14 13:08:26.337223 master-2 kubenswrapper[4762]: I1014 13:08:26.337043 4762 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be28f050-844b-4865-b27b-d724a630773d-trusted-ca-bundle\") on node \"master-2\" DevicePath \"\"" Oct 14 13:08:26.337223 master-2 kubenswrapper[4762]: I1014 13:08:26.337068 4762 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/be28f050-844b-4865-b27b-d724a630773d-encryption-config\") on node \"master-2\" DevicePath \"\"" Oct 14 13:08:26.337223 master-2 kubenswrapper[4762]: I1014 13:08:26.337088 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be28f050-844b-4865-b27b-d724a630773d-config\") on node \"master-2\" DevicePath \"\"" Oct 14 13:08:26.337223 master-2 kubenswrapper[4762]: I1014 13:08:26.337106 4762 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/be28f050-844b-4865-b27b-d724a630773d-etcd-client\") on node \"master-2\" DevicePath \"\"" Oct 14 13:08:26.337223 master-2 kubenswrapper[4762]: I1014 13:08:26.337126 4762 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be28f050-844b-4865-b27b-d724a630773d-serving-cert\") on node \"master-2\" DevicePath \"\"" Oct 14 13:08:26.337223 master-2 kubenswrapper[4762]: I1014 13:08:26.337145 4762 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/be28f050-844b-4865-b27b-d724a630773d-etcd-serving-ca\") on node \"master-2\" DevicePath \"\"" Oct 14 13:08:26.337223 master-2 kubenswrapper[4762]: I1014 13:08:26.337194 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fpks\" (UniqueName: \"kubernetes.io/projected/be28f050-844b-4865-b27b-d724a630773d-kube-api-access-7fpks\") on node \"master-2\" DevicePath \"\"" Oct 14 13:08:27.146388 master-2 kubenswrapper[4762]: I1014 13:08:27.146322 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-5c6d48559d-44pcq" Oct 14 13:08:27.186620 master-2 kubenswrapper[4762]: I1014 13:08:27.186523 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-6576f6bc9d-r2fhv"] Oct 14 13:08:27.187611 master-2 kubenswrapper[4762]: I1014 13:08:27.187575 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-apiserver/apiserver-5c6d48559d-44pcq"] Oct 14 13:08:27.187797 master-2 kubenswrapper[4762]: I1014 13:08:27.187757 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" Oct 14 13:08:27.191595 master-2 kubenswrapper[4762]: I1014 13:08:27.191567 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 14 13:08:27.191917 master-2 kubenswrapper[4762]: I1014 13:08:27.191897 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 14 13:08:27.192071 master-2 kubenswrapper[4762]: I1014 13:08:27.192044 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 14 13:08:27.193053 master-2 kubenswrapper[4762]: I1014 13:08:27.192946 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-apiserver/apiserver-5c6d48559d-44pcq"] Oct 14 13:08:27.193333 master-2 kubenswrapper[4762]: I1014 13:08:27.193303 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 14 13:08:27.193782 master-2 kubenswrapper[4762]: I1014 13:08:27.193747 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 14 13:08:27.194080 master-2 kubenswrapper[4762]: I1014 13:08:27.194026 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 14 13:08:27.194233 master-2 kubenswrapper[4762]: I1014 13:08:27.194135 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 14 13:08:27.194657 master-2 kubenswrapper[4762]: I1014 13:08:27.194636 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 14 13:08:27.194799 master-2 kubenswrapper[4762]: I1014 13:08:27.194675 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 14 13:08:27.204394 master-2 kubenswrapper[4762]: I1014 13:08:27.204228 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-6576f6bc9d-r2fhv"] Oct 14 13:08:27.204896 master-2 kubenswrapper[4762]: I1014 13:08:27.204850 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Oct 14 13:08:27.246526 master-2 kubenswrapper[4762]: I1014 13:08:27.246423 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3964407d-3235-4331-bee0-0188f908f6c8-encryption-config\") pod \"apiserver-6576f6bc9d-r2fhv\" (UID: \"3964407d-3235-4331-bee0-0188f908f6c8\") " pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" Oct 14 13:08:27.246526 master-2 kubenswrapper[4762]: I1014 13:08:27.246508 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3964407d-3235-4331-bee0-0188f908f6c8-audit-dir\") pod \"apiserver-6576f6bc9d-r2fhv\" (UID: \"3964407d-3235-4331-bee0-0188f908f6c8\") " pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" Oct 14 13:08:27.246827 master-2 kubenswrapper[4762]: I1014 13:08:27.246548 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3964407d-3235-4331-bee0-0188f908f6c8-node-pullsecrets\") pod \"apiserver-6576f6bc9d-r2fhv\" (UID: \"3964407d-3235-4331-bee0-0188f908f6c8\") " pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" 
Oct 14 13:08:27.246827 master-2 kubenswrapper[4762]: I1014 13:08:27.246582 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2jcg\" (UniqueName: \"kubernetes.io/projected/3964407d-3235-4331-bee0-0188f908f6c8-kube-api-access-f2jcg\") pod \"apiserver-6576f6bc9d-r2fhv\" (UID: \"3964407d-3235-4331-bee0-0188f908f6c8\") " pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" Oct 14 13:08:27.246827 master-2 kubenswrapper[4762]: I1014 13:08:27.246658 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3964407d-3235-4331-bee0-0188f908f6c8-audit\") pod \"apiserver-6576f6bc9d-r2fhv\" (UID: \"3964407d-3235-4331-bee0-0188f908f6c8\") " pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" Oct 14 13:08:27.246827 master-2 kubenswrapper[4762]: I1014 13:08:27.246713 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3964407d-3235-4331-bee0-0188f908f6c8-config\") pod \"apiserver-6576f6bc9d-r2fhv\" (UID: \"3964407d-3235-4331-bee0-0188f908f6c8\") " pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" Oct 14 13:08:27.246827 master-2 kubenswrapper[4762]: I1014 13:08:27.246746 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3964407d-3235-4331-bee0-0188f908f6c8-etcd-serving-ca\") pod \"apiserver-6576f6bc9d-r2fhv\" (UID: \"3964407d-3235-4331-bee0-0188f908f6c8\") " pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" Oct 14 13:08:27.246827 master-2 kubenswrapper[4762]: I1014 13:08:27.246813 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3964407d-3235-4331-bee0-0188f908f6c8-etcd-client\") pod \"apiserver-6576f6bc9d-r2fhv\" (UID: \"3964407d-3235-4331-bee0-0188f908f6c8\") " pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" Oct 14 13:08:27.247047 master-2 kubenswrapper[4762]: I1014 13:08:27.246844 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3964407d-3235-4331-bee0-0188f908f6c8-serving-cert\") pod \"apiserver-6576f6bc9d-r2fhv\" (UID: \"3964407d-3235-4331-bee0-0188f908f6c8\") " pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" Oct 14 13:08:27.247047 master-2 kubenswrapper[4762]: I1014 13:08:27.246876 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3964407d-3235-4331-bee0-0188f908f6c8-trusted-ca-bundle\") pod \"apiserver-6576f6bc9d-r2fhv\" (UID: \"3964407d-3235-4331-bee0-0188f908f6c8\") " pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" Oct 14 13:08:27.247047 master-2 kubenswrapper[4762]: I1014 13:08:27.246911 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/3964407d-3235-4331-bee0-0188f908f6c8-image-import-ca\") pod \"apiserver-6576f6bc9d-r2fhv\" (UID: \"3964407d-3235-4331-bee0-0188f908f6c8\") " pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" Oct 14 13:08:27.247047 master-2 kubenswrapper[4762]: I1014 13:08:27.246966 4762 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: 
\"kubernetes.io/configmap/be28f050-844b-4865-b27b-d724a630773d-audit\") on node \"master-2\" DevicePath \"\"" Oct 14 13:08:27.347520 master-2 kubenswrapper[4762]: I1014 13:08:27.347439 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3964407d-3235-4331-bee0-0188f908f6c8-audit\") pod \"apiserver-6576f6bc9d-r2fhv\" (UID: \"3964407d-3235-4331-bee0-0188f908f6c8\") " pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" Oct 14 13:08:27.347520 master-2 kubenswrapper[4762]: I1014 13:08:27.347519 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3964407d-3235-4331-bee0-0188f908f6c8-config\") pod \"apiserver-6576f6bc9d-r2fhv\" (UID: \"3964407d-3235-4331-bee0-0188f908f6c8\") " pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" Oct 14 13:08:27.347853 master-2 kubenswrapper[4762]: I1014 13:08:27.347583 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3964407d-3235-4331-bee0-0188f908f6c8-etcd-serving-ca\") pod \"apiserver-6576f6bc9d-r2fhv\" (UID: \"3964407d-3235-4331-bee0-0188f908f6c8\") " pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" Oct 14 13:08:27.347853 master-2 kubenswrapper[4762]: I1014 13:08:27.347624 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3964407d-3235-4331-bee0-0188f908f6c8-etcd-client\") pod \"apiserver-6576f6bc9d-r2fhv\" (UID: \"3964407d-3235-4331-bee0-0188f908f6c8\") " pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" Oct 14 13:08:27.347853 master-2 kubenswrapper[4762]: I1014 13:08:27.347654 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3964407d-3235-4331-bee0-0188f908f6c8-serving-cert\") pod \"apiserver-6576f6bc9d-r2fhv\" (UID: \"3964407d-3235-4331-bee0-0188f908f6c8\") " pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" Oct 14 13:08:27.347853 master-2 kubenswrapper[4762]: I1014 13:08:27.347699 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3964407d-3235-4331-bee0-0188f908f6c8-trusted-ca-bundle\") pod \"apiserver-6576f6bc9d-r2fhv\" (UID: \"3964407d-3235-4331-bee0-0188f908f6c8\") " pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" Oct 14 13:08:27.348475 master-2 kubenswrapper[4762]: I1014 13:08:27.348419 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3964407d-3235-4331-bee0-0188f908f6c8-etcd-serving-ca\") pod \"apiserver-6576f6bc9d-r2fhv\" (UID: \"3964407d-3235-4331-bee0-0188f908f6c8\") " pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" Oct 14 13:08:27.348775 master-2 kubenswrapper[4762]: I1014 13:08:27.348713 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/3964407d-3235-4331-bee0-0188f908f6c8-image-import-ca\") pod \"apiserver-6576f6bc9d-r2fhv\" (UID: \"3964407d-3235-4331-bee0-0188f908f6c8\") " pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" Oct 14 13:08:27.349563 master-2 kubenswrapper[4762]: I1014 13:08:27.349087 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/3964407d-3235-4331-bee0-0188f908f6c8-config\") pod \"apiserver-6576f6bc9d-r2fhv\" (UID: \"3964407d-3235-4331-bee0-0188f908f6c8\") " pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" Oct 14 13:08:27.349563 master-2 kubenswrapper[4762]: I1014 13:08:27.349126 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/3964407d-3235-4331-bee0-0188f908f6c8-image-import-ca\") pod \"apiserver-6576f6bc9d-r2fhv\" (UID: \"3964407d-3235-4331-bee0-0188f908f6c8\") " pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" Oct 14 13:08:27.349563 master-2 kubenswrapper[4762]: I1014 13:08:27.349505 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3964407d-3235-4331-bee0-0188f908f6c8-audit\") pod \"apiserver-6576f6bc9d-r2fhv\" (UID: \"3964407d-3235-4331-bee0-0188f908f6c8\") " pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" Oct 14 13:08:27.349807 master-2 kubenswrapper[4762]: I1014 13:08:27.349054 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3964407d-3235-4331-bee0-0188f908f6c8-encryption-config\") pod \"apiserver-6576f6bc9d-r2fhv\" (UID: \"3964407d-3235-4331-bee0-0188f908f6c8\") " pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" Oct 14 13:08:27.349843 master-2 kubenswrapper[4762]: I1014 13:08:27.349808 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3964407d-3235-4331-bee0-0188f908f6c8-audit-dir\") pod \"apiserver-6576f6bc9d-r2fhv\" (UID: \"3964407d-3235-4331-bee0-0188f908f6c8\") " pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" Oct 14 13:08:27.349876 master-2 kubenswrapper[4762]: I1014 13:08:27.349849 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3964407d-3235-4331-bee0-0188f908f6c8-node-pullsecrets\") pod \"apiserver-6576f6bc9d-r2fhv\" (UID: \"3964407d-3235-4331-bee0-0188f908f6c8\") " pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" Oct 14 13:08:27.349911 master-2 kubenswrapper[4762]: I1014 13:08:27.349887 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2jcg\" (UniqueName: \"kubernetes.io/projected/3964407d-3235-4331-bee0-0188f908f6c8-kube-api-access-f2jcg\") pod \"apiserver-6576f6bc9d-r2fhv\" (UID: \"3964407d-3235-4331-bee0-0188f908f6c8\") " pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" Oct 14 13:08:27.350010 master-2 kubenswrapper[4762]: I1014 13:08:27.349993 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3964407d-3235-4331-bee0-0188f908f6c8-audit-dir\") pod \"apiserver-6576f6bc9d-r2fhv\" (UID: \"3964407d-3235-4331-bee0-0188f908f6c8\") " pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" Oct 14 13:08:27.350205 master-2 kubenswrapper[4762]: I1014 13:08:27.350137 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3964407d-3235-4331-bee0-0188f908f6c8-node-pullsecrets\") pod \"apiserver-6576f6bc9d-r2fhv\" (UID: \"3964407d-3235-4331-bee0-0188f908f6c8\") " pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" Oct 14 13:08:27.350319 master-2 kubenswrapper[4762]: I1014 13:08:27.350276 4762 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3964407d-3235-4331-bee0-0188f908f6c8-trusted-ca-bundle\") pod \"apiserver-6576f6bc9d-r2fhv\" (UID: \"3964407d-3235-4331-bee0-0188f908f6c8\") " pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" Oct 14 13:08:27.352034 master-2 kubenswrapper[4762]: I1014 13:08:27.351980 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3964407d-3235-4331-bee0-0188f908f6c8-serving-cert\") pod \"apiserver-6576f6bc9d-r2fhv\" (UID: \"3964407d-3235-4331-bee0-0188f908f6c8\") " pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" Oct 14 13:08:27.352643 master-2 kubenswrapper[4762]: I1014 13:08:27.352586 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3964407d-3235-4331-bee0-0188f908f6c8-etcd-client\") pod \"apiserver-6576f6bc9d-r2fhv\" (UID: \"3964407d-3235-4331-bee0-0188f908f6c8\") " pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" Oct 14 13:08:27.354341 master-2 kubenswrapper[4762]: I1014 13:08:27.354298 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3964407d-3235-4331-bee0-0188f908f6c8-encryption-config\") pod \"apiserver-6576f6bc9d-r2fhv\" (UID: \"3964407d-3235-4331-bee0-0188f908f6c8\") " pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" Oct 14 13:08:27.380545 master-2 kubenswrapper[4762]: I1014 13:08:27.380496 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2jcg\" (UniqueName: \"kubernetes.io/projected/3964407d-3235-4331-bee0-0188f908f6c8-kube-api-access-f2jcg\") pod \"apiserver-6576f6bc9d-r2fhv\" (UID: \"3964407d-3235-4331-bee0-0188f908f6c8\") " pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" Oct 14 13:08:27.508634 master-2 kubenswrapper[4762]: I1014 13:08:27.508540 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" Oct 14 13:08:27.556092 master-2 kubenswrapper[4762]: I1014 13:08:27.555999 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be28f050-844b-4865-b27b-d724a630773d" path="/var/lib/kubelet/pods/be28f050-844b-4865-b27b-d724a630773d/volumes" Oct 14 13:08:27.754094 master-2 kubenswrapper[4762]: I1014 13:08:27.754027 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-6576f6bc9d-r2fhv"] Oct 14 13:08:27.765016 master-2 kubenswrapper[4762]: W1014 13:08:27.764924 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3964407d_3235_4331_bee0_0188f908f6c8.slice/crio-716c3ce129ce2932aea7c7fb0982af4766d73a0a044009d44c9d5ac1ac0033e5 WatchSource:0}: Error finding container 716c3ce129ce2932aea7c7fb0982af4766d73a0a044009d44c9d5ac1ac0033e5: Status 404 returned error can't find the container with id 716c3ce129ce2932aea7c7fb0982af4766d73a0a044009d44c9d5ac1ac0033e5 Oct 14 13:08:28.151748 master-2 kubenswrapper[4762]: I1014 13:08:28.151597 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" event={"ID":"3964407d-3235-4331-bee0-0188f908f6c8","Type":"ContainerStarted","Data":"716c3ce129ce2932aea7c7fb0982af4766d73a0a044009d44c9d5ac1ac0033e5"} Oct 14 13:08:28.560084 master-2 kubenswrapper[4762]: I1014 13:08:28.559950 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spkpp\" (UniqueName: \"kubernetes.io/projected/f841b8dd-c459-4e20-b11a-9169905ad069-kube-api-access-spkpp\") pod \"network-check-target-cb5bh\" (UID: \"f841b8dd-c459-4e20-b11a-9169905ad069\") " pod="openshift-network-diagnostics/network-check-target-cb5bh" Oct 14 13:08:28.564063 master-2 kubenswrapper[4762]: I1014 13:08:28.564005 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 14 13:08:28.573845 master-2 kubenswrapper[4762]: I1014 13:08:28.573768 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 14 13:08:28.590242 master-2 kubenswrapper[4762]: I1014 13:08:28.590134 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spkpp\" (UniqueName: \"kubernetes.io/projected/f841b8dd-c459-4e20-b11a-9169905ad069-kube-api-access-spkpp\") pod \"network-check-target-cb5bh\" (UID: \"f841b8dd-c459-4e20-b11a-9169905ad069\") " pod="openshift-network-diagnostics/network-check-target-cb5bh" Oct 14 13:08:28.869311 master-2 kubenswrapper[4762]: I1014 13:08:28.869116 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cb5bh" Oct 14 13:08:28.965815 master-2 kubenswrapper[4762]: I1014 13:08:28.965418 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c2148c9-9f2d-420c-9a2e-6433f4885d5d-client-ca\") pod \"route-controller-manager-7f89f9db8c-dx7pm\" (UID: \"2c2148c9-9f2d-420c-9a2e-6433f4885d5d\") " pod="openshift-route-controller-manager/route-controller-manager-7f89f9db8c-dx7pm" Oct 14 13:08:28.965815 master-2 kubenswrapper[4762]: I1014 13:08:28.965500 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c2148c9-9f2d-420c-9a2e-6433f4885d5d-serving-cert\") pod \"route-controller-manager-7f89f9db8c-dx7pm\" (UID: \"2c2148c9-9f2d-420c-9a2e-6433f4885d5d\") " pod="openshift-route-controller-manager/route-controller-manager-7f89f9db8c-dx7pm" Oct 14 13:08:28.965815 master-2 kubenswrapper[4762]: E1014 13:08:28.965561 4762 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Oct 14 13:08:28.965815 master-2 kubenswrapper[4762]: E1014 13:08:28.965668 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2c2148c9-9f2d-420c-9a2e-6433f4885d5d-client-ca podName:2c2148c9-9f2d-420c-9a2e-6433f4885d5d nodeName:}" failed. No retries permitted until 2025-10-14 13:08:44.965635767 +0000 UTC m=+154.209794966 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/2c2148c9-9f2d-420c-9a2e-6433f4885d5d-client-ca") pod "route-controller-manager-7f89f9db8c-dx7pm" (UID: "2c2148c9-9f2d-420c-9a2e-6433f4885d5d") : configmap "client-ca" not found Oct 14 13:08:28.968039 master-2 kubenswrapper[4762]: E1014 13:08:28.965909 4762 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Oct 14 13:08:28.968039 master-2 kubenswrapper[4762]: E1014 13:08:28.966038 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c2148c9-9f2d-420c-9a2e-6433f4885d5d-serving-cert podName:2c2148c9-9f2d-420c-9a2e-6433f4885d5d nodeName:}" failed. No retries permitted until 2025-10-14 13:08:44.966006789 +0000 UTC m=+154.210166008 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/2c2148c9-9f2d-420c-9a2e-6433f4885d5d-serving-cert") pod "route-controller-manager-7f89f9db8c-dx7pm" (UID: "2c2148c9-9f2d-420c-9a2e-6433f4885d5d") : secret "serving-cert" not found Oct 14 13:08:29.107099 master-2 kubenswrapper[4762]: I1014 13:08:29.107014 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-cb5bh"] Oct 14 13:08:29.120127 master-2 kubenswrapper[4762]: W1014 13:08:29.119973 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf841b8dd_c459_4e20_b11a_9169905ad069.slice/crio-be30e7efd47df12345372d047aa0e256abaf0c8b6d2f48379b369384453eb3cf WatchSource:0}: Error finding container be30e7efd47df12345372d047aa0e256abaf0c8b6d2f48379b369384453eb3cf: Status 404 returned error can't find the container with id be30e7efd47df12345372d047aa0e256abaf0c8b6d2f48379b369384453eb3cf Oct 14 13:08:29.155959 master-2 kubenswrapper[4762]: I1014 13:08:29.155868 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-cb5bh" event={"ID":"f841b8dd-c459-4e20-b11a-9169905ad069","Type":"ContainerStarted","Data":"be30e7efd47df12345372d047aa0e256abaf0c8b6d2f48379b369384453eb3cf"} Oct 14 13:08:29.976704 master-2 kubenswrapper[4762]: I1014 13:08:29.975901 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fbaa896c-d9b2-45a7-977d-586369aff053-serving-cert\") pod \"controller-manager-6687f866cc-2f4dq\" (UID: \"fbaa896c-d9b2-45a7-977d-586369aff053\") " pod="openshift-controller-manager/controller-manager-6687f866cc-2f4dq" Oct 14 13:08:29.976704 master-2 kubenswrapper[4762]: I1014 13:08:29.976019 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fbaa896c-d9b2-45a7-977d-586369aff053-client-ca\") pod \"controller-manager-6687f866cc-2f4dq\" (UID: \"fbaa896c-d9b2-45a7-977d-586369aff053\") " pod="openshift-controller-manager/controller-manager-6687f866cc-2f4dq" Oct 14 13:08:29.976704 master-2 kubenswrapper[4762]: E1014 13:08:29.976180 4762 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Oct 14 13:08:29.976704 master-2 kubenswrapper[4762]: E1014 13:08:29.976278 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fbaa896c-d9b2-45a7-977d-586369aff053-client-ca podName:fbaa896c-d9b2-45a7-977d-586369aff053 nodeName:}" failed. No retries permitted until 2025-10-14 13:08:45.976251376 +0000 UTC m=+155.220410535 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/fbaa896c-d9b2-45a7-977d-586369aff053-client-ca") pod "controller-manager-6687f866cc-2f4dq" (UID: "fbaa896c-d9b2-45a7-977d-586369aff053") : configmap "client-ca" not found Oct 14 13:08:29.981951 master-2 kubenswrapper[4762]: I1014 13:08:29.981896 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fbaa896c-d9b2-45a7-977d-586369aff053-serving-cert\") pod \"controller-manager-6687f866cc-2f4dq\" (UID: \"fbaa896c-d9b2-45a7-977d-586369aff053\") " pod="openshift-controller-manager/controller-manager-6687f866cc-2f4dq" Oct 14 13:08:31.381368 master-2 kubenswrapper[4762]: I1014 13:08:31.381081 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6687f866cc-2f4dq"] Oct 14 13:08:31.381901 master-2 kubenswrapper[4762]: E1014 13:08:31.381503 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-controller-manager/controller-manager-6687f866cc-2f4dq" podUID="fbaa896c-d9b2-45a7-977d-586369aff053" Oct 14 13:08:32.173555 master-2 kubenswrapper[4762]: I1014 13:08:32.172362 4762 generic.go:334] "Generic (PLEG): container finished" podID="3964407d-3235-4331-bee0-0188f908f6c8" containerID="a3124751acf39ab26e6f4f85e9d803d6e71bc252c303566e819e99a1f9bf1afc" exitCode=0 Oct 14 13:08:32.173555 master-2 kubenswrapper[4762]: I1014 13:08:32.172428 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" event={"ID":"3964407d-3235-4331-bee0-0188f908f6c8","Type":"ContainerDied","Data":"a3124751acf39ab26e6f4f85e9d803d6e71bc252c303566e819e99a1f9bf1afc"} Oct 14 13:08:32.173555 master-2 kubenswrapper[4762]: I1014 13:08:32.172492 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6687f866cc-2f4dq" Oct 14 13:08:32.182283 master-2 kubenswrapper[4762]: I1014 13:08:32.182137 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6687f866cc-2f4dq" Oct 14 13:08:32.301455 master-2 kubenswrapper[4762]: I1014 13:08:32.301392 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fbaa896c-d9b2-45a7-977d-586369aff053-serving-cert\") pod \"fbaa896c-d9b2-45a7-977d-586369aff053\" (UID: \"fbaa896c-d9b2-45a7-977d-586369aff053\") " Oct 14 13:08:32.301686 master-2 kubenswrapper[4762]: I1014 13:08:32.301561 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwqlf\" (UniqueName: \"kubernetes.io/projected/fbaa896c-d9b2-45a7-977d-586369aff053-kube-api-access-vwqlf\") pod \"fbaa896c-d9b2-45a7-977d-586369aff053\" (UID: \"fbaa896c-d9b2-45a7-977d-586369aff053\") " Oct 14 13:08:32.301686 master-2 kubenswrapper[4762]: I1014 13:08:32.301622 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fbaa896c-d9b2-45a7-977d-586369aff053-proxy-ca-bundles\") pod \"fbaa896c-d9b2-45a7-977d-586369aff053\" (UID: \"fbaa896c-d9b2-45a7-977d-586369aff053\") " Oct 14 13:08:32.301748 master-2 kubenswrapper[4762]: I1014 13:08:32.301697 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbaa896c-d9b2-45a7-977d-586369aff053-config\") pod \"fbaa896c-d9b2-45a7-977d-586369aff053\" (UID: \"fbaa896c-d9b2-45a7-977d-586369aff053\") " Oct 14 13:08:32.303080 master-2 kubenswrapper[4762]: I1014 13:08:32.302604 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbaa896c-d9b2-45a7-977d-586369aff053-config" (OuterVolumeSpecName: "config") pod "fbaa896c-d9b2-45a7-977d-586369aff053" (UID: "fbaa896c-d9b2-45a7-977d-586369aff053"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:08:32.304051 master-2 kubenswrapper[4762]: I1014 13:08:32.304006 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbaa896c-d9b2-45a7-977d-586369aff053-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "fbaa896c-d9b2-45a7-977d-586369aff053" (UID: "fbaa896c-d9b2-45a7-977d-586369aff053"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:08:32.305990 master-2 kubenswrapper[4762]: I1014 13:08:32.305932 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbaa896c-d9b2-45a7-977d-586369aff053-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "fbaa896c-d9b2-45a7-977d-586369aff053" (UID: "fbaa896c-d9b2-45a7-977d-586369aff053"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:08:32.306224 master-2 kubenswrapper[4762]: I1014 13:08:32.306118 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbaa896c-d9b2-45a7-977d-586369aff053-kube-api-access-vwqlf" (OuterVolumeSpecName: "kube-api-access-vwqlf") pod "fbaa896c-d9b2-45a7-977d-586369aff053" (UID: "fbaa896c-d9b2-45a7-977d-586369aff053"). InnerVolumeSpecName "kube-api-access-vwqlf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:08:32.402961 master-2 kubenswrapper[4762]: I1014 13:08:32.402902 4762 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fbaa896c-d9b2-45a7-977d-586369aff053-serving-cert\") on node \"master-2\" DevicePath \"\"" Oct 14 13:08:32.402961 master-2 kubenswrapper[4762]: I1014 13:08:32.402938 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwqlf\" (UniqueName: \"kubernetes.io/projected/fbaa896c-d9b2-45a7-977d-586369aff053-kube-api-access-vwqlf\") on node \"master-2\" DevicePath \"\"" Oct 14 13:08:32.402961 master-2 kubenswrapper[4762]: I1014 13:08:32.402950 4762 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fbaa896c-d9b2-45a7-977d-586369aff053-proxy-ca-bundles\") on node \"master-2\" DevicePath \"\"" Oct 14 13:08:32.402961 master-2 kubenswrapper[4762]: I1014 13:08:32.402959 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbaa896c-d9b2-45a7-977d-586369aff053-config\") on node \"master-2\" DevicePath \"\"" Oct 14 13:08:32.548623 master-2 kubenswrapper[4762]: I1014 13:08:32.548447 4762 scope.go:117] "RemoveContainer" containerID="0a39b10e2fb121ce0536d44cd9b7911391264299af43b6661cf503ece1eed26a" Oct 14 13:08:32.548889 master-2 kubenswrapper[4762]: E1014 13:08:32.548630 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-rbac-proxy pod=cluster-cloud-controller-manager-operator-779749f859-bscv5_openshift-cloud-controller-manager-operator(18346e46-a062-4e0d-b90a-c05646a46c7e)\"" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-779749f859-bscv5" podUID="18346e46-a062-4e0d-b90a-c05646a46c7e" Oct 14 13:08:33.175956 master-2 kubenswrapper[4762]: I1014 13:08:33.175912 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6687f866cc-2f4dq" Oct 14 13:08:33.212419 master-2 kubenswrapper[4762]: I1014 13:08:33.212348 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6687f866cc-2f4dq"] Oct 14 13:08:33.215569 master-2 kubenswrapper[4762]: I1014 13:08:33.215010 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-55bcd8787f-4krnt"] Oct 14 13:08:33.215569 master-2 kubenswrapper[4762]: I1014 13:08:33.215445 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-55bcd8787f-4krnt" Oct 14 13:08:33.217354 master-2 kubenswrapper[4762]: I1014 13:08:33.216487 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6687f866cc-2f4dq"] Oct 14 13:08:33.219810 master-2 kubenswrapper[4762]: I1014 13:08:33.219780 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 14 13:08:33.219880 master-2 kubenswrapper[4762]: I1014 13:08:33.219821 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 14 13:08:33.219932 master-2 kubenswrapper[4762]: I1014 13:08:33.219873 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 14 13:08:33.220009 master-2 kubenswrapper[4762]: I1014 13:08:33.219987 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 14 13:08:33.220271 master-2 kubenswrapper[4762]: I1014 13:08:33.220206 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 14 13:08:33.225551 master-2 kubenswrapper[4762]: I1014 13:08:33.224993 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-55bcd8787f-4krnt"] Oct 14 13:08:33.228545 master-2 kubenswrapper[4762]: I1014 13:08:33.228380 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 14 13:08:33.311253 master-2 kubenswrapper[4762]: I1014 13:08:33.310451 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e3b0f97c-92e0-43c7-a72a-c003f0451347-proxy-ca-bundles\") pod \"controller-manager-55bcd8787f-4krnt\" (UID: \"e3b0f97c-92e0-43c7-a72a-c003f0451347\") " pod="openshift-controller-manager/controller-manager-55bcd8787f-4krnt" Oct 14 13:08:33.311253 master-2 kubenswrapper[4762]: I1014 13:08:33.310494 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3b0f97c-92e0-43c7-a72a-c003f0451347-config\") pod \"controller-manager-55bcd8787f-4krnt\" (UID: \"e3b0f97c-92e0-43c7-a72a-c003f0451347\") " pod="openshift-controller-manager/controller-manager-55bcd8787f-4krnt" Oct 14 13:08:33.311253 master-2 kubenswrapper[4762]: I1014 13:08:33.310562 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftcjc\" (UniqueName: \"kubernetes.io/projected/e3b0f97c-92e0-43c7-a72a-c003f0451347-kube-api-access-ftcjc\") pod \"controller-manager-55bcd8787f-4krnt\" (UID: \"e3b0f97c-92e0-43c7-a72a-c003f0451347\") " pod="openshift-controller-manager/controller-manager-55bcd8787f-4krnt" Oct 14 13:08:33.311253 master-2 kubenswrapper[4762]: I1014 13:08:33.310686 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e3b0f97c-92e0-43c7-a72a-c003f0451347-client-ca\") pod \"controller-manager-55bcd8787f-4krnt\" (UID: \"e3b0f97c-92e0-43c7-a72a-c003f0451347\") " pod="openshift-controller-manager/controller-manager-55bcd8787f-4krnt" Oct 14 13:08:33.311253 master-2 kubenswrapper[4762]: I1014 13:08:33.310738 4762 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3b0f97c-92e0-43c7-a72a-c003f0451347-serving-cert\") pod \"controller-manager-55bcd8787f-4krnt\" (UID: \"e3b0f97c-92e0-43c7-a72a-c003f0451347\") " pod="openshift-controller-manager/controller-manager-55bcd8787f-4krnt" Oct 14 13:08:33.311253 master-2 kubenswrapper[4762]: I1014 13:08:33.310798 4762 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fbaa896c-d9b2-45a7-977d-586369aff053-client-ca\") on node \"master-2\" DevicePath \"\"" Oct 14 13:08:33.414097 master-2 kubenswrapper[4762]: I1014 13:08:33.411830 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e3b0f97c-92e0-43c7-a72a-c003f0451347-client-ca\") pod \"controller-manager-55bcd8787f-4krnt\" (UID: \"e3b0f97c-92e0-43c7-a72a-c003f0451347\") " pod="openshift-controller-manager/controller-manager-55bcd8787f-4krnt" Oct 14 13:08:33.414097 master-2 kubenswrapper[4762]: I1014 13:08:33.411893 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3b0f97c-92e0-43c7-a72a-c003f0451347-serving-cert\") pod \"controller-manager-55bcd8787f-4krnt\" (UID: \"e3b0f97c-92e0-43c7-a72a-c003f0451347\") " pod="openshift-controller-manager/controller-manager-55bcd8787f-4krnt" Oct 14 13:08:33.414097 master-2 kubenswrapper[4762]: I1014 13:08:33.411959 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e3b0f97c-92e0-43c7-a72a-c003f0451347-proxy-ca-bundles\") pod \"controller-manager-55bcd8787f-4krnt\" (UID: \"e3b0f97c-92e0-43c7-a72a-c003f0451347\") " pod="openshift-controller-manager/controller-manager-55bcd8787f-4krnt" Oct 14 13:08:33.414097 master-2 kubenswrapper[4762]: I1014 13:08:33.412014 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3b0f97c-92e0-43c7-a72a-c003f0451347-config\") pod \"controller-manager-55bcd8787f-4krnt\" (UID: \"e3b0f97c-92e0-43c7-a72a-c003f0451347\") " pod="openshift-controller-manager/controller-manager-55bcd8787f-4krnt" Oct 14 13:08:33.414097 master-2 kubenswrapper[4762]: E1014 13:08:33.412068 4762 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Oct 14 13:08:33.414097 master-2 kubenswrapper[4762]: I1014 13:08:33.412090 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftcjc\" (UniqueName: \"kubernetes.io/projected/e3b0f97c-92e0-43c7-a72a-c003f0451347-kube-api-access-ftcjc\") pod \"controller-manager-55bcd8787f-4krnt\" (UID: \"e3b0f97c-92e0-43c7-a72a-c003f0451347\") " pod="openshift-controller-manager/controller-manager-55bcd8787f-4krnt" Oct 14 13:08:33.414097 master-2 kubenswrapper[4762]: E1014 13:08:33.412195 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e3b0f97c-92e0-43c7-a72a-c003f0451347-client-ca podName:e3b0f97c-92e0-43c7-a72a-c003f0451347 nodeName:}" failed. No retries permitted until 2025-10-14 13:08:33.912143456 +0000 UTC m=+143.156302645 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/e3b0f97c-92e0-43c7-a72a-c003f0451347-client-ca") pod "controller-manager-55bcd8787f-4krnt" (UID: "e3b0f97c-92e0-43c7-a72a-c003f0451347") : configmap "client-ca" not found Oct 14 13:08:33.422182 master-2 kubenswrapper[4762]: I1014 13:08:33.417366 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e3b0f97c-92e0-43c7-a72a-c003f0451347-proxy-ca-bundles\") pod \"controller-manager-55bcd8787f-4krnt\" (UID: \"e3b0f97c-92e0-43c7-a72a-c003f0451347\") " pod="openshift-controller-manager/controller-manager-55bcd8787f-4krnt" Oct 14 13:08:33.422182 master-2 kubenswrapper[4762]: I1014 13:08:33.417527 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3b0f97c-92e0-43c7-a72a-c003f0451347-config\") pod \"controller-manager-55bcd8787f-4krnt\" (UID: \"e3b0f97c-92e0-43c7-a72a-c003f0451347\") " pod="openshift-controller-manager/controller-manager-55bcd8787f-4krnt" Oct 14 13:08:33.422182 master-2 kubenswrapper[4762]: I1014 13:08:33.418379 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3b0f97c-92e0-43c7-a72a-c003f0451347-serving-cert\") pod \"controller-manager-55bcd8787f-4krnt\" (UID: \"e3b0f97c-92e0-43c7-a72a-c003f0451347\") " pod="openshift-controller-manager/controller-manager-55bcd8787f-4krnt" Oct 14 13:08:33.443458 master-2 kubenswrapper[4762]: I1014 13:08:33.443364 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftcjc\" (UniqueName: \"kubernetes.io/projected/e3b0f97c-92e0-43c7-a72a-c003f0451347-kube-api-access-ftcjc\") pod \"controller-manager-55bcd8787f-4krnt\" (UID: \"e3b0f97c-92e0-43c7-a72a-c003f0451347\") " pod="openshift-controller-manager/controller-manager-55bcd8787f-4krnt" Oct 14 13:08:33.552946 master-2 kubenswrapper[4762]: I1014 13:08:33.552859 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbaa896c-d9b2-45a7-977d-586369aff053" path="/var/lib/kubelet/pods/fbaa896c-d9b2-45a7-977d-586369aff053/volumes" Oct 14 13:08:33.924118 master-2 kubenswrapper[4762]: I1014 13:08:33.923685 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e3b0f97c-92e0-43c7-a72a-c003f0451347-client-ca\") pod \"controller-manager-55bcd8787f-4krnt\" (UID: \"e3b0f97c-92e0-43c7-a72a-c003f0451347\") " pod="openshift-controller-manager/controller-manager-55bcd8787f-4krnt" Oct 14 13:08:33.924336 master-2 kubenswrapper[4762]: E1014 13:08:33.923936 4762 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Oct 14 13:08:33.924370 master-2 kubenswrapper[4762]: E1014 13:08:33.924341 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e3b0f97c-92e0-43c7-a72a-c003f0451347-client-ca podName:e3b0f97c-92e0-43c7-a72a-c003f0451347 nodeName:}" failed. No retries permitted until 2025-10-14 13:08:34.924299554 +0000 UTC m=+144.168458763 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/e3b0f97c-92e0-43c7-a72a-c003f0451347-client-ca") pod "controller-manager-55bcd8787f-4krnt" (UID: "e3b0f97c-92e0-43c7-a72a-c003f0451347") : configmap "client-ca" not found Oct 14 13:08:34.201532 master-2 kubenswrapper[4762]: I1014 13:08:34.201277 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" event={"ID":"3964407d-3235-4331-bee0-0188f908f6c8","Type":"ContainerStarted","Data":"21f61bbd0a679861d2b7a35cb7734379d280969386c988ae04b5b4ff4b64d191"} Oct 14 13:08:34.203910 master-2 kubenswrapper[4762]: I1014 13:08:34.203862 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-cb5bh" event={"ID":"f841b8dd-c459-4e20-b11a-9169905ad069","Type":"ContainerStarted","Data":"b7cc9d11494c2c51609e966b2ff1eb476da294c6eb0ae09f2ad85fc230da3a9f"} Oct 14 13:08:34.204941 master-2 kubenswrapper[4762]: I1014 13:08:34.204514 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-cb5bh" Oct 14 13:08:34.227124 master-2 kubenswrapper[4762]: I1014 13:08:34.218203 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-cb5bh" podStartSLOduration=66.060383686 podStartE2EDuration="1m10.218139826s" podCreationTimestamp="2025-10-14 13:07:24 +0000 UTC" firstStartedPulling="2025-10-14 13:08:29.122577637 +0000 UTC m=+138.366736826" lastFinishedPulling="2025-10-14 13:08:33.280333797 +0000 UTC m=+142.524492966" observedRunningTime="2025-10-14 13:08:34.217923939 +0000 UTC m=+143.462083128" watchObservedRunningTime="2025-10-14 13:08:34.218139826 +0000 UTC m=+143.462299015" Oct 14 13:08:34.932136 master-2 kubenswrapper[4762]: I1014 13:08:34.932077 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e3b0f97c-92e0-43c7-a72a-c003f0451347-client-ca\") pod \"controller-manager-55bcd8787f-4krnt\" (UID: \"e3b0f97c-92e0-43c7-a72a-c003f0451347\") " pod="openshift-controller-manager/controller-manager-55bcd8787f-4krnt" Oct 14 13:08:34.932991 master-2 kubenswrapper[4762]: E1014 13:08:34.932275 4762 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Oct 14 13:08:34.932991 master-2 kubenswrapper[4762]: E1014 13:08:34.932401 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e3b0f97c-92e0-43c7-a72a-c003f0451347-client-ca podName:e3b0f97c-92e0-43c7-a72a-c003f0451347 nodeName:}" failed. No retries permitted until 2025-10-14 13:08:36.932374362 +0000 UTC m=+146.176533551 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/e3b0f97c-92e0-43c7-a72a-c003f0451347-client-ca") pod "controller-manager-55bcd8787f-4krnt" (UID: "e3b0f97c-92e0-43c7-a72a-c003f0451347") : configmap "client-ca" not found Oct 14 13:08:35.460516 master-2 kubenswrapper[4762]: E1014 13:08:35.460401 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[serving-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-cluster-version/cluster-version-operator-55bd67947c-872k9" podUID="d0bf2b14-2719-4b1b-a661-fbf4d27c05dc" Oct 14 13:08:36.216196 master-2 kubenswrapper[4762]: I1014 13:08:36.216074 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" event={"ID":"3964407d-3235-4331-bee0-0188f908f6c8","Type":"ContainerStarted","Data":"a08f9650be2e2e77d06d19aff6edcc8568dc365457f3253809f222a206d4e2e8"} Oct 14 13:08:36.953364 master-2 kubenswrapper[4762]: I1014 13:08:36.953263 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e3b0f97c-92e0-43c7-a72a-c003f0451347-client-ca\") pod \"controller-manager-55bcd8787f-4krnt\" (UID: \"e3b0f97c-92e0-43c7-a72a-c003f0451347\") " pod="openshift-controller-manager/controller-manager-55bcd8787f-4krnt" Oct 14 13:08:36.953605 master-2 kubenswrapper[4762]: E1014 13:08:36.953459 4762 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Oct 14 13:08:36.953605 master-2 kubenswrapper[4762]: E1014 13:08:36.953520 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e3b0f97c-92e0-43c7-a72a-c003f0451347-client-ca podName:e3b0f97c-92e0-43c7-a72a-c003f0451347 nodeName:}" failed. No retries permitted until 2025-10-14 13:08:40.953501417 +0000 UTC m=+150.197660576 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/e3b0f97c-92e0-43c7-a72a-c003f0451347-client-ca") pod "controller-manager-55bcd8787f-4krnt" (UID: "e3b0f97c-92e0-43c7-a72a-c003f0451347") : configmap "client-ca" not found Oct 14 13:08:37.509557 master-2 kubenswrapper[4762]: I1014 13:08:37.509451 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" Oct 14 13:08:37.510565 master-2 kubenswrapper[4762]: I1014 13:08:37.509585 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" Oct 14 13:08:37.519877 master-2 kubenswrapper[4762]: I1014 13:08:37.519826 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" Oct 14 13:08:37.539970 master-2 kubenswrapper[4762]: I1014 13:08:37.539851 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" podStartSLOduration=3.483047012 podStartE2EDuration="11.539833678s" podCreationTimestamp="2025-10-14 13:08:26 +0000 UTC" firstStartedPulling="2025-10-14 13:08:27.768117813 +0000 UTC m=+137.012277012" lastFinishedPulling="2025-10-14 13:08:35.824904479 +0000 UTC m=+145.069063678" observedRunningTime="2025-10-14 13:08:36.239281052 +0000 UTC m=+145.483440251" watchObservedRunningTime="2025-10-14 13:08:37.539833678 +0000 UTC m=+146.783992877" Oct 14 13:08:38.234215 master-2 kubenswrapper[4762]: I1014 13:08:38.234104 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" Oct 14 13:08:40.388697 master-2 kubenswrapper[4762]: I1014 13:08:40.388578 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0bf2b14-2719-4b1b-a661-fbf4d27c05dc-serving-cert\") pod \"cluster-version-operator-55bd67947c-872k9\" (UID: \"d0bf2b14-2719-4b1b-a661-fbf4d27c05dc\") " pod="openshift-cluster-version/cluster-version-operator-55bd67947c-872k9" Oct 14 13:08:40.394874 master-2 kubenswrapper[4762]: I1014 13:08:40.394829 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0bf2b14-2719-4b1b-a661-fbf4d27c05dc-serving-cert\") pod \"cluster-version-operator-55bd67947c-872k9\" (UID: \"d0bf2b14-2719-4b1b-a661-fbf4d27c05dc\") " pod="openshift-cluster-version/cluster-version-operator-55bd67947c-872k9" Oct 14 13:08:40.995791 master-2 kubenswrapper[4762]: I1014 13:08:40.995638 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e3b0f97c-92e0-43c7-a72a-c003f0451347-client-ca\") pod \"controller-manager-55bcd8787f-4krnt\" (UID: \"e3b0f97c-92e0-43c7-a72a-c003f0451347\") " pod="openshift-controller-manager/controller-manager-55bcd8787f-4krnt" Oct 14 13:08:40.995791 master-2 kubenswrapper[4762]: E1014 13:08:40.995820 4762 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Oct 14 13:08:40.996518 master-2 kubenswrapper[4762]: E1014 13:08:40.995993 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e3b0f97c-92e0-43c7-a72a-c003f0451347-client-ca podName:e3b0f97c-92e0-43c7-a72a-c003f0451347 nodeName:}" failed. 
No retries permitted until 2025-10-14 13:08:48.995914173 +0000 UTC m=+158.240073362 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/e3b0f97c-92e0-43c7-a72a-c003f0451347-client-ca") pod "controller-manager-55bcd8787f-4krnt" (UID: "e3b0f97c-92e0-43c7-a72a-c003f0451347") : configmap "client-ca" not found Oct 14 13:08:44.548518 master-2 kubenswrapper[4762]: I1014 13:08:44.548469 4762 scope.go:117] "RemoveContainer" containerID="0a39b10e2fb121ce0536d44cd9b7911391264299af43b6661cf503ece1eed26a" Oct 14 13:08:44.549621 master-2 kubenswrapper[4762]: E1014 13:08:44.549558 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-rbac-proxy pod=cluster-cloud-controller-manager-operator-779749f859-bscv5_openshift-cloud-controller-manager-operator(18346e46-a062-4e0d-b90a-c05646a46c7e)\"" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-779749f859-bscv5" podUID="18346e46-a062-4e0d-b90a-c05646a46c7e" Oct 14 13:08:45.037665 master-2 kubenswrapper[4762]: I1014 13:08:45.037069 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c2148c9-9f2d-420c-9a2e-6433f4885d5d-client-ca\") pod \"route-controller-manager-7f89f9db8c-dx7pm\" (UID: \"2c2148c9-9f2d-420c-9a2e-6433f4885d5d\") " pod="openshift-route-controller-manager/route-controller-manager-7f89f9db8c-dx7pm" Oct 14 13:08:45.037665 master-2 kubenswrapper[4762]: I1014 13:08:45.037143 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c2148c9-9f2d-420c-9a2e-6433f4885d5d-serving-cert\") pod \"route-controller-manager-7f89f9db8c-dx7pm\" (UID: \"2c2148c9-9f2d-420c-9a2e-6433f4885d5d\") " pod="openshift-route-controller-manager/route-controller-manager-7f89f9db8c-dx7pm" Oct 14 13:08:45.038215 master-2 kubenswrapper[4762]: E1014 13:08:45.038097 4762 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Oct 14 13:08:45.038308 master-2 kubenswrapper[4762]: E1014 13:08:45.038287 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2c2148c9-9f2d-420c-9a2e-6433f4885d5d-client-ca podName:2c2148c9-9f2d-420c-9a2e-6433f4885d5d nodeName:}" failed. No retries permitted until 2025-10-14 13:09:17.038251546 +0000 UTC m=+186.282410735 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/2c2148c9-9f2d-420c-9a2e-6433f4885d5d-client-ca") pod "route-controller-manager-7f89f9db8c-dx7pm" (UID: "2c2148c9-9f2d-420c-9a2e-6433f4885d5d") : configmap "client-ca" not found Oct 14 13:08:45.042225 master-2 kubenswrapper[4762]: I1014 13:08:45.042129 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c2148c9-9f2d-420c-9a2e-6433f4885d5d-serving-cert\") pod \"route-controller-manager-7f89f9db8c-dx7pm\" (UID: \"2c2148c9-9f2d-420c-9a2e-6433f4885d5d\") " pod="openshift-route-controller-manager/route-controller-manager-7f89f9db8c-dx7pm" Oct 14 13:08:46.547763 master-2 kubenswrapper[4762]: I1014 13:08:46.547669 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-55bd67947c-872k9" Oct 14 13:08:46.548418 master-2 kubenswrapper[4762]: I1014 13:08:46.548287 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-55bd67947c-872k9" Oct 14 13:08:46.570722 master-2 kubenswrapper[4762]: W1014 13:08:46.570624 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0bf2b14_2719_4b1b_a661_fbf4d27c05dc.slice/crio-16acbeb82f9015c059054334a95bd3834ca52dd570c1b685a76f29158f7c167a WatchSource:0}: Error finding container 16acbeb82f9015c059054334a95bd3834ca52dd570c1b685a76f29158f7c167a: Status 404 returned error can't find the container with id 16acbeb82f9015c059054334a95bd3834ca52dd570c1b685a76f29158f7c167a Oct 14 13:08:46.683130 master-2 kubenswrapper[4762]: I1014 13:08:46.683021 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-8487p"] Oct 14 13:08:46.683653 master-2 kubenswrapper[4762]: I1014 13:08:46.683597 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-8487p" Oct 14 13:08:46.686367 master-2 kubenswrapper[4762]: I1014 13:08:46.686316 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt" Oct 14 13:08:46.686510 master-2 kubenswrapper[4762]: I1014 13:08:46.686487 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt" Oct 14 13:08:46.754800 master-2 kubenswrapper[4762]: I1014 13:08:46.754703 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/93952b70-8f99-4679-9d54-1a9143ed164b-var-lib-kubelet\") pod \"tuned-8487p\" (UID: \"93952b70-8f99-4679-9d54-1a9143ed164b\") " pod="openshift-cluster-node-tuning-operator/tuned-8487p" Oct 14 13:08:46.754800 master-2 kubenswrapper[4762]: I1014 13:08:46.754761 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/93952b70-8f99-4679-9d54-1a9143ed164b-run\") pod \"tuned-8487p\" (UID: \"93952b70-8f99-4679-9d54-1a9143ed164b\") " pod="openshift-cluster-node-tuning-operator/tuned-8487p" Oct 14 13:08:46.755136 master-2 kubenswrapper[4762]: I1014 13:08:46.754871 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/93952b70-8f99-4679-9d54-1a9143ed164b-lib-modules\") pod \"tuned-8487p\" (UID: \"93952b70-8f99-4679-9d54-1a9143ed164b\") " pod="openshift-cluster-node-tuning-operator/tuned-8487p" Oct 14 13:08:46.755136 master-2 kubenswrapper[4762]: I1014 13:08:46.754928 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgr6n\" (UniqueName: \"kubernetes.io/projected/93952b70-8f99-4679-9d54-1a9143ed164b-kube-api-access-mgr6n\") pod \"tuned-8487p\" (UID: \"93952b70-8f99-4679-9d54-1a9143ed164b\") " pod="openshift-cluster-node-tuning-operator/tuned-8487p" Oct 14 13:08:46.755136 master-2 kubenswrapper[4762]: I1014 13:08:46.754970 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: 
\"kubernetes.io/host-path/93952b70-8f99-4679-9d54-1a9143ed164b-etc-sysctl-d\") pod \"tuned-8487p\" (UID: \"93952b70-8f99-4679-9d54-1a9143ed164b\") " pod="openshift-cluster-node-tuning-operator/tuned-8487p" Oct 14 13:08:46.755136 master-2 kubenswrapper[4762]: I1014 13:08:46.755007 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/93952b70-8f99-4679-9d54-1a9143ed164b-tmp\") pod \"tuned-8487p\" (UID: \"93952b70-8f99-4679-9d54-1a9143ed164b\") " pod="openshift-cluster-node-tuning-operator/tuned-8487p" Oct 14 13:08:46.755136 master-2 kubenswrapper[4762]: I1014 13:08:46.755037 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/93952b70-8f99-4679-9d54-1a9143ed164b-host\") pod \"tuned-8487p\" (UID: \"93952b70-8f99-4679-9d54-1a9143ed164b\") " pod="openshift-cluster-node-tuning-operator/tuned-8487p" Oct 14 13:08:46.755136 master-2 kubenswrapper[4762]: I1014 13:08:46.755068 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/93952b70-8f99-4679-9d54-1a9143ed164b-etc-sysctl-conf\") pod \"tuned-8487p\" (UID: \"93952b70-8f99-4679-9d54-1a9143ed164b\") " pod="openshift-cluster-node-tuning-operator/tuned-8487p" Oct 14 13:08:46.755136 master-2 kubenswrapper[4762]: I1014 13:08:46.755108 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/93952b70-8f99-4679-9d54-1a9143ed164b-sys\") pod \"tuned-8487p\" (UID: \"93952b70-8f99-4679-9d54-1a9143ed164b\") " pod="openshift-cluster-node-tuning-operator/tuned-8487p" Oct 14 13:08:46.755136 master-2 kubenswrapper[4762]: I1014 13:08:46.755146 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/93952b70-8f99-4679-9d54-1a9143ed164b-etc-sysconfig\") pod \"tuned-8487p\" (UID: \"93952b70-8f99-4679-9d54-1a9143ed164b\") " pod="openshift-cluster-node-tuning-operator/tuned-8487p" Oct 14 13:08:46.755677 master-2 kubenswrapper[4762]: I1014 13:08:46.755192 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/93952b70-8f99-4679-9d54-1a9143ed164b-etc-kubernetes\") pod \"tuned-8487p\" (UID: \"93952b70-8f99-4679-9d54-1a9143ed164b\") " pod="openshift-cluster-node-tuning-operator/tuned-8487p" Oct 14 13:08:46.797472 master-2 kubenswrapper[4762]: I1014 13:08:46.756790 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/93952b70-8f99-4679-9d54-1a9143ed164b-etc-tuned\") pod \"tuned-8487p\" (UID: \"93952b70-8f99-4679-9d54-1a9143ed164b\") " pod="openshift-cluster-node-tuning-operator/tuned-8487p" Oct 14 13:08:46.797472 master-2 kubenswrapper[4762]: I1014 13:08:46.756932 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/93952b70-8f99-4679-9d54-1a9143ed164b-etc-modprobe-d\") pod \"tuned-8487p\" (UID: \"93952b70-8f99-4679-9d54-1a9143ed164b\") " pod="openshift-cluster-node-tuning-operator/tuned-8487p" Oct 14 13:08:46.797472 master-2 kubenswrapper[4762]: I1014 13:08:46.757009 4762 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/93952b70-8f99-4679-9d54-1a9143ed164b-etc-systemd\") pod \"tuned-8487p\" (UID: \"93952b70-8f99-4679-9d54-1a9143ed164b\") " pod="openshift-cluster-node-tuning-operator/tuned-8487p" Oct 14 13:08:46.857842 master-2 kubenswrapper[4762]: I1014 13:08:46.857652 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/93952b70-8f99-4679-9d54-1a9143ed164b-tmp\") pod \"tuned-8487p\" (UID: \"93952b70-8f99-4679-9d54-1a9143ed164b\") " pod="openshift-cluster-node-tuning-operator/tuned-8487p" Oct 14 13:08:46.857842 master-2 kubenswrapper[4762]: I1014 13:08:46.857771 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/93952b70-8f99-4679-9d54-1a9143ed164b-sys\") pod \"tuned-8487p\" (UID: \"93952b70-8f99-4679-9d54-1a9143ed164b\") " pod="openshift-cluster-node-tuning-operator/tuned-8487p" Oct 14 13:08:46.857842 master-2 kubenswrapper[4762]: I1014 13:08:46.857823 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/93952b70-8f99-4679-9d54-1a9143ed164b-host\") pod \"tuned-8487p\" (UID: \"93952b70-8f99-4679-9d54-1a9143ed164b\") " pod="openshift-cluster-node-tuning-operator/tuned-8487p" Oct 14 13:08:46.858134 master-2 kubenswrapper[4762]: I1014 13:08:46.857868 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/93952b70-8f99-4679-9d54-1a9143ed164b-etc-sysctl-conf\") pod \"tuned-8487p\" (UID: \"93952b70-8f99-4679-9d54-1a9143ed164b\") " pod="openshift-cluster-node-tuning-operator/tuned-8487p" Oct 14 13:08:46.858134 master-2 kubenswrapper[4762]: I1014 13:08:46.857918 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/93952b70-8f99-4679-9d54-1a9143ed164b-etc-sysconfig\") pod \"tuned-8487p\" (UID: \"93952b70-8f99-4679-9d54-1a9143ed164b\") " pod="openshift-cluster-node-tuning-operator/tuned-8487p" Oct 14 13:08:46.858134 master-2 kubenswrapper[4762]: I1014 13:08:46.857967 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/93952b70-8f99-4679-9d54-1a9143ed164b-etc-kubernetes\") pod \"tuned-8487p\" (UID: \"93952b70-8f99-4679-9d54-1a9143ed164b\") " pod="openshift-cluster-node-tuning-operator/tuned-8487p" Oct 14 13:08:46.858134 master-2 kubenswrapper[4762]: I1014 13:08:46.858030 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/93952b70-8f99-4679-9d54-1a9143ed164b-sys\") pod \"tuned-8487p\" (UID: \"93952b70-8f99-4679-9d54-1a9143ed164b\") " pod="openshift-cluster-node-tuning-operator/tuned-8487p" Oct 14 13:08:46.858134 master-2 kubenswrapper[4762]: I1014 13:08:46.858056 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/93952b70-8f99-4679-9d54-1a9143ed164b-etc-tuned\") pod \"tuned-8487p\" (UID: \"93952b70-8f99-4679-9d54-1a9143ed164b\") " pod="openshift-cluster-node-tuning-operator/tuned-8487p" Oct 14 13:08:46.858502 master-2 kubenswrapper[4762]: I1014 13:08:46.858188 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/93952b70-8f99-4679-9d54-1a9143ed164b-etc-modprobe-d\") pod \"tuned-8487p\" (UID: \"93952b70-8f99-4679-9d54-1a9143ed164b\") " pod="openshift-cluster-node-tuning-operator/tuned-8487p" Oct 14 13:08:46.858502 master-2 kubenswrapper[4762]: I1014 13:08:46.858252 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/93952b70-8f99-4679-9d54-1a9143ed164b-etc-systemd\") pod \"tuned-8487p\" (UID: \"93952b70-8f99-4679-9d54-1a9143ed164b\") " pod="openshift-cluster-node-tuning-operator/tuned-8487p" Oct 14 13:08:46.858502 master-2 kubenswrapper[4762]: I1014 13:08:46.858318 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/93952b70-8f99-4679-9d54-1a9143ed164b-var-lib-kubelet\") pod \"tuned-8487p\" (UID: \"93952b70-8f99-4679-9d54-1a9143ed164b\") " pod="openshift-cluster-node-tuning-operator/tuned-8487p" Oct 14 13:08:46.858502 master-2 kubenswrapper[4762]: I1014 13:08:46.858356 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/93952b70-8f99-4679-9d54-1a9143ed164b-run\") pod \"tuned-8487p\" (UID: \"93952b70-8f99-4679-9d54-1a9143ed164b\") " pod="openshift-cluster-node-tuning-operator/tuned-8487p" Oct 14 13:08:46.858502 master-2 kubenswrapper[4762]: I1014 13:08:46.858398 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/93952b70-8f99-4679-9d54-1a9143ed164b-lib-modules\") pod \"tuned-8487p\" (UID: \"93952b70-8f99-4679-9d54-1a9143ed164b\") " pod="openshift-cluster-node-tuning-operator/tuned-8487p" Oct 14 13:08:46.858502 master-2 kubenswrapper[4762]: I1014 13:08:46.858444 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgr6n\" (UniqueName: \"kubernetes.io/projected/93952b70-8f99-4679-9d54-1a9143ed164b-kube-api-access-mgr6n\") pod \"tuned-8487p\" (UID: \"93952b70-8f99-4679-9d54-1a9143ed164b\") " pod="openshift-cluster-node-tuning-operator/tuned-8487p" Oct 14 13:08:46.858502 master-2 kubenswrapper[4762]: I1014 13:08:46.858452 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/93952b70-8f99-4679-9d54-1a9143ed164b-etc-sysconfig\") pod \"tuned-8487p\" (UID: \"93952b70-8f99-4679-9d54-1a9143ed164b\") " pod="openshift-cluster-node-tuning-operator/tuned-8487p" Oct 14 13:08:46.858502 master-2 kubenswrapper[4762]: I1014 13:08:46.858479 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/93952b70-8f99-4679-9d54-1a9143ed164b-etc-sysctl-d\") pod \"tuned-8487p\" (UID: \"93952b70-8f99-4679-9d54-1a9143ed164b\") " pod="openshift-cluster-node-tuning-operator/tuned-8487p" Oct 14 13:08:46.858502 master-2 kubenswrapper[4762]: I1014 13:08:46.858461 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/93952b70-8f99-4679-9d54-1a9143ed164b-etc-modprobe-d\") pod \"tuned-8487p\" (UID: \"93952b70-8f99-4679-9d54-1a9143ed164b\") " pod="openshift-cluster-node-tuning-operator/tuned-8487p" Oct 14 13:08:46.858502 master-2 kubenswrapper[4762]: I1014 13:08:46.858433 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/93952b70-8f99-4679-9d54-1a9143ed164b-host\") pod \"tuned-8487p\" (UID: \"93952b70-8f99-4679-9d54-1a9143ed164b\") " pod="openshift-cluster-node-tuning-operator/tuned-8487p" Oct 14 13:08:46.859616 master-2 kubenswrapper[4762]: I1014 13:08:46.858520 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/93952b70-8f99-4679-9d54-1a9143ed164b-var-lib-kubelet\") pod \"tuned-8487p\" (UID: \"93952b70-8f99-4679-9d54-1a9143ed164b\") " pod="openshift-cluster-node-tuning-operator/tuned-8487p" Oct 14 13:08:46.859616 master-2 kubenswrapper[4762]: I1014 13:08:46.858574 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/93952b70-8f99-4679-9d54-1a9143ed164b-run\") pod \"tuned-8487p\" (UID: \"93952b70-8f99-4679-9d54-1a9143ed164b\") " pod="openshift-cluster-node-tuning-operator/tuned-8487p" Oct 14 13:08:46.859616 master-2 kubenswrapper[4762]: I1014 13:08:46.858581 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/93952b70-8f99-4679-9d54-1a9143ed164b-etc-sysctl-conf\") pod \"tuned-8487p\" (UID: \"93952b70-8f99-4679-9d54-1a9143ed164b\") " pod="openshift-cluster-node-tuning-operator/tuned-8487p" Oct 14 13:08:46.859616 master-2 kubenswrapper[4762]: I1014 13:08:46.858617 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/93952b70-8f99-4679-9d54-1a9143ed164b-etc-systemd\") pod \"tuned-8487p\" (UID: \"93952b70-8f99-4679-9d54-1a9143ed164b\") " pod="openshift-cluster-node-tuning-operator/tuned-8487p" Oct 14 13:08:46.859616 master-2 kubenswrapper[4762]: I1014 13:08:46.858642 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/93952b70-8f99-4679-9d54-1a9143ed164b-etc-kubernetes\") pod \"tuned-8487p\" (UID: \"93952b70-8f99-4679-9d54-1a9143ed164b\") " pod="openshift-cluster-node-tuning-operator/tuned-8487p" Oct 14 13:08:46.859616 master-2 kubenswrapper[4762]: I1014 13:08:46.858646 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/93952b70-8f99-4679-9d54-1a9143ed164b-etc-sysctl-d\") pod \"tuned-8487p\" (UID: \"93952b70-8f99-4679-9d54-1a9143ed164b\") " pod="openshift-cluster-node-tuning-operator/tuned-8487p" Oct 14 13:08:46.859616 master-2 kubenswrapper[4762]: I1014 13:08:46.858741 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/93952b70-8f99-4679-9d54-1a9143ed164b-lib-modules\") pod \"tuned-8487p\" (UID: \"93952b70-8f99-4679-9d54-1a9143ed164b\") " pod="openshift-cluster-node-tuning-operator/tuned-8487p" Oct 14 13:08:46.862609 master-2 kubenswrapper[4762]: I1014 13:08:46.862527 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/93952b70-8f99-4679-9d54-1a9143ed164b-tmp\") pod \"tuned-8487p\" (UID: \"93952b70-8f99-4679-9d54-1a9143ed164b\") " pod="openshift-cluster-node-tuning-operator/tuned-8487p" Oct 14 13:08:46.863080 master-2 kubenswrapper[4762]: I1014 13:08:46.863008 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/93952b70-8f99-4679-9d54-1a9143ed164b-etc-tuned\") pod \"tuned-8487p\" (UID: 
\"93952b70-8f99-4679-9d54-1a9143ed164b\") " pod="openshift-cluster-node-tuning-operator/tuned-8487p" Oct 14 13:08:46.875499 master-2 kubenswrapper[4762]: I1014 13:08:46.875448 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgr6n\" (UniqueName: \"kubernetes.io/projected/93952b70-8f99-4679-9d54-1a9143ed164b-kube-api-access-mgr6n\") pod \"tuned-8487p\" (UID: \"93952b70-8f99-4679-9d54-1a9143ed164b\") " pod="openshift-cluster-node-tuning-operator/tuned-8487p" Oct 14 13:08:46.995708 master-2 kubenswrapper[4762]: I1014 13:08:46.995618 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-8487p" Oct 14 13:08:47.011970 master-2 kubenswrapper[4762]: W1014 13:08:47.011543 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93952b70_8f99_4679_9d54_1a9143ed164b.slice/crio-329b918d97e478e383b9de3b0ad1fa719424ab6b8b2c9158b72f2c7fc8b87dbe WatchSource:0}: Error finding container 329b918d97e478e383b9de3b0ad1fa719424ab6b8b2c9158b72f2c7fc8b87dbe: Status 404 returned error can't find the container with id 329b918d97e478e383b9de3b0ad1fa719424ab6b8b2c9158b72f2c7fc8b87dbe Oct 14 13:08:47.182390 master-2 kubenswrapper[4762]: I1014 13:08:47.182284 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-pbtld"] Oct 14 13:08:47.183341 master-2 kubenswrapper[4762]: I1014 13:08:47.183087 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-pbtld" Oct 14 13:08:47.186952 master-2 kubenswrapper[4762]: I1014 13:08:47.185804 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Oct 14 13:08:47.186952 master-2 kubenswrapper[4762]: I1014 13:08:47.185824 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 14 13:08:47.186952 master-2 kubenswrapper[4762]: I1014 13:08:47.185835 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 14 13:08:47.186952 master-2 kubenswrapper[4762]: I1014 13:08:47.186188 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Oct 14 13:08:47.191303 master-2 kubenswrapper[4762]: I1014 13:08:47.189543 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-pbtld"] Oct 14 13:08:47.258579 master-2 kubenswrapper[4762]: I1014 13:08:47.258477 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-8487p" event={"ID":"93952b70-8f99-4679-9d54-1a9143ed164b","Type":"ContainerStarted","Data":"329b918d97e478e383b9de3b0ad1fa719424ab6b8b2c9158b72f2c7fc8b87dbe"} Oct 14 13:08:47.260053 master-2 kubenswrapper[4762]: I1014 13:08:47.259927 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-55bd67947c-872k9" event={"ID":"d0bf2b14-2719-4b1b-a661-fbf4d27c05dc","Type":"ContainerStarted","Data":"16acbeb82f9015c059054334a95bd3834ca52dd570c1b685a76f29158f7c167a"} Oct 14 13:08:47.262721 master-2 kubenswrapper[4762]: I1014 13:08:47.262644 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9kr7\" (UniqueName: \"kubernetes.io/projected/1a953a2b-9bc4-485a-9daf-f6e9b84d493a-kube-api-access-r9kr7\") pod \"dns-default-pbtld\" (UID: 
\"1a953a2b-9bc4-485a-9daf-f6e9b84d493a\") " pod="openshift-dns/dns-default-pbtld" Oct 14 13:08:47.263007 master-2 kubenswrapper[4762]: I1014 13:08:47.262749 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1a953a2b-9bc4-485a-9daf-f6e9b84d493a-config-volume\") pod \"dns-default-pbtld\" (UID: \"1a953a2b-9bc4-485a-9daf-f6e9b84d493a\") " pod="openshift-dns/dns-default-pbtld" Oct 14 13:08:47.263007 master-2 kubenswrapper[4762]: I1014 13:08:47.262792 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1a953a2b-9bc4-485a-9daf-f6e9b84d493a-metrics-tls\") pod \"dns-default-pbtld\" (UID: \"1a953a2b-9bc4-485a-9daf-f6e9b84d493a\") " pod="openshift-dns/dns-default-pbtld" Oct 14 13:08:47.364212 master-2 kubenswrapper[4762]: I1014 13:08:47.364115 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1a953a2b-9bc4-485a-9daf-f6e9b84d493a-metrics-tls\") pod \"dns-default-pbtld\" (UID: \"1a953a2b-9bc4-485a-9daf-f6e9b84d493a\") " pod="openshift-dns/dns-default-pbtld" Oct 14 13:08:47.364387 master-2 kubenswrapper[4762]: I1014 13:08:47.364228 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9kr7\" (UniqueName: \"kubernetes.io/projected/1a953a2b-9bc4-485a-9daf-f6e9b84d493a-kube-api-access-r9kr7\") pod \"dns-default-pbtld\" (UID: \"1a953a2b-9bc4-485a-9daf-f6e9b84d493a\") " pod="openshift-dns/dns-default-pbtld" Oct 14 13:08:47.364387 master-2 kubenswrapper[4762]: I1014 13:08:47.364337 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1a953a2b-9bc4-485a-9daf-f6e9b84d493a-config-volume\") pod \"dns-default-pbtld\" (UID: \"1a953a2b-9bc4-485a-9daf-f6e9b84d493a\") " pod="openshift-dns/dns-default-pbtld" Oct 14 13:08:47.365310 master-2 kubenswrapper[4762]: I1014 13:08:47.365281 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1a953a2b-9bc4-485a-9daf-f6e9b84d493a-config-volume\") pod \"dns-default-pbtld\" (UID: \"1a953a2b-9bc4-485a-9daf-f6e9b84d493a\") " pod="openshift-dns/dns-default-pbtld" Oct 14 13:08:47.369205 master-2 kubenswrapper[4762]: I1014 13:08:47.369169 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1a953a2b-9bc4-485a-9daf-f6e9b84d493a-metrics-tls\") pod \"dns-default-pbtld\" (UID: \"1a953a2b-9bc4-485a-9daf-f6e9b84d493a\") " pod="openshift-dns/dns-default-pbtld" Oct 14 13:08:47.387090 master-2 kubenswrapper[4762]: I1014 13:08:47.387051 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9kr7\" (UniqueName: \"kubernetes.io/projected/1a953a2b-9bc4-485a-9daf-f6e9b84d493a-kube-api-access-r9kr7\") pod \"dns-default-pbtld\" (UID: \"1a953a2b-9bc4-485a-9daf-f6e9b84d493a\") " pod="openshift-dns/dns-default-pbtld" Oct 14 13:08:47.500874 master-2 kubenswrapper[4762]: I1014 13:08:47.500750 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-6rrjr"] Oct 14 13:08:47.501483 master-2 kubenswrapper[4762]: I1014 13:08:47.501452 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-6rrjr" Oct 14 13:08:47.509563 master-2 kubenswrapper[4762]: I1014 13:08:47.509533 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-pbtld" Oct 14 13:08:47.567238 master-2 kubenswrapper[4762]: I1014 13:08:47.567149 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d4509bb5-afbe-43d9-bfe9-32cd7f55257d-hosts-file\") pod \"node-resolver-6rrjr\" (UID: \"d4509bb5-afbe-43d9-bfe9-32cd7f55257d\") " pod="openshift-dns/node-resolver-6rrjr" Oct 14 13:08:47.567843 master-2 kubenswrapper[4762]: I1014 13:08:47.567383 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbjjs\" (UniqueName: \"kubernetes.io/projected/d4509bb5-afbe-43d9-bfe9-32cd7f55257d-kube-api-access-cbjjs\") pod \"node-resolver-6rrjr\" (UID: \"d4509bb5-afbe-43d9-bfe9-32cd7f55257d\") " pod="openshift-dns/node-resolver-6rrjr" Oct 14 13:08:47.668899 master-2 kubenswrapper[4762]: I1014 13:08:47.668850 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbjjs\" (UniqueName: \"kubernetes.io/projected/d4509bb5-afbe-43d9-bfe9-32cd7f55257d-kube-api-access-cbjjs\") pod \"node-resolver-6rrjr\" (UID: \"d4509bb5-afbe-43d9-bfe9-32cd7f55257d\") " pod="openshift-dns/node-resolver-6rrjr" Oct 14 13:08:47.669112 master-2 kubenswrapper[4762]: I1014 13:08:47.668941 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d4509bb5-afbe-43d9-bfe9-32cd7f55257d-hosts-file\") pod \"node-resolver-6rrjr\" (UID: \"d4509bb5-afbe-43d9-bfe9-32cd7f55257d\") " pod="openshift-dns/node-resolver-6rrjr" Oct 14 13:08:47.669184 master-2 kubenswrapper[4762]: I1014 13:08:47.669128 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d4509bb5-afbe-43d9-bfe9-32cd7f55257d-hosts-file\") pod \"node-resolver-6rrjr\" (UID: \"d4509bb5-afbe-43d9-bfe9-32cd7f55257d\") " pod="openshift-dns/node-resolver-6rrjr" Oct 14 13:08:47.685189 master-2 kubenswrapper[4762]: I1014 13:08:47.685130 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbjjs\" (UniqueName: \"kubernetes.io/projected/d4509bb5-afbe-43d9-bfe9-32cd7f55257d-kube-api-access-cbjjs\") pod \"node-resolver-6rrjr\" (UID: \"d4509bb5-afbe-43d9-bfe9-32cd7f55257d\") " pod="openshift-dns/node-resolver-6rrjr" Oct 14 13:08:47.709842 master-2 kubenswrapper[4762]: I1014 13:08:47.709698 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-pbtld"] Oct 14 13:08:47.720819 master-2 kubenswrapper[4762]: W1014 13:08:47.720779 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a953a2b_9bc4_485a_9daf_f6e9b84d493a.slice/crio-b2040bd5a2e7f841ff33f9c28f3ff66dc5166a24d0b979538f26537ea5192ab8 WatchSource:0}: Error finding container b2040bd5a2e7f841ff33f9c28f3ff66dc5166a24d0b979538f26537ea5192ab8: Status 404 returned error can't find the container with id b2040bd5a2e7f841ff33f9c28f3ff66dc5166a24d0b979538f26537ea5192ab8 Oct 14 13:08:47.814454 master-2 kubenswrapper[4762]: I1014 13:08:47.814317 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-6rrjr" Oct 14 13:08:48.267111 master-2 kubenswrapper[4762]: I1014 13:08:48.267000 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-6rrjr" event={"ID":"d4509bb5-afbe-43d9-bfe9-32cd7f55257d","Type":"ContainerStarted","Data":"3ca3248c52f282c5de9ecf4c468f72721a28262fee6e63acf23a30715460cb32"} Oct 14 13:08:48.267409 master-2 kubenswrapper[4762]: I1014 13:08:48.267117 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-6rrjr" event={"ID":"d4509bb5-afbe-43d9-bfe9-32cd7f55257d","Type":"ContainerStarted","Data":"0dc63372b57e44590d669cf9625783f4213e4b67e8c942752658a5627e223122"} Oct 14 13:08:48.268648 master-2 kubenswrapper[4762]: I1014 13:08:48.268593 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pbtld" event={"ID":"1a953a2b-9bc4-485a-9daf-f6e9b84d493a","Type":"ContainerStarted","Data":"b2040bd5a2e7f841ff33f9c28f3ff66dc5166a24d0b979538f26537ea5192ab8"} Oct 14 13:08:48.280969 master-2 kubenswrapper[4762]: I1014 13:08:48.280876 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-6rrjr" podStartSLOduration=1.280862889 podStartE2EDuration="1.280862889s" podCreationTimestamp="2025-10-14 13:08:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:08:48.280795097 +0000 UTC m=+157.524954326" watchObservedRunningTime="2025-10-14 13:08:48.280862889 +0000 UTC m=+157.525022058" Oct 14 13:08:48.648684 master-2 kubenswrapper[4762]: I1014 13:08:48.647406 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-c57444595-mj7cx"] Oct 14 13:08:48.648684 master-2 kubenswrapper[4762]: I1014 13:08:48.648073 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-c57444595-mj7cx" Oct 14 13:08:48.651011 master-2 kubenswrapper[4762]: I1014 13:08:48.650944 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 14 13:08:48.652653 master-2 kubenswrapper[4762]: I1014 13:08:48.652373 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 14 13:08:48.652653 master-2 kubenswrapper[4762]: I1014 13:08:48.652651 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 14 13:08:48.652886 master-2 kubenswrapper[4762]: I1014 13:08:48.652816 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 14 13:08:48.652886 master-2 kubenswrapper[4762]: I1014 13:08:48.652844 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 14 13:08:48.653283 master-2 kubenswrapper[4762]: I1014 13:08:48.653234 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 14 13:08:48.655889 master-2 kubenswrapper[4762]: I1014 13:08:48.655868 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 14 13:08:48.656070 master-2 kubenswrapper[4762]: I1014 13:08:48.655872 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 14 13:08:48.658466 master-2 kubenswrapper[4762]: I1014 13:08:48.658412 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-c57444595-mj7cx"] Oct 14 13:08:48.782138 master-2 kubenswrapper[4762]: I1014 13:08:48.782088 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b69dba3-5ac1-4eb9-bba6-0d0662ab8544-serving-cert\") pod \"apiserver-c57444595-mj7cx\" (UID: \"2b69dba3-5ac1-4eb9-bba6-0d0662ab8544\") " pod="openshift-oauth-apiserver/apiserver-c57444595-mj7cx" Oct 14 13:08:48.782443 master-2 kubenswrapper[4762]: I1014 13:08:48.782415 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/2b69dba3-5ac1-4eb9-bba6-0d0662ab8544-encryption-config\") pod \"apiserver-c57444595-mj7cx\" (UID: \"2b69dba3-5ac1-4eb9-bba6-0d0662ab8544\") " pod="openshift-oauth-apiserver/apiserver-c57444595-mj7cx" Oct 14 13:08:48.782528 master-2 kubenswrapper[4762]: I1014 13:08:48.782516 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/2b69dba3-5ac1-4eb9-bba6-0d0662ab8544-etcd-serving-ca\") pod \"apiserver-c57444595-mj7cx\" (UID: \"2b69dba3-5ac1-4eb9-bba6-0d0662ab8544\") " pod="openshift-oauth-apiserver/apiserver-c57444595-mj7cx" Oct 14 13:08:48.782605 master-2 kubenswrapper[4762]: I1014 13:08:48.782593 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b69dba3-5ac1-4eb9-bba6-0d0662ab8544-trusted-ca-bundle\") pod \"apiserver-c57444595-mj7cx\" (UID: \"2b69dba3-5ac1-4eb9-bba6-0d0662ab8544\") " pod="openshift-oauth-apiserver/apiserver-c57444595-mj7cx" Oct 14 13:08:48.782896 master-2 
kubenswrapper[4762]: I1014 13:08:48.782823 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2b69dba3-5ac1-4eb9-bba6-0d0662ab8544-audit-policies\") pod \"apiserver-c57444595-mj7cx\" (UID: \"2b69dba3-5ac1-4eb9-bba6-0d0662ab8544\") " pod="openshift-oauth-apiserver/apiserver-c57444595-mj7cx" Oct 14 13:08:48.782988 master-2 kubenswrapper[4762]: I1014 13:08:48.782947 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2ntc\" (UniqueName: \"kubernetes.io/projected/2b69dba3-5ac1-4eb9-bba6-0d0662ab8544-kube-api-access-j2ntc\") pod \"apiserver-c57444595-mj7cx\" (UID: \"2b69dba3-5ac1-4eb9-bba6-0d0662ab8544\") " pod="openshift-oauth-apiserver/apiserver-c57444595-mj7cx" Oct 14 13:08:48.783037 master-2 kubenswrapper[4762]: I1014 13:08:48.783004 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2b69dba3-5ac1-4eb9-bba6-0d0662ab8544-etcd-client\") pod \"apiserver-c57444595-mj7cx\" (UID: \"2b69dba3-5ac1-4eb9-bba6-0d0662ab8544\") " pod="openshift-oauth-apiserver/apiserver-c57444595-mj7cx" Oct 14 13:08:48.783087 master-2 kubenswrapper[4762]: I1014 13:08:48.783061 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2b69dba3-5ac1-4eb9-bba6-0d0662ab8544-audit-dir\") pod \"apiserver-c57444595-mj7cx\" (UID: \"2b69dba3-5ac1-4eb9-bba6-0d0662ab8544\") " pod="openshift-oauth-apiserver/apiserver-c57444595-mj7cx" Oct 14 13:08:48.884474 master-2 kubenswrapper[4762]: I1014 13:08:48.884421 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b69dba3-5ac1-4eb9-bba6-0d0662ab8544-serving-cert\") pod \"apiserver-c57444595-mj7cx\" (UID: \"2b69dba3-5ac1-4eb9-bba6-0d0662ab8544\") " pod="openshift-oauth-apiserver/apiserver-c57444595-mj7cx" Oct 14 13:08:48.884572 master-2 kubenswrapper[4762]: I1014 13:08:48.884474 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/2b69dba3-5ac1-4eb9-bba6-0d0662ab8544-encryption-config\") pod \"apiserver-c57444595-mj7cx\" (UID: \"2b69dba3-5ac1-4eb9-bba6-0d0662ab8544\") " pod="openshift-oauth-apiserver/apiserver-c57444595-mj7cx" Oct 14 13:08:48.884572 master-2 kubenswrapper[4762]: I1014 13:08:48.884519 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/2b69dba3-5ac1-4eb9-bba6-0d0662ab8544-etcd-serving-ca\") pod \"apiserver-c57444595-mj7cx\" (UID: \"2b69dba3-5ac1-4eb9-bba6-0d0662ab8544\") " pod="openshift-oauth-apiserver/apiserver-c57444595-mj7cx" Oct 14 13:08:48.884572 master-2 kubenswrapper[4762]: I1014 13:08:48.884548 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b69dba3-5ac1-4eb9-bba6-0d0662ab8544-trusted-ca-bundle\") pod \"apiserver-c57444595-mj7cx\" (UID: \"2b69dba3-5ac1-4eb9-bba6-0d0662ab8544\") " pod="openshift-oauth-apiserver/apiserver-c57444595-mj7cx" Oct 14 13:08:48.884689 master-2 kubenswrapper[4762]: I1014 13:08:48.884607 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/2b69dba3-5ac1-4eb9-bba6-0d0662ab8544-audit-policies\") pod \"apiserver-c57444595-mj7cx\" (UID: \"2b69dba3-5ac1-4eb9-bba6-0d0662ab8544\") " pod="openshift-oauth-apiserver/apiserver-c57444595-mj7cx" Oct 14 13:08:48.884689 master-2 kubenswrapper[4762]: I1014 13:08:48.884651 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2ntc\" (UniqueName: \"kubernetes.io/projected/2b69dba3-5ac1-4eb9-bba6-0d0662ab8544-kube-api-access-j2ntc\") pod \"apiserver-c57444595-mj7cx\" (UID: \"2b69dba3-5ac1-4eb9-bba6-0d0662ab8544\") " pod="openshift-oauth-apiserver/apiserver-c57444595-mj7cx" Oct 14 13:08:48.884689 master-2 kubenswrapper[4762]: I1014 13:08:48.884675 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2b69dba3-5ac1-4eb9-bba6-0d0662ab8544-etcd-client\") pod \"apiserver-c57444595-mj7cx\" (UID: \"2b69dba3-5ac1-4eb9-bba6-0d0662ab8544\") " pod="openshift-oauth-apiserver/apiserver-c57444595-mj7cx" Oct 14 13:08:48.884803 master-2 kubenswrapper[4762]: I1014 13:08:48.884708 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2b69dba3-5ac1-4eb9-bba6-0d0662ab8544-audit-dir\") pod \"apiserver-c57444595-mj7cx\" (UID: \"2b69dba3-5ac1-4eb9-bba6-0d0662ab8544\") " pod="openshift-oauth-apiserver/apiserver-c57444595-mj7cx" Oct 14 13:08:48.884803 master-2 kubenswrapper[4762]: I1014 13:08:48.884781 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2b69dba3-5ac1-4eb9-bba6-0d0662ab8544-audit-dir\") pod \"apiserver-c57444595-mj7cx\" (UID: \"2b69dba3-5ac1-4eb9-bba6-0d0662ab8544\") " pod="openshift-oauth-apiserver/apiserver-c57444595-mj7cx" Oct 14 13:08:48.886036 master-2 kubenswrapper[4762]: I1014 13:08:48.885966 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b69dba3-5ac1-4eb9-bba6-0d0662ab8544-trusted-ca-bundle\") pod \"apiserver-c57444595-mj7cx\" (UID: \"2b69dba3-5ac1-4eb9-bba6-0d0662ab8544\") " pod="openshift-oauth-apiserver/apiserver-c57444595-mj7cx" Oct 14 13:08:48.886036 master-2 kubenswrapper[4762]: I1014 13:08:48.885975 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/2b69dba3-5ac1-4eb9-bba6-0d0662ab8544-etcd-serving-ca\") pod \"apiserver-c57444595-mj7cx\" (UID: \"2b69dba3-5ac1-4eb9-bba6-0d0662ab8544\") " pod="openshift-oauth-apiserver/apiserver-c57444595-mj7cx" Oct 14 13:08:48.886707 master-2 kubenswrapper[4762]: I1014 13:08:48.886660 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2b69dba3-5ac1-4eb9-bba6-0d0662ab8544-audit-policies\") pod \"apiserver-c57444595-mj7cx\" (UID: \"2b69dba3-5ac1-4eb9-bba6-0d0662ab8544\") " pod="openshift-oauth-apiserver/apiserver-c57444595-mj7cx" Oct 14 13:08:48.890414 master-2 kubenswrapper[4762]: I1014 13:08:48.890371 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/2b69dba3-5ac1-4eb9-bba6-0d0662ab8544-encryption-config\") pod \"apiserver-c57444595-mj7cx\" (UID: \"2b69dba3-5ac1-4eb9-bba6-0d0662ab8544\") " pod="openshift-oauth-apiserver/apiserver-c57444595-mj7cx" Oct 14 13:08:48.890582 master-2 kubenswrapper[4762]: I1014 13:08:48.890423 4762 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2b69dba3-5ac1-4eb9-bba6-0d0662ab8544-etcd-client\") pod \"apiserver-c57444595-mj7cx\" (UID: \"2b69dba3-5ac1-4eb9-bba6-0d0662ab8544\") " pod="openshift-oauth-apiserver/apiserver-c57444595-mj7cx" Oct 14 13:08:48.890708 master-2 kubenswrapper[4762]: I1014 13:08:48.890643 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b69dba3-5ac1-4eb9-bba6-0d0662ab8544-serving-cert\") pod \"apiserver-c57444595-mj7cx\" (UID: \"2b69dba3-5ac1-4eb9-bba6-0d0662ab8544\") " pod="openshift-oauth-apiserver/apiserver-c57444595-mj7cx" Oct 14 13:08:48.906481 master-2 kubenswrapper[4762]: I1014 13:08:48.906345 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2ntc\" (UniqueName: \"kubernetes.io/projected/2b69dba3-5ac1-4eb9-bba6-0d0662ab8544-kube-api-access-j2ntc\") pod \"apiserver-c57444595-mj7cx\" (UID: \"2b69dba3-5ac1-4eb9-bba6-0d0662ab8544\") " pod="openshift-oauth-apiserver/apiserver-c57444595-mj7cx" Oct 14 13:08:48.961562 master-2 kubenswrapper[4762]: I1014 13:08:48.961499 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-c57444595-mj7cx" Oct 14 13:08:49.087762 master-2 kubenswrapper[4762]: I1014 13:08:49.087637 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e3b0f97c-92e0-43c7-a72a-c003f0451347-client-ca\") pod \"controller-manager-55bcd8787f-4krnt\" (UID: \"e3b0f97c-92e0-43c7-a72a-c003f0451347\") " pod="openshift-controller-manager/controller-manager-55bcd8787f-4krnt" Oct 14 13:08:49.088057 master-2 kubenswrapper[4762]: E1014 13:08:49.087806 4762 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Oct 14 13:08:49.088057 master-2 kubenswrapper[4762]: E1014 13:08:49.087856 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e3b0f97c-92e0-43c7-a72a-c003f0451347-client-ca podName:e3b0f97c-92e0-43c7-a72a-c003f0451347 nodeName:}" failed. No retries permitted until 2025-10-14 13:09:05.08784161 +0000 UTC m=+174.332000769 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/e3b0f97c-92e0-43c7-a72a-c003f0451347-client-ca") pod "controller-manager-55bcd8787f-4krnt" (UID: "e3b0f97c-92e0-43c7-a72a-c003f0451347") : configmap "client-ca" not found Oct 14 13:08:49.144425 master-2 kubenswrapper[4762]: I1014 13:08:49.144347 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-c57444595-mj7cx"] Oct 14 13:08:49.150337 master-2 kubenswrapper[4762]: W1014 13:08:49.150279 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b69dba3_5ac1_4eb9_bba6_0d0662ab8544.slice/crio-5182c1b2649ee6beacd2c41c0c7f4de643bea40e0a1caaf204ed40774f6e60be WatchSource:0}: Error finding container 5182c1b2649ee6beacd2c41c0c7f4de643bea40e0a1caaf204ed40774f6e60be: Status 404 returned error can't find the container with id 5182c1b2649ee6beacd2c41c0c7f4de643bea40e0a1caaf204ed40774f6e60be Oct 14 13:08:49.281879 master-2 kubenswrapper[4762]: I1014 13:08:49.281811 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-55bd67947c-872k9" event={"ID":"d0bf2b14-2719-4b1b-a661-fbf4d27c05dc","Type":"ContainerStarted","Data":"0b0c5a4398f21cb8ca81b7c8e6d7c6ee0519b572aca495ea02f72d32d1d44749"} Oct 14 13:08:49.282978 master-2 kubenswrapper[4762]: I1014 13:08:49.282933 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-c57444595-mj7cx" event={"ID":"2b69dba3-5ac1-4eb9-bba6-0d0662ab8544","Type":"ContainerStarted","Data":"5182c1b2649ee6beacd2c41c0c7f4de643bea40e0a1caaf204ed40774f6e60be"} Oct 14 13:08:49.297885 master-2 kubenswrapper[4762]: I1014 13:08:49.297801 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-55bd67947c-872k9" podStartSLOduration=140.043465314 podStartE2EDuration="2m22.297781039s" podCreationTimestamp="2025-10-14 13:06:27 +0000 UTC" firstStartedPulling="2025-10-14 13:08:46.574147981 +0000 UTC m=+155.818307180" lastFinishedPulling="2025-10-14 13:08:48.828463726 +0000 UTC m=+158.072622905" observedRunningTime="2025-10-14 13:08:49.297092507 +0000 UTC m=+158.541251676" watchObservedRunningTime="2025-10-14 13:08:49.297781039 +0000 UTC m=+158.541940198" Oct 14 13:08:50.980358 master-2 kubenswrapper[4762]: I1014 13:08:50.980112 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-75f9c7d795-v2gmx"] Oct 14 13:08:50.980997 master-2 kubenswrapper[4762]: I1014 13:08:50.980720 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-75f9c7d795-v2gmx" Oct 14 13:08:50.983827 master-2 kubenswrapper[4762]: I1014 13:08:50.983802 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Oct 14 13:08:50.983980 master-2 kubenswrapper[4762]: I1014 13:08:50.983960 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Oct 14 13:08:50.984161 master-2 kubenswrapper[4762]: I1014 13:08:50.984131 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Oct 14 13:08:50.988806 master-2 kubenswrapper[4762]: I1014 13:08:50.988756 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-75f9c7d795-v2gmx"] Oct 14 13:08:51.106783 master-2 kubenswrapper[4762]: I1014 13:08:51.106664 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xr65k\" (UniqueName: \"kubernetes.io/projected/b9eaec93-976f-44ad-b911-2313edf00168-kube-api-access-xr65k\") pod \"cluster-samples-operator-75f9c7d795-v2gmx\" (UID: \"b9eaec93-976f-44ad-b911-2313edf00168\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-75f9c7d795-v2gmx" Oct 14 13:08:51.106783 master-2 kubenswrapper[4762]: I1014 13:08:51.106724 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b9eaec93-976f-44ad-b911-2313edf00168-samples-operator-tls\") pod \"cluster-samples-operator-75f9c7d795-v2gmx\" (UID: \"b9eaec93-976f-44ad-b911-2313edf00168\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-75f9c7d795-v2gmx" Oct 14 13:08:51.166846 master-2 kubenswrapper[4762]: I1014 13:08:51.166781 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f89f9db8c-dx7pm"] Oct 14 13:08:51.167141 master-2 kubenswrapper[4762]: E1014 13:08:51.167103 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-route-controller-manager/route-controller-manager-7f89f9db8c-dx7pm" podUID="2c2148c9-9f2d-420c-9a2e-6433f4885d5d" Oct 14 13:08:51.207691 master-2 kubenswrapper[4762]: I1014 13:08:51.207604 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xr65k\" (UniqueName: \"kubernetes.io/projected/b9eaec93-976f-44ad-b911-2313edf00168-kube-api-access-xr65k\") pod \"cluster-samples-operator-75f9c7d795-v2gmx\" (UID: \"b9eaec93-976f-44ad-b911-2313edf00168\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-75f9c7d795-v2gmx" Oct 14 13:08:51.207861 master-2 kubenswrapper[4762]: I1014 13:08:51.207695 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b9eaec93-976f-44ad-b911-2313edf00168-samples-operator-tls\") pod \"cluster-samples-operator-75f9c7d795-v2gmx\" (UID: \"b9eaec93-976f-44ad-b911-2313edf00168\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-75f9c7d795-v2gmx" Oct 14 13:08:51.214632 master-2 kubenswrapper[4762]: I1014 13:08:51.214576 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b9eaec93-976f-44ad-b911-2313edf00168-samples-operator-tls\") pod \"cluster-samples-operator-75f9c7d795-v2gmx\" (UID: \"b9eaec93-976f-44ad-b911-2313edf00168\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-75f9c7d795-v2gmx" Oct 14 13:08:51.232034 master-2 kubenswrapper[4762]: I1014 13:08:51.231943 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xr65k\" (UniqueName: \"kubernetes.io/projected/b9eaec93-976f-44ad-b911-2313edf00168-kube-api-access-xr65k\") pod \"cluster-samples-operator-75f9c7d795-v2gmx\" (UID: \"b9eaec93-976f-44ad-b911-2313edf00168\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-75f9c7d795-v2gmx" Oct 14 13:08:51.290437 master-2 kubenswrapper[4762]: I1014 13:08:51.290394 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f89f9db8c-dx7pm" Oct 14 13:08:51.298953 master-2 kubenswrapper[4762]: I1014 13:08:51.298921 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f89f9db8c-dx7pm" Oct 14 13:08:51.300212 master-2 kubenswrapper[4762]: I1014 13:08:51.300184 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-75f9c7d795-v2gmx" Oct 14 13:08:51.409975 master-2 kubenswrapper[4762]: I1014 13:08:51.409909 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c2148c9-9f2d-420c-9a2e-6433f4885d5d-serving-cert\") pod \"2c2148c9-9f2d-420c-9a2e-6433f4885d5d\" (UID: \"2c2148c9-9f2d-420c-9a2e-6433f4885d5d\") " Oct 14 13:08:51.410193 master-2 kubenswrapper[4762]: I1014 13:08:51.409993 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c2148c9-9f2d-420c-9a2e-6433f4885d5d-config\") pod \"2c2148c9-9f2d-420c-9a2e-6433f4885d5d\" (UID: \"2c2148c9-9f2d-420c-9a2e-6433f4885d5d\") " Oct 14 13:08:51.410193 master-2 kubenswrapper[4762]: I1014 13:08:51.410037 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvntw\" (UniqueName: \"kubernetes.io/projected/2c2148c9-9f2d-420c-9a2e-6433f4885d5d-kube-api-access-wvntw\") pod \"2c2148c9-9f2d-420c-9a2e-6433f4885d5d\" (UID: \"2c2148c9-9f2d-420c-9a2e-6433f4885d5d\") " Oct 14 13:08:51.410982 master-2 kubenswrapper[4762]: I1014 13:08:51.410899 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c2148c9-9f2d-420c-9a2e-6433f4885d5d-config" (OuterVolumeSpecName: "config") pod "2c2148c9-9f2d-420c-9a2e-6433f4885d5d" (UID: "2c2148c9-9f2d-420c-9a2e-6433f4885d5d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:08:51.412563 master-2 kubenswrapper[4762]: I1014 13:08:51.412516 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c2148c9-9f2d-420c-9a2e-6433f4885d5d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2c2148c9-9f2d-420c-9a2e-6433f4885d5d" (UID: "2c2148c9-9f2d-420c-9a2e-6433f4885d5d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:08:51.414075 master-2 kubenswrapper[4762]: I1014 13:08:51.414017 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c2148c9-9f2d-420c-9a2e-6433f4885d5d-kube-api-access-wvntw" (OuterVolumeSpecName: "kube-api-access-wvntw") pod "2c2148c9-9f2d-420c-9a2e-6433f4885d5d" (UID: "2c2148c9-9f2d-420c-9a2e-6433f4885d5d"). InnerVolumeSpecName "kube-api-access-wvntw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:08:51.511964 master-2 kubenswrapper[4762]: I1014 13:08:51.511730 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c2148c9-9f2d-420c-9a2e-6433f4885d5d-config\") on node \"master-2\" DevicePath \"\"" Oct 14 13:08:51.511964 master-2 kubenswrapper[4762]: I1014 13:08:51.511786 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvntw\" (UniqueName: \"kubernetes.io/projected/2c2148c9-9f2d-420c-9a2e-6433f4885d5d-kube-api-access-wvntw\") on node \"master-2\" DevicePath \"\"" Oct 14 13:08:51.511964 master-2 kubenswrapper[4762]: I1014 13:08:51.511798 4762 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c2148c9-9f2d-420c-9a2e-6433f4885d5d-serving-cert\") on node \"master-2\" DevicePath \"\"" Oct 14 13:08:52.295371 master-2 kubenswrapper[4762]: I1014 13:08:52.294263 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f89f9db8c-dx7pm" Oct 14 13:08:52.334570 master-2 kubenswrapper[4762]: I1014 13:08:52.334434 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f6c689d49-xd4xv"] Oct 14 13:08:52.334951 master-2 kubenswrapper[4762]: I1014 13:08:52.334925 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f6c689d49-xd4xv" Oct 14 13:08:52.338141 master-2 kubenswrapper[4762]: I1014 13:08:52.337772 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 14 13:08:52.339697 master-2 kubenswrapper[4762]: I1014 13:08:52.339640 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 14 13:08:52.340088 master-2 kubenswrapper[4762]: I1014 13:08:52.340062 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 14 13:08:52.340221 master-2 kubenswrapper[4762]: I1014 13:08:52.340198 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 14 13:08:52.340268 master-2 kubenswrapper[4762]: I1014 13:08:52.340237 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f89f9db8c-dx7pm"] Oct 14 13:08:52.340726 master-2 kubenswrapper[4762]: I1014 13:08:52.340642 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 14 13:08:52.348294 master-2 kubenswrapper[4762]: I1014 13:08:52.348245 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f89f9db8c-dx7pm"] Oct 14 13:08:52.358687 master-2 kubenswrapper[4762]: I1014 13:08:52.358631 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f6c689d49-xd4xv"] Oct 14 13:08:52.380468 master-2 kubenswrapper[4762]: I1014 13:08:52.380421 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-75f9c7d795-v2gmx"] Oct 14 13:08:52.426203 master-2 kubenswrapper[4762]: I1014 13:08:52.423579 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7rh6\" (UniqueName: \"kubernetes.io/projected/db8f34cd-ecf2-4682-b1fd-b8e335369cb9-kube-api-access-w7rh6\") pod \"route-controller-manager-6f6c689d49-xd4xv\" (UID: \"db8f34cd-ecf2-4682-b1fd-b8e335369cb9\") " pod="openshift-route-controller-manager/route-controller-manager-6f6c689d49-xd4xv" Oct 14 13:08:52.426203 master-2 kubenswrapper[4762]: I1014 13:08:52.423647 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db8f34cd-ecf2-4682-b1fd-b8e335369cb9-config\") pod \"route-controller-manager-6f6c689d49-xd4xv\" (UID: \"db8f34cd-ecf2-4682-b1fd-b8e335369cb9\") " pod="openshift-route-controller-manager/route-controller-manager-6f6c689d49-xd4xv" Oct 14 13:08:52.426203 master-2 kubenswrapper[4762]: I1014 13:08:52.423709 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db8f34cd-ecf2-4682-b1fd-b8e335369cb9-serving-cert\") pod \"route-controller-manager-6f6c689d49-xd4xv\" (UID: \"db8f34cd-ecf2-4682-b1fd-b8e335369cb9\") " pod="openshift-route-controller-manager/route-controller-manager-6f6c689d49-xd4xv" Oct 14 13:08:52.426203 master-2 kubenswrapper[4762]: I1014 13:08:52.423740 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/db8f34cd-ecf2-4682-b1fd-b8e335369cb9-client-ca\") pod \"route-controller-manager-6f6c689d49-xd4xv\" (UID: \"db8f34cd-ecf2-4682-b1fd-b8e335369cb9\") " pod="openshift-route-controller-manager/route-controller-manager-6f6c689d49-xd4xv" Oct 14 13:08:52.524525 master-2 kubenswrapper[4762]: I1014 13:08:52.524474 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db8f34cd-ecf2-4682-b1fd-b8e335369cb9-serving-cert\") pod \"route-controller-manager-6f6c689d49-xd4xv\" (UID: \"db8f34cd-ecf2-4682-b1fd-b8e335369cb9\") " pod="openshift-route-controller-manager/route-controller-manager-6f6c689d49-xd4xv" Oct 14 13:08:52.524525 master-2 kubenswrapper[4762]: I1014 13:08:52.524533 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/db8f34cd-ecf2-4682-b1fd-b8e335369cb9-client-ca\") pod \"route-controller-manager-6f6c689d49-xd4xv\" (UID: \"db8f34cd-ecf2-4682-b1fd-b8e335369cb9\") " pod="openshift-route-controller-manager/route-controller-manager-6f6c689d49-xd4xv" Oct 14 13:08:52.524744 master-2 kubenswrapper[4762]: I1014 13:08:52.524562 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7rh6\" (UniqueName: \"kubernetes.io/projected/db8f34cd-ecf2-4682-b1fd-b8e335369cb9-kube-api-access-w7rh6\") pod \"route-controller-manager-6f6c689d49-xd4xv\" (UID: \"db8f34cd-ecf2-4682-b1fd-b8e335369cb9\") " pod="openshift-route-controller-manager/route-controller-manager-6f6c689d49-xd4xv" Oct 14 13:08:52.524744 master-2 kubenswrapper[4762]: I1014 13:08:52.524588 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db8f34cd-ecf2-4682-b1fd-b8e335369cb9-config\") pod \"route-controller-manager-6f6c689d49-xd4xv\" (UID: \"db8f34cd-ecf2-4682-b1fd-b8e335369cb9\") " pod="openshift-route-controller-manager/route-controller-manager-6f6c689d49-xd4xv" Oct 14 13:08:52.524744 master-2 kubenswrapper[4762]: I1014 13:08:52.524633 4762 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c2148c9-9f2d-420c-9a2e-6433f4885d5d-client-ca\") on node \"master-2\" DevicePath \"\"" Oct 14 13:08:52.524744 master-2 kubenswrapper[4762]: E1014 13:08:52.524671 4762 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Oct 14 13:08:52.524744 master-2 kubenswrapper[4762]: E1014 13:08:52.524734 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/db8f34cd-ecf2-4682-b1fd-b8e335369cb9-client-ca podName:db8f34cd-ecf2-4682-b1fd-b8e335369cb9 nodeName:}" failed. No retries permitted until 2025-10-14 13:08:53.024716592 +0000 UTC m=+162.268875751 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/db8f34cd-ecf2-4682-b1fd-b8e335369cb9-client-ca") pod "route-controller-manager-6f6c689d49-xd4xv" (UID: "db8f34cd-ecf2-4682-b1fd-b8e335369cb9") : configmap "client-ca" not found Oct 14 13:08:52.525815 master-2 kubenswrapper[4762]: I1014 13:08:52.525793 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db8f34cd-ecf2-4682-b1fd-b8e335369cb9-config\") pod \"route-controller-manager-6f6c689d49-xd4xv\" (UID: \"db8f34cd-ecf2-4682-b1fd-b8e335369cb9\") " pod="openshift-route-controller-manager/route-controller-manager-6f6c689d49-xd4xv" Oct 14 13:08:52.530378 master-2 kubenswrapper[4762]: I1014 13:08:52.530342 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db8f34cd-ecf2-4682-b1fd-b8e335369cb9-serving-cert\") pod \"route-controller-manager-6f6c689d49-xd4xv\" (UID: \"db8f34cd-ecf2-4682-b1fd-b8e335369cb9\") " pod="openshift-route-controller-manager/route-controller-manager-6f6c689d49-xd4xv" Oct 14 13:08:52.544649 master-2 kubenswrapper[4762]: I1014 13:08:52.544597 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7rh6\" (UniqueName: \"kubernetes.io/projected/db8f34cd-ecf2-4682-b1fd-b8e335369cb9-kube-api-access-w7rh6\") pod \"route-controller-manager-6f6c689d49-xd4xv\" (UID: \"db8f34cd-ecf2-4682-b1fd-b8e335369cb9\") " pod="openshift-route-controller-manager/route-controller-manager-6f6c689d49-xd4xv" Oct 14 13:08:53.030241 master-2 kubenswrapper[4762]: I1014 13:08:53.030091 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/db8f34cd-ecf2-4682-b1fd-b8e335369cb9-client-ca\") pod \"route-controller-manager-6f6c689d49-xd4xv\" (UID: \"db8f34cd-ecf2-4682-b1fd-b8e335369cb9\") " pod="openshift-route-controller-manager/route-controller-manager-6f6c689d49-xd4xv" Oct 14 13:08:53.030608 master-2 kubenswrapper[4762]: E1014 13:08:53.030299 4762 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Oct 14 13:08:53.030608 master-2 kubenswrapper[4762]: E1014 13:08:53.030401 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/db8f34cd-ecf2-4682-b1fd-b8e335369cb9-client-ca podName:db8f34cd-ecf2-4682-b1fd-b8e335369cb9 nodeName:}" failed. No retries permitted until 2025-10-14 13:08:54.030375983 +0000 UTC m=+163.274535182 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/db8f34cd-ecf2-4682-b1fd-b8e335369cb9-client-ca") pod "route-controller-manager-6f6c689d49-xd4xv" (UID: "db8f34cd-ecf2-4682-b1fd-b8e335369cb9") : configmap "client-ca" not found Oct 14 13:08:53.298921 master-2 kubenswrapper[4762]: I1014 13:08:53.298747 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-75f9c7d795-v2gmx" event={"ID":"b9eaec93-976f-44ad-b911-2313edf00168","Type":"ContainerStarted","Data":"328eaaa32b0ffb8ce9cc6a761e11a2c60675cc16273010e38a29eda1c9b67208"} Oct 14 13:08:53.300579 master-2 kubenswrapper[4762]: I1014 13:08:53.300545 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-8487p" event={"ID":"93952b70-8f99-4679-9d54-1a9143ed164b","Type":"ContainerStarted","Data":"77752f98ef7480864bce4f2daab375ece0a5413f07dbd45551b795ae431375da"} Oct 14 13:08:53.303272 master-2 kubenswrapper[4762]: I1014 13:08:53.303235 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pbtld" event={"ID":"1a953a2b-9bc4-485a-9daf-f6e9b84d493a","Type":"ContainerStarted","Data":"ad4bf25e6a8c04e92a9bf831bb178f73b6f9fabe6652a859a21906c510b00aee"} Oct 14 13:08:53.303364 master-2 kubenswrapper[4762]: I1014 13:08:53.303272 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pbtld" event={"ID":"1a953a2b-9bc4-485a-9daf-f6e9b84d493a","Type":"ContainerStarted","Data":"868fad12aaffc60a1f743df53ae4988e327237b579d4271178b338eeaae303b4"} Oct 14 13:08:53.303463 master-2 kubenswrapper[4762]: I1014 13:08:53.303427 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-pbtld" Oct 14 13:08:53.306264 master-2 kubenswrapper[4762]: I1014 13:08:53.306204 4762 generic.go:334] "Generic (PLEG): container finished" podID="2b69dba3-5ac1-4eb9-bba6-0d0662ab8544" containerID="c5974296dc0c8f105a9ef3c050f4347d63548eea2095e88af4b07446619ee3d1" exitCode=0 Oct 14 13:08:53.306342 master-2 kubenswrapper[4762]: I1014 13:08:53.306272 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-c57444595-mj7cx" event={"ID":"2b69dba3-5ac1-4eb9-bba6-0d0662ab8544","Type":"ContainerDied","Data":"c5974296dc0c8f105a9ef3c050f4347d63548eea2095e88af4b07446619ee3d1"} Oct 14 13:08:53.316605 master-2 kubenswrapper[4762]: I1014 13:08:53.316532 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-8487p" podStartSLOduration=2.145085133 podStartE2EDuration="7.31651043s" podCreationTimestamp="2025-10-14 13:08:46 +0000 UTC" firstStartedPulling="2025-10-14 13:08:47.013927913 +0000 UTC m=+156.258087112" lastFinishedPulling="2025-10-14 13:08:52.18535324 +0000 UTC m=+161.429512409" observedRunningTime="2025-10-14 13:08:53.315702664 +0000 UTC m=+162.559861893" watchObservedRunningTime="2025-10-14 13:08:53.31651043 +0000 UTC m=+162.560669589" Oct 14 13:08:53.351670 master-2 kubenswrapper[4762]: I1014 13:08:53.351059 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-pbtld" podStartSLOduration=1.89355267 podStartE2EDuration="6.35102657s" podCreationTimestamp="2025-10-14 13:08:47 +0000 UTC" firstStartedPulling="2025-10-14 13:08:47.722719816 +0000 UTC m=+156.966878975" lastFinishedPulling="2025-10-14 13:08:52.180193716 +0000 UTC m=+161.424352875" observedRunningTime="2025-10-14 
13:08:53.350783542 +0000 UTC m=+162.594942691" watchObservedRunningTime="2025-10-14 13:08:53.35102657 +0000 UTC m=+162.595185769" Oct 14 13:08:53.553066 master-2 kubenswrapper[4762]: I1014 13:08:53.552936 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c2148c9-9f2d-420c-9a2e-6433f4885d5d" path="/var/lib/kubelet/pods/2c2148c9-9f2d-420c-9a2e-6433f4885d5d/volumes" Oct 14 13:08:54.040265 master-2 kubenswrapper[4762]: I1014 13:08:54.040202 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/db8f34cd-ecf2-4682-b1fd-b8e335369cb9-client-ca\") pod \"route-controller-manager-6f6c689d49-xd4xv\" (UID: \"db8f34cd-ecf2-4682-b1fd-b8e335369cb9\") " pod="openshift-route-controller-manager/route-controller-manager-6f6c689d49-xd4xv" Oct 14 13:08:54.040627 master-2 kubenswrapper[4762]: E1014 13:08:54.040335 4762 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Oct 14 13:08:54.040627 master-2 kubenswrapper[4762]: E1014 13:08:54.040436 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/db8f34cd-ecf2-4682-b1fd-b8e335369cb9-client-ca podName:db8f34cd-ecf2-4682-b1fd-b8e335369cb9 nodeName:}" failed. No retries permitted until 2025-10-14 13:08:56.040415823 +0000 UTC m=+165.284574982 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/db8f34cd-ecf2-4682-b1fd-b8e335369cb9-client-ca") pod "route-controller-manager-6f6c689d49-xd4xv" (UID: "db8f34cd-ecf2-4682-b1fd-b8e335369cb9") : configmap "client-ca" not found Oct 14 13:08:54.312460 master-2 kubenswrapper[4762]: I1014 13:08:54.312310 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-c57444595-mj7cx" event={"ID":"2b69dba3-5ac1-4eb9-bba6-0d0662ab8544","Type":"ContainerStarted","Data":"03b91a878dfff317795b8e4902ee7208113c7c3d5dd1ae9fb18824d3a8021ed9"} Oct 14 13:08:55.318223 master-2 kubenswrapper[4762]: I1014 13:08:55.318082 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-75f9c7d795-v2gmx" event={"ID":"b9eaec93-976f-44ad-b911-2313edf00168","Type":"ContainerStarted","Data":"16db8bad98bc28ab1320ac3cd49099c462233a5e01f23641629f2e06dc0a244e"} Oct 14 13:08:55.318223 master-2 kubenswrapper[4762]: I1014 13:08:55.318174 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-75f9c7d795-v2gmx" event={"ID":"b9eaec93-976f-44ad-b911-2313edf00168","Type":"ContainerStarted","Data":"84fd21d9bfea562d553d0fedc6622ab2e072cbafce6419cd516996c3de2c952c"} Oct 14 13:08:55.336186 master-2 kubenswrapper[4762]: I1014 13:08:55.336014 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-c57444595-mj7cx" podStartSLOduration=4.295277262 podStartE2EDuration="7.335986202s" podCreationTimestamp="2025-10-14 13:08:48 +0000 UTC" firstStartedPulling="2025-10-14 13:08:49.153282285 +0000 UTC m=+158.397441434" lastFinishedPulling="2025-10-14 13:08:52.193991195 +0000 UTC m=+161.438150374" observedRunningTime="2025-10-14 13:08:54.334641968 +0000 UTC m=+163.578801127" watchObservedRunningTime="2025-10-14 13:08:55.335986202 +0000 UTC m=+164.580145401" Oct 14 13:08:56.065185 master-2 kubenswrapper[4762]: I1014 13:08:56.065097 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/db8f34cd-ecf2-4682-b1fd-b8e335369cb9-client-ca\") pod \"route-controller-manager-6f6c689d49-xd4xv\" (UID: \"db8f34cd-ecf2-4682-b1fd-b8e335369cb9\") " pod="openshift-route-controller-manager/route-controller-manager-6f6c689d49-xd4xv" Oct 14 13:08:56.065489 master-2 kubenswrapper[4762]: E1014 13:08:56.065308 4762 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Oct 14 13:08:56.065489 master-2 kubenswrapper[4762]: E1014 13:08:56.065410 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/db8f34cd-ecf2-4682-b1fd-b8e335369cb9-client-ca podName:db8f34cd-ecf2-4682-b1fd-b8e335369cb9 nodeName:}" failed. No retries permitted until 2025-10-14 13:09:00.06538146 +0000 UTC m=+169.309540669 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/db8f34cd-ecf2-4682-b1fd-b8e335369cb9-client-ca") pod "route-controller-manager-6f6c689d49-xd4xv" (UID: "db8f34cd-ecf2-4682-b1fd-b8e335369cb9") : configmap "client-ca" not found Oct 14 13:08:57.548740 master-2 kubenswrapper[4762]: I1014 13:08:57.548663 4762 scope.go:117] "RemoveContainer" containerID="0a39b10e2fb121ce0536d44cd9b7911391264299af43b6661cf503ece1eed26a" Oct 14 13:08:57.549222 master-2 kubenswrapper[4762]: E1014 13:08:57.548856 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-rbac-proxy pod=cluster-cloud-controller-manager-operator-779749f859-bscv5_openshift-cloud-controller-manager-operator(18346e46-a062-4e0d-b90a-c05646a46c7e)\"" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-779749f859-bscv5" podUID="18346e46-a062-4e0d-b90a-c05646a46c7e" Oct 14 13:08:58.962824 master-2 kubenswrapper[4762]: I1014 13:08:58.962729 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-c57444595-mj7cx" Oct 14 13:08:58.962824 master-2 kubenswrapper[4762]: I1014 13:08:58.962854 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-c57444595-mj7cx" Oct 14 13:08:58.975029 master-2 kubenswrapper[4762]: I1014 13:08:58.974957 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-c57444595-mj7cx" Oct 14 13:08:58.996405 master-2 kubenswrapper[4762]: I1014 13:08:58.996302 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-75f9c7d795-v2gmx" podStartSLOduration=6.995381701 podStartE2EDuration="8.996276061s" podCreationTimestamp="2025-10-14 13:08:50 +0000 UTC" firstStartedPulling="2025-10-14 13:08:52.443403442 +0000 UTC m=+161.687562601" lastFinishedPulling="2025-10-14 13:08:54.444297792 +0000 UTC m=+163.688456961" observedRunningTime="2025-10-14 13:08:55.335348591 +0000 UTC m=+164.579507790" watchObservedRunningTime="2025-10-14 13:08:58.996276061 +0000 UTC m=+168.240435260" Oct 14 13:08:59.338581 master-2 kubenswrapper[4762]: I1014 13:08:59.338381 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-c57444595-mj7cx" Oct 14 13:09:00.116838 master-2 kubenswrapper[4762]: I1014 13:09:00.116747 4762 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/db8f34cd-ecf2-4682-b1fd-b8e335369cb9-client-ca\") pod \"route-controller-manager-6f6c689d49-xd4xv\" (UID: \"db8f34cd-ecf2-4682-b1fd-b8e335369cb9\") " pod="openshift-route-controller-manager/route-controller-manager-6f6c689d49-xd4xv" Oct 14 13:09:00.117725 master-2 kubenswrapper[4762]: E1014 13:09:00.116933 4762 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Oct 14 13:09:00.117725 master-2 kubenswrapper[4762]: E1014 13:09:00.117060 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/db8f34cd-ecf2-4682-b1fd-b8e335369cb9-client-ca podName:db8f34cd-ecf2-4682-b1fd-b8e335369cb9 nodeName:}" failed. No retries permitted until 2025-10-14 13:09:08.117030579 +0000 UTC m=+177.361189768 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/db8f34cd-ecf2-4682-b1fd-b8e335369cb9-client-ca") pod "route-controller-manager-6f6c689d49-xd4xv" (UID: "db8f34cd-ecf2-4682-b1fd-b8e335369cb9") : configmap "client-ca" not found Oct 14 13:09:02.513322 master-2 kubenswrapper[4762]: I1014 13:09:02.513213 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-pbtld" Oct 14 13:09:02.779087 master-2 kubenswrapper[4762]: I1014 13:09:02.778893 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-oauth-apiserver_apiserver-c57444595-mj7cx_2b69dba3-5ac1-4eb9-bba6-0d0662ab8544/fix-audit-permissions/0.log" Oct 14 13:09:02.985005 master-2 kubenswrapper[4762]: I1014 13:09:02.984911 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-oauth-apiserver_apiserver-c57444595-mj7cx_2b69dba3-5ac1-4eb9-bba6-0d0662ab8544/oauth-apiserver/0.log" Oct 14 13:09:03.979431 master-2 kubenswrapper[4762]: I1014 13:09:03.979372 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-pbtld_1a953a2b-9bc4-485a-9daf-f6e9b84d493a/dns/0.log" Oct 14 13:09:04.178289 master-2 kubenswrapper[4762]: I1014 13:09:04.178235 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-pbtld_1a953a2b-9bc4-485a-9daf-f6e9b84d493a/kube-rbac-proxy/0.log" Oct 14 13:09:04.779385 master-2 kubenswrapper[4762]: I1014 13:09:04.779279 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-6rrjr_d4509bb5-afbe-43d9-bfe9-32cd7f55257d/dns-node-resolver/0.log" Oct 14 13:09:05.174751 master-2 kubenswrapper[4762]: I1014 13:09:05.174657 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e3b0f97c-92e0-43c7-a72a-c003f0451347-client-ca\") pod \"controller-manager-55bcd8787f-4krnt\" (UID: \"e3b0f97c-92e0-43c7-a72a-c003f0451347\") " pod="openshift-controller-manager/controller-manager-55bcd8787f-4krnt" Oct 14 13:09:05.175599 master-2 kubenswrapper[4762]: E1014 13:09:05.174861 4762 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Oct 14 13:09:05.175599 master-2 kubenswrapper[4762]: E1014 13:09:05.175016 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e3b0f97c-92e0-43c7-a72a-c003f0451347-client-ca podName:e3b0f97c-92e0-43c7-a72a-c003f0451347 nodeName:}" failed. No retries permitted until 2025-10-14 13:09:37.174977958 +0000 UTC m=+206.419137157 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/e3b0f97c-92e0-43c7-a72a-c003f0451347-client-ca") pod "controller-manager-55bcd8787f-4krnt" (UID: "e3b0f97c-92e0-43c7-a72a-c003f0451347") : configmap "client-ca" not found Oct 14 13:09:08.153843 master-2 kubenswrapper[4762]: I1014 13:09:08.153746 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-j26vt"] Oct 14 13:09:08.155031 master-2 kubenswrapper[4762]: I1014 13:09:08.154947 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-j26vt" Oct 14 13:09:08.158214 master-2 kubenswrapper[4762]: I1014 13:09:08.158138 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 14 13:09:08.158709 master-2 kubenswrapper[4762]: I1014 13:09:08.158674 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 14 13:09:08.158957 master-2 kubenswrapper[4762]: I1014 13:09:08.158900 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 14 13:09:08.159243 master-2 kubenswrapper[4762]: I1014 13:09:08.159014 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 14 13:09:08.207616 master-2 kubenswrapper[4762]: I1014 13:09:08.207493 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/db8f34cd-ecf2-4682-b1fd-b8e335369cb9-client-ca\") pod \"route-controller-manager-6f6c689d49-xd4xv\" (UID: \"db8f34cd-ecf2-4682-b1fd-b8e335369cb9\") " pod="openshift-route-controller-manager/route-controller-manager-6f6c689d49-xd4xv" Oct 14 13:09:08.207844 master-2 kubenswrapper[4762]: E1014 13:09:08.207741 4762 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Oct 14 13:09:08.207916 master-2 kubenswrapper[4762]: E1014 13:09:08.207900 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/db8f34cd-ecf2-4682-b1fd-b8e335369cb9-client-ca podName:db8f34cd-ecf2-4682-b1fd-b8e335369cb9 nodeName:}" failed. No retries permitted until 2025-10-14 13:09:24.207868009 +0000 UTC m=+193.452027168 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/db8f34cd-ecf2-4682-b1fd-b8e335369cb9-client-ca") pod "route-controller-manager-6f6c689d49-xd4xv" (UID: "db8f34cd-ecf2-4682-b1fd-b8e335369cb9") : configmap "client-ca" not found Oct 14 13:09:08.309286 master-2 kubenswrapper[4762]: I1014 13:09:08.309093 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/71961153-de80-4e6f-9f50-9241d3bc008a-mcd-auth-proxy-config\") pod \"machine-config-daemon-j26vt\" (UID: \"71961153-de80-4e6f-9f50-9241d3bc008a\") " pod="openshift-machine-config-operator/machine-config-daemon-j26vt" Oct 14 13:09:08.309286 master-2 kubenswrapper[4762]: I1014 13:09:08.309239 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/71961153-de80-4e6f-9f50-9241d3bc008a-proxy-tls\") pod \"machine-config-daemon-j26vt\" (UID: \"71961153-de80-4e6f-9f50-9241d3bc008a\") " pod="openshift-machine-config-operator/machine-config-daemon-j26vt" Oct 14 13:09:08.309742 master-2 kubenswrapper[4762]: I1014 13:09:08.309325 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxfw6\" (UniqueName: \"kubernetes.io/projected/71961153-de80-4e6f-9f50-9241d3bc008a-kube-api-access-rxfw6\") pod \"machine-config-daemon-j26vt\" (UID: \"71961153-de80-4e6f-9f50-9241d3bc008a\") " pod="openshift-machine-config-operator/machine-config-daemon-j26vt" Oct 14 13:09:08.309742 master-2 kubenswrapper[4762]: I1014 13:09:08.309556 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/71961153-de80-4e6f-9f50-9241d3bc008a-rootfs\") pod \"machine-config-daemon-j26vt\" (UID: \"71961153-de80-4e6f-9f50-9241d3bc008a\") " pod="openshift-machine-config-operator/machine-config-daemon-j26vt" Oct 14 13:09:08.410914 master-2 kubenswrapper[4762]: I1014 13:09:08.410627 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/71961153-de80-4e6f-9f50-9241d3bc008a-mcd-auth-proxy-config\") pod \"machine-config-daemon-j26vt\" (UID: \"71961153-de80-4e6f-9f50-9241d3bc008a\") " pod="openshift-machine-config-operator/machine-config-daemon-j26vt" Oct 14 13:09:08.410914 master-2 kubenswrapper[4762]: I1014 13:09:08.410727 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/71961153-de80-4e6f-9f50-9241d3bc008a-proxy-tls\") pod \"machine-config-daemon-j26vt\" (UID: \"71961153-de80-4e6f-9f50-9241d3bc008a\") " pod="openshift-machine-config-operator/machine-config-daemon-j26vt" Oct 14 13:09:08.410914 master-2 kubenswrapper[4762]: I1014 13:09:08.410814 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxfw6\" (UniqueName: \"kubernetes.io/projected/71961153-de80-4e6f-9f50-9241d3bc008a-kube-api-access-rxfw6\") pod \"machine-config-daemon-j26vt\" (UID: \"71961153-de80-4e6f-9f50-9241d3bc008a\") " pod="openshift-machine-config-operator/machine-config-daemon-j26vt" Oct 14 13:09:08.410914 master-2 kubenswrapper[4762]: I1014 13:09:08.410920 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: 
\"kubernetes.io/host-path/71961153-de80-4e6f-9f50-9241d3bc008a-rootfs\") pod \"machine-config-daemon-j26vt\" (UID: \"71961153-de80-4e6f-9f50-9241d3bc008a\") " pod="openshift-machine-config-operator/machine-config-daemon-j26vt" Oct 14 13:09:08.411666 master-2 kubenswrapper[4762]: I1014 13:09:08.411086 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/71961153-de80-4e6f-9f50-9241d3bc008a-rootfs\") pod \"machine-config-daemon-j26vt\" (UID: \"71961153-de80-4e6f-9f50-9241d3bc008a\") " pod="openshift-machine-config-operator/machine-config-daemon-j26vt" Oct 14 13:09:08.412134 master-2 kubenswrapper[4762]: I1014 13:09:08.412049 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/71961153-de80-4e6f-9f50-9241d3bc008a-mcd-auth-proxy-config\") pod \"machine-config-daemon-j26vt\" (UID: \"71961153-de80-4e6f-9f50-9241d3bc008a\") " pod="openshift-machine-config-operator/machine-config-daemon-j26vt" Oct 14 13:09:08.416860 master-2 kubenswrapper[4762]: I1014 13:09:08.416782 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/71961153-de80-4e6f-9f50-9241d3bc008a-proxy-tls\") pod \"machine-config-daemon-j26vt\" (UID: \"71961153-de80-4e6f-9f50-9241d3bc008a\") " pod="openshift-machine-config-operator/machine-config-daemon-j26vt" Oct 14 13:09:08.492871 master-2 kubenswrapper[4762]: I1014 13:09:08.492732 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxfw6\" (UniqueName: \"kubernetes.io/projected/71961153-de80-4e6f-9f50-9241d3bc008a-kube-api-access-rxfw6\") pod \"machine-config-daemon-j26vt\" (UID: \"71961153-de80-4e6f-9f50-9241d3bc008a\") " pod="openshift-machine-config-operator/machine-config-daemon-j26vt" Oct 14 13:09:08.777552 master-2 kubenswrapper[4762]: I1014 13:09:08.777331 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-j26vt" Oct 14 13:09:08.798404 master-2 kubenswrapper[4762]: W1014 13:09:08.798321 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71961153_de80_4e6f_9f50_9241d3bc008a.slice/crio-465bef276805528a6cec5a91a53652501207991ed867f1e0981c9ade1729cd5c WatchSource:0}: Error finding container 465bef276805528a6cec5a91a53652501207991ed867f1e0981c9ade1729cd5c: Status 404 returned error can't find the container with id 465bef276805528a6cec5a91a53652501207991ed867f1e0981c9ade1729cd5c Oct 14 13:09:08.874662 master-2 kubenswrapper[4762]: I1014 13:09:08.874604 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-cb5bh" Oct 14 13:09:09.372804 master-2 kubenswrapper[4762]: I1014 13:09:09.372723 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j26vt" event={"ID":"71961153-de80-4e6f-9f50-9241d3bc008a","Type":"ContainerStarted","Data":"9d431f3c1cd592f9ad663b73dc5938fb9ece95545689671a3931b0ea7b3e1d8b"} Oct 14 13:09:09.372804 master-2 kubenswrapper[4762]: I1014 13:09:09.372796 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j26vt" event={"ID":"71961153-de80-4e6f-9f50-9241d3bc008a","Type":"ContainerStarted","Data":"eb4f856237398fe01c5b3a99e103745d15d90dccc6403c7d5dfc46c03dafd6dc"} Oct 14 13:09:09.373654 master-2 kubenswrapper[4762]: I1014 13:09:09.372817 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j26vt" event={"ID":"71961153-de80-4e6f-9f50-9241d3bc008a","Type":"ContainerStarted","Data":"465bef276805528a6cec5a91a53652501207991ed867f1e0981c9ade1729cd5c"} Oct 14 13:09:09.392089 master-2 kubenswrapper[4762]: I1014 13:09:09.391978 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-j26vt" podStartSLOduration=1.391952235 podStartE2EDuration="1.391952235s" podCreationTimestamp="2025-10-14 13:09:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:09:09.391636565 +0000 UTC m=+178.635795754" watchObservedRunningTime="2025-10-14 13:09:09.391952235 +0000 UTC m=+178.636111434" Oct 14 13:09:09.778491 master-2 kubenswrapper[4762]: I1014 13:09:09.778387 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver_apiserver-6576f6bc9d-r2fhv_3964407d-3235-4331-bee0-0188f908f6c8/fix-audit-permissions/0.log" Oct 14 13:09:09.984188 master-2 kubenswrapper[4762]: I1014 13:09:09.984086 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver_apiserver-6576f6bc9d-r2fhv_3964407d-3235-4331-bee0-0188f908f6c8/openshift-apiserver/0.log" Oct 14 13:09:10.179583 master-2 kubenswrapper[4762]: I1014 13:09:10.179484 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver_apiserver-6576f6bc9d-r2fhv_3964407d-3235-4331-bee0-0188f908f6c8/openshift-apiserver-check-endpoints/0.log" Oct 14 13:09:10.548362 master-2 kubenswrapper[4762]: I1014 13:09:10.548196 4762 scope.go:117] "RemoveContainer" containerID="0a39b10e2fb121ce0536d44cd9b7911391264299af43b6661cf503ece1eed26a" Oct 14 13:09:10.549659 master-2 kubenswrapper[4762]: E1014 13:09:10.548503 4762 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-rbac-proxy pod=cluster-cloud-controller-manager-operator-779749f859-bscv5_openshift-cloud-controller-manager-operator(18346e46-a062-4e0d-b90a-c05646a46c7e)\"" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-779749f859-bscv5" podUID="18346e46-a062-4e0d-b90a-c05646a46c7e" Oct 14 13:09:12.385845 master-2 kubenswrapper[4762]: I1014 13:09:12.385668 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-6dcc7bf8f6-s7t2x"] Oct 14 13:09:12.386474 master-2 kubenswrapper[4762]: I1014 13:09:12.386393 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-6dcc7bf8f6-s7t2x" Oct 14 13:09:12.390744 master-2 kubenswrapper[4762]: I1014 13:09:12.390703 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Oct 14 13:09:12.394284 master-2 kubenswrapper[4762]: I1014 13:09:12.394254 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-6dcc7bf8f6-s7t2x"] Oct 14 13:09:12.454982 master-2 kubenswrapper[4762]: I1014 13:09:12.454892 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h82s9\" (UniqueName: \"kubernetes.io/projected/51b5c3f6-411f-499b-be0f-4d1f484c4c3c-kube-api-access-h82s9\") pod \"machine-config-controller-6dcc7bf8f6-s7t2x\" (UID: \"51b5c3f6-411f-499b-be0f-4d1f484c4c3c\") " pod="openshift-machine-config-operator/machine-config-controller-6dcc7bf8f6-s7t2x" Oct 14 13:09:12.455327 master-2 kubenswrapper[4762]: I1014 13:09:12.455087 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/51b5c3f6-411f-499b-be0f-4d1f484c4c3c-proxy-tls\") pod \"machine-config-controller-6dcc7bf8f6-s7t2x\" (UID: \"51b5c3f6-411f-499b-be0f-4d1f484c4c3c\") " pod="openshift-machine-config-operator/machine-config-controller-6dcc7bf8f6-s7t2x" Oct 14 13:09:12.455327 master-2 kubenswrapper[4762]: I1014 13:09:12.455292 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/51b5c3f6-411f-499b-be0f-4d1f484c4c3c-mcc-auth-proxy-config\") pod \"machine-config-controller-6dcc7bf8f6-s7t2x\" (UID: \"51b5c3f6-411f-499b-be0f-4d1f484c4c3c\") " pod="openshift-machine-config-operator/machine-config-controller-6dcc7bf8f6-s7t2x" Oct 14 13:09:12.557273 master-2 kubenswrapper[4762]: I1014 13:09:12.556975 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/51b5c3f6-411f-499b-be0f-4d1f484c4c3c-mcc-auth-proxy-config\") pod \"machine-config-controller-6dcc7bf8f6-s7t2x\" (UID: \"51b5c3f6-411f-499b-be0f-4d1f484c4c3c\") " pod="openshift-machine-config-operator/machine-config-controller-6dcc7bf8f6-s7t2x" Oct 14 13:09:12.557273 master-2 kubenswrapper[4762]: I1014 13:09:12.557146 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h82s9\" (UniqueName: \"kubernetes.io/projected/51b5c3f6-411f-499b-be0f-4d1f484c4c3c-kube-api-access-h82s9\") pod 
\"machine-config-controller-6dcc7bf8f6-s7t2x\" (UID: \"51b5c3f6-411f-499b-be0f-4d1f484c4c3c\") " pod="openshift-machine-config-operator/machine-config-controller-6dcc7bf8f6-s7t2x" Oct 14 13:09:12.557564 master-2 kubenswrapper[4762]: I1014 13:09:12.557362 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/51b5c3f6-411f-499b-be0f-4d1f484c4c3c-proxy-tls\") pod \"machine-config-controller-6dcc7bf8f6-s7t2x\" (UID: \"51b5c3f6-411f-499b-be0f-4d1f484c4c3c\") " pod="openshift-machine-config-operator/machine-config-controller-6dcc7bf8f6-s7t2x" Oct 14 13:09:12.562506 master-2 kubenswrapper[4762]: I1014 13:09:12.562434 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/51b5c3f6-411f-499b-be0f-4d1f484c4c3c-mcc-auth-proxy-config\") pod \"machine-config-controller-6dcc7bf8f6-s7t2x\" (UID: \"51b5c3f6-411f-499b-be0f-4d1f484c4c3c\") " pod="openshift-machine-config-operator/machine-config-controller-6dcc7bf8f6-s7t2x" Oct 14 13:09:12.563271 master-2 kubenswrapper[4762]: I1014 13:09:12.563214 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/51b5c3f6-411f-499b-be0f-4d1f484c4c3c-proxy-tls\") pod \"machine-config-controller-6dcc7bf8f6-s7t2x\" (UID: \"51b5c3f6-411f-499b-be0f-4d1f484c4c3c\") " pod="openshift-machine-config-operator/machine-config-controller-6dcc7bf8f6-s7t2x" Oct 14 13:09:12.590306 master-2 kubenswrapper[4762]: I1014 13:09:12.590188 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h82s9\" (UniqueName: \"kubernetes.io/projected/51b5c3f6-411f-499b-be0f-4d1f484c4c3c-kube-api-access-h82s9\") pod \"machine-config-controller-6dcc7bf8f6-s7t2x\" (UID: \"51b5c3f6-411f-499b-be0f-4d1f484c4c3c\") " pod="openshift-machine-config-operator/machine-config-controller-6dcc7bf8f6-s7t2x" Oct 14 13:09:12.709558 master-2 kubenswrapper[4762]: I1014 13:09:12.709481 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-6dcc7bf8f6-s7t2x" Oct 14 13:09:13.122652 master-2 kubenswrapper[4762]: I1014 13:09:13.122392 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-6dcc7bf8f6-s7t2x"] Oct 14 13:09:13.135436 master-2 kubenswrapper[4762]: W1014 13:09:13.135367 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51b5c3f6_411f_499b_be0f_4d1f484c4c3c.slice/crio-626c224740f821b8471e8fde711a0974fe7a1cbc375f9451ebc98d0e712dd0fd WatchSource:0}: Error finding container 626c224740f821b8471e8fde711a0974fe7a1cbc375f9451ebc98d0e712dd0fd: Status 404 returned error can't find the container with id 626c224740f821b8471e8fde711a0974fe7a1cbc375f9451ebc98d0e712dd0fd Oct 14 13:09:13.389943 master-2 kubenswrapper[4762]: I1014 13:09:13.389799 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-6dcc7bf8f6-s7t2x" event={"ID":"51b5c3f6-411f-499b-be0f-4d1f484c4c3c","Type":"ContainerStarted","Data":"24b5dbaadef8f24b3873f17b25ed9a8988a69c389120529304667263ceedeb33"} Oct 14 13:09:13.389943 master-2 kubenswrapper[4762]: I1014 13:09:13.389868 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-6dcc7bf8f6-s7t2x" event={"ID":"51b5c3f6-411f-499b-be0f-4d1f484c4c3c","Type":"ContainerStarted","Data":"cc571a12aa7d440a725f25823028b020992612c309b231d2f8e29bc4edf3afbc"} Oct 14 13:09:13.389943 master-2 kubenswrapper[4762]: I1014 13:09:13.389891 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-6dcc7bf8f6-s7t2x" event={"ID":"51b5c3f6-411f-499b-be0f-4d1f484c4c3c","Type":"ContainerStarted","Data":"626c224740f821b8471e8fde711a0974fe7a1cbc375f9451ebc98d0e712dd0fd"} Oct 14 13:09:13.458210 master-2 kubenswrapper[4762]: I1014 13:09:13.451244 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-6dcc7bf8f6-s7t2x" podStartSLOduration=1.451119244 podStartE2EDuration="1.451119244s" podCreationTimestamp="2025-10-14 13:09:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:09:13.412215944 +0000 UTC m=+182.656375123" watchObservedRunningTime="2025-10-14 13:09:13.451119244 +0000 UTC m=+182.695278433" Oct 14 13:09:13.458210 master-2 kubenswrapper[4762]: I1014 13:09:13.451466 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-967c7bb47-bzqnw"] Oct 14 13:09:13.458210 master-2 kubenswrapper[4762]: I1014 13:09:13.452228 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-967c7bb47-bzqnw" Oct 14 13:09:13.458210 master-2 kubenswrapper[4762]: I1014 13:09:13.456874 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-967c7bb47-bzqnw"] Oct 14 13:09:13.484601 master-2 kubenswrapper[4762]: I1014 13:09:13.484503 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22ndj\" (UniqueName: \"kubernetes.io/projected/be7d3581-28db-42e4-b88a-e4e8fd75896d-kube-api-access-22ndj\") pod \"network-check-source-967c7bb47-bzqnw\" (UID: \"be7d3581-28db-42e4-b88a-e4e8fd75896d\") " pod="openshift-network-diagnostics/network-check-source-967c7bb47-bzqnw" Oct 14 13:09:13.543872 master-2 kubenswrapper[4762]: I1014 13:09:13.543756 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kpbmd"] Oct 14 13:09:13.544756 master-2 kubenswrapper[4762]: I1014 13:09:13.544699 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kpbmd" Oct 14 13:09:13.547692 master-2 kubenswrapper[4762]: I1014 13:09:13.547645 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Oct 14 13:09:13.548336 master-2 kubenswrapper[4762]: I1014 13:09:13.548280 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Oct 14 13:09:13.560248 master-2 kubenswrapper[4762]: I1014 13:09:13.558895 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kpbmd"] Oct 14 13:09:13.586572 master-2 kubenswrapper[4762]: I1014 13:09:13.586494 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22ndj\" (UniqueName: \"kubernetes.io/projected/be7d3581-28db-42e4-b88a-e4e8fd75896d-kube-api-access-22ndj\") pod \"network-check-source-967c7bb47-bzqnw\" (UID: \"be7d3581-28db-42e4-b88a-e4e8fd75896d\") " pod="openshift-network-diagnostics/network-check-source-967c7bb47-bzqnw" Oct 14 13:09:13.613983 master-2 kubenswrapper[4762]: I1014 13:09:13.613880 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22ndj\" (UniqueName: \"kubernetes.io/projected/be7d3581-28db-42e4-b88a-e4e8fd75896d-kube-api-access-22ndj\") pod \"network-check-source-967c7bb47-bzqnw\" (UID: \"be7d3581-28db-42e4-b88a-e4e8fd75896d\") " pod="openshift-network-diagnostics/network-check-source-967c7bb47-bzqnw" Oct 14 13:09:13.688570 master-2 kubenswrapper[4762]: I1014 13:09:13.688414 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de57a213-4820-46c7-9506-4c3ea762d75f-utilities\") pod \"certified-operators-kpbmd\" (UID: \"de57a213-4820-46c7-9506-4c3ea762d75f\") " pod="openshift-marketplace/certified-operators-kpbmd" Oct 14 13:09:13.688570 master-2 kubenswrapper[4762]: I1014 13:09:13.688559 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de57a213-4820-46c7-9506-4c3ea762d75f-catalog-content\") pod \"certified-operators-kpbmd\" (UID: \"de57a213-4820-46c7-9506-4c3ea762d75f\") " pod="openshift-marketplace/certified-operators-kpbmd" Oct 14 13:09:13.688996 master-2 kubenswrapper[4762]: I1014 13:09:13.688624 4762 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9f4f\" (UniqueName: \"kubernetes.io/projected/de57a213-4820-46c7-9506-4c3ea762d75f-kube-api-access-r9f4f\") pod \"certified-operators-kpbmd\" (UID: \"de57a213-4820-46c7-9506-4c3ea762d75f\") " pod="openshift-marketplace/certified-operators-kpbmd" Oct 14 13:09:13.750044 master-2 kubenswrapper[4762]: I1014 13:09:13.749936 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cf69d"] Oct 14 13:09:13.751091 master-2 kubenswrapper[4762]: I1014 13:09:13.751029 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cf69d" Oct 14 13:09:13.762894 master-2 kubenswrapper[4762]: I1014 13:09:13.762827 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cf69d"] Oct 14 13:09:13.770121 master-2 kubenswrapper[4762]: I1014 13:09:13.769890 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-967c7bb47-bzqnw" Oct 14 13:09:13.789512 master-2 kubenswrapper[4762]: I1014 13:09:13.789440 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de57a213-4820-46c7-9506-4c3ea762d75f-catalog-content\") pod \"certified-operators-kpbmd\" (UID: \"de57a213-4820-46c7-9506-4c3ea762d75f\") " pod="openshift-marketplace/certified-operators-kpbmd" Oct 14 13:09:13.789780 master-2 kubenswrapper[4762]: I1014 13:09:13.789558 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9f4f\" (UniqueName: \"kubernetes.io/projected/de57a213-4820-46c7-9506-4c3ea762d75f-kube-api-access-r9f4f\") pod \"certified-operators-kpbmd\" (UID: \"de57a213-4820-46c7-9506-4c3ea762d75f\") " pod="openshift-marketplace/certified-operators-kpbmd" Oct 14 13:09:13.789780 master-2 kubenswrapper[4762]: I1014 13:09:13.789605 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de57a213-4820-46c7-9506-4c3ea762d75f-utilities\") pod \"certified-operators-kpbmd\" (UID: \"de57a213-4820-46c7-9506-4c3ea762d75f\") " pod="openshift-marketplace/certified-operators-kpbmd" Oct 14 13:09:13.790374 master-2 kubenswrapper[4762]: I1014 13:09:13.790312 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de57a213-4820-46c7-9506-4c3ea762d75f-catalog-content\") pod \"certified-operators-kpbmd\" (UID: \"de57a213-4820-46c7-9506-4c3ea762d75f\") " pod="openshift-marketplace/certified-operators-kpbmd" Oct 14 13:09:13.790495 master-2 kubenswrapper[4762]: I1014 13:09:13.790457 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de57a213-4820-46c7-9506-4c3ea762d75f-utilities\") pod \"certified-operators-kpbmd\" (UID: \"de57a213-4820-46c7-9506-4c3ea762d75f\") " pod="openshift-marketplace/certified-operators-kpbmd" Oct 14 13:09:13.819208 master-2 kubenswrapper[4762]: I1014 13:09:13.819119 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9f4f\" (UniqueName: \"kubernetes.io/projected/de57a213-4820-46c7-9506-4c3ea762d75f-kube-api-access-r9f4f\") pod \"certified-operators-kpbmd\" (UID: \"de57a213-4820-46c7-9506-4c3ea762d75f\") " 
pod="openshift-marketplace/certified-operators-kpbmd" Oct 14 13:09:13.865875 master-2 kubenswrapper[4762]: I1014 13:09:13.865767 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kpbmd" Oct 14 13:09:13.891106 master-2 kubenswrapper[4762]: I1014 13:09:13.891040 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpx44\" (UniqueName: \"kubernetes.io/projected/4ffcca20-3c44-4e24-92c8-15d3dc0625e4-kube-api-access-qpx44\") pod \"community-operators-cf69d\" (UID: \"4ffcca20-3c44-4e24-92c8-15d3dc0625e4\") " pod="openshift-marketplace/community-operators-cf69d" Oct 14 13:09:13.891203 master-2 kubenswrapper[4762]: I1014 13:09:13.891149 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ffcca20-3c44-4e24-92c8-15d3dc0625e4-utilities\") pod \"community-operators-cf69d\" (UID: \"4ffcca20-3c44-4e24-92c8-15d3dc0625e4\") " pod="openshift-marketplace/community-operators-cf69d" Oct 14 13:09:13.891436 master-2 kubenswrapper[4762]: I1014 13:09:13.891366 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ffcca20-3c44-4e24-92c8-15d3dc0625e4-catalog-content\") pod \"community-operators-cf69d\" (UID: \"4ffcca20-3c44-4e24-92c8-15d3dc0625e4\") " pod="openshift-marketplace/community-operators-cf69d" Oct 14 13:09:13.993770 master-2 kubenswrapper[4762]: I1014 13:09:13.992993 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ffcca20-3c44-4e24-92c8-15d3dc0625e4-catalog-content\") pod \"community-operators-cf69d\" (UID: \"4ffcca20-3c44-4e24-92c8-15d3dc0625e4\") " pod="openshift-marketplace/community-operators-cf69d" Oct 14 13:09:13.993770 master-2 kubenswrapper[4762]: I1014 13:09:13.993110 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpx44\" (UniqueName: \"kubernetes.io/projected/4ffcca20-3c44-4e24-92c8-15d3dc0625e4-kube-api-access-qpx44\") pod \"community-operators-cf69d\" (UID: \"4ffcca20-3c44-4e24-92c8-15d3dc0625e4\") " pod="openshift-marketplace/community-operators-cf69d" Oct 14 13:09:13.993770 master-2 kubenswrapper[4762]: I1014 13:09:13.993191 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ffcca20-3c44-4e24-92c8-15d3dc0625e4-utilities\") pod \"community-operators-cf69d\" (UID: \"4ffcca20-3c44-4e24-92c8-15d3dc0625e4\") " pod="openshift-marketplace/community-operators-cf69d" Oct 14 13:09:13.993956 master-2 kubenswrapper[4762]: I1014 13:09:13.993850 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ffcca20-3c44-4e24-92c8-15d3dc0625e4-utilities\") pod \"community-operators-cf69d\" (UID: \"4ffcca20-3c44-4e24-92c8-15d3dc0625e4\") " pod="openshift-marketplace/community-operators-cf69d" Oct 14 13:09:13.994914 master-2 kubenswrapper[4762]: I1014 13:09:13.994779 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ffcca20-3c44-4e24-92c8-15d3dc0625e4-catalog-content\") pod \"community-operators-cf69d\" (UID: \"4ffcca20-3c44-4e24-92c8-15d3dc0625e4\") " 
pod="openshift-marketplace/community-operators-cf69d" Oct 14 13:09:14.024273 master-2 kubenswrapper[4762]: I1014 13:09:14.024014 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpx44\" (UniqueName: \"kubernetes.io/projected/4ffcca20-3c44-4e24-92c8-15d3dc0625e4-kube-api-access-qpx44\") pod \"community-operators-cf69d\" (UID: \"4ffcca20-3c44-4e24-92c8-15d3dc0625e4\") " pod="openshift-marketplace/community-operators-cf69d" Oct 14 13:09:14.064828 master-2 kubenswrapper[4762]: I1014 13:09:14.064779 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cf69d" Oct 14 13:09:14.220083 master-2 kubenswrapper[4762]: I1014 13:09:14.220027 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-967c7bb47-bzqnw"] Oct 14 13:09:14.229126 master-2 kubenswrapper[4762]: W1014 13:09:14.229077 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe7d3581_28db_42e4_b88a_e4e8fd75896d.slice/crio-1e820a793d25a52d48e406d0b342d60db390990ff21bde48aecca1695b2d6f66 WatchSource:0}: Error finding container 1e820a793d25a52d48e406d0b342d60db390990ff21bde48aecca1695b2d6f66: Status 404 returned error can't find the container with id 1e820a793d25a52d48e406d0b342d60db390990ff21bde48aecca1695b2d6f66 Oct 14 13:09:14.306618 master-2 kubenswrapper[4762]: I1014 13:09:14.306564 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kpbmd"] Oct 14 13:09:14.397236 master-2 kubenswrapper[4762]: I1014 13:09:14.397140 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-967c7bb47-bzqnw" event={"ID":"be7d3581-28db-42e4-b88a-e4e8fd75896d","Type":"ContainerStarted","Data":"512a1bfb9001f17e2b08d6e649540c469e76e75874f65d8d1e7e22d17c5905f2"} Oct 14 13:09:14.397236 master-2 kubenswrapper[4762]: I1014 13:09:14.397237 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-967c7bb47-bzqnw" event={"ID":"be7d3581-28db-42e4-b88a-e4e8fd75896d","Type":"ContainerStarted","Data":"1e820a793d25a52d48e406d0b342d60db390990ff21bde48aecca1695b2d6f66"} Oct 14 13:09:14.398677 master-2 kubenswrapper[4762]: I1014 13:09:14.398629 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kpbmd" event={"ID":"de57a213-4820-46c7-9506-4c3ea762d75f","Type":"ContainerStarted","Data":"07e46b54a3e0906bcab79074e6145a6ef11a5bc4159bce600eaa3f419715d2d5"} Oct 14 13:09:14.415304 master-2 kubenswrapper[4762]: I1014 13:09:14.414747 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-967c7bb47-bzqnw" podStartSLOduration=111.414725645 podStartE2EDuration="1m51.414725645s" podCreationTimestamp="2025-10-14 13:07:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:09:14.414716045 +0000 UTC m=+183.658875274" watchObservedRunningTime="2025-10-14 13:09:14.414725645 +0000 UTC m=+183.658884814" Oct 14 13:09:14.463493 master-2 kubenswrapper[4762]: I1014 13:09:14.462564 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-6f5778dccb-9sfms"] Oct 14 13:09:14.463493 master-2 kubenswrapper[4762]: I1014 13:09:14.463336 4762 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-6f5778dccb-9sfms" Oct 14 13:09:14.468527 master-2 kubenswrapper[4762]: I1014 13:09:14.468406 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Oct 14 13:09:14.470022 master-2 kubenswrapper[4762]: I1014 13:09:14.469868 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Oct 14 13:09:14.470713 master-2 kubenswrapper[4762]: I1014 13:09:14.470193 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Oct 14 13:09:14.474485 master-2 kubenswrapper[4762]: I1014 13:09:14.474261 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-6f5778dccb-9sfms"] Oct 14 13:09:14.511287 master-2 kubenswrapper[4762]: I1014 13:09:14.511044 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cf69d"] Oct 14 13:09:14.599115 master-2 kubenswrapper[4762]: I1014 13:09:14.599039 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/15a54d1d-6715-4afe-b6aa-8765dc254e96-tmpfs\") pod \"packageserver-6f5778dccb-9sfms\" (UID: \"15a54d1d-6715-4afe-b6aa-8765dc254e96\") " pod="openshift-operator-lifecycle-manager/packageserver-6f5778dccb-9sfms" Oct 14 13:09:14.599115 master-2 kubenswrapper[4762]: I1014 13:09:14.599128 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcvsl\" (UniqueName: \"kubernetes.io/projected/15a54d1d-6715-4afe-b6aa-8765dc254e96-kube-api-access-hcvsl\") pod \"packageserver-6f5778dccb-9sfms\" (UID: \"15a54d1d-6715-4afe-b6aa-8765dc254e96\") " pod="openshift-operator-lifecycle-manager/packageserver-6f5778dccb-9sfms" Oct 14 13:09:14.599800 master-2 kubenswrapper[4762]: I1014 13:09:14.599394 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/15a54d1d-6715-4afe-b6aa-8765dc254e96-webhook-cert\") pod \"packageserver-6f5778dccb-9sfms\" (UID: \"15a54d1d-6715-4afe-b6aa-8765dc254e96\") " pod="openshift-operator-lifecycle-manager/packageserver-6f5778dccb-9sfms" Oct 14 13:09:14.599800 master-2 kubenswrapper[4762]: I1014 13:09:14.599469 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/15a54d1d-6715-4afe-b6aa-8765dc254e96-apiservice-cert\") pod \"packageserver-6f5778dccb-9sfms\" (UID: \"15a54d1d-6715-4afe-b6aa-8765dc254e96\") " pod="openshift-operator-lifecycle-manager/packageserver-6f5778dccb-9sfms" Oct 14 13:09:14.701333 master-2 kubenswrapper[4762]: I1014 13:09:14.701224 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/15a54d1d-6715-4afe-b6aa-8765dc254e96-apiservice-cert\") pod \"packageserver-6f5778dccb-9sfms\" (UID: \"15a54d1d-6715-4afe-b6aa-8765dc254e96\") " pod="openshift-operator-lifecycle-manager/packageserver-6f5778dccb-9sfms" Oct 14 13:09:14.701584 master-2 kubenswrapper[4762]: I1014 13:09:14.701349 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" 
(UniqueName: \"kubernetes.io/empty-dir/15a54d1d-6715-4afe-b6aa-8765dc254e96-tmpfs\") pod \"packageserver-6f5778dccb-9sfms\" (UID: \"15a54d1d-6715-4afe-b6aa-8765dc254e96\") " pod="openshift-operator-lifecycle-manager/packageserver-6f5778dccb-9sfms" Oct 14 13:09:14.701584 master-2 kubenswrapper[4762]: I1014 13:09:14.701389 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcvsl\" (UniqueName: \"kubernetes.io/projected/15a54d1d-6715-4afe-b6aa-8765dc254e96-kube-api-access-hcvsl\") pod \"packageserver-6f5778dccb-9sfms\" (UID: \"15a54d1d-6715-4afe-b6aa-8765dc254e96\") " pod="openshift-operator-lifecycle-manager/packageserver-6f5778dccb-9sfms" Oct 14 13:09:14.701584 master-2 kubenswrapper[4762]: I1014 13:09:14.701427 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/15a54d1d-6715-4afe-b6aa-8765dc254e96-webhook-cert\") pod \"packageserver-6f5778dccb-9sfms\" (UID: \"15a54d1d-6715-4afe-b6aa-8765dc254e96\") " pod="openshift-operator-lifecycle-manager/packageserver-6f5778dccb-9sfms" Oct 14 13:09:14.702994 master-2 kubenswrapper[4762]: I1014 13:09:14.702789 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/15a54d1d-6715-4afe-b6aa-8765dc254e96-tmpfs\") pod \"packageserver-6f5778dccb-9sfms\" (UID: \"15a54d1d-6715-4afe-b6aa-8765dc254e96\") " pod="openshift-operator-lifecycle-manager/packageserver-6f5778dccb-9sfms" Oct 14 13:09:14.708297 master-2 kubenswrapper[4762]: I1014 13:09:14.705580 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/15a54d1d-6715-4afe-b6aa-8765dc254e96-apiservice-cert\") pod \"packageserver-6f5778dccb-9sfms\" (UID: \"15a54d1d-6715-4afe-b6aa-8765dc254e96\") " pod="openshift-operator-lifecycle-manager/packageserver-6f5778dccb-9sfms" Oct 14 13:09:14.709175 master-2 kubenswrapper[4762]: I1014 13:09:14.709128 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/15a54d1d-6715-4afe-b6aa-8765dc254e96-webhook-cert\") pod \"packageserver-6f5778dccb-9sfms\" (UID: \"15a54d1d-6715-4afe-b6aa-8765dc254e96\") " pod="openshift-operator-lifecycle-manager/packageserver-6f5778dccb-9sfms" Oct 14 13:09:14.729310 master-2 kubenswrapper[4762]: I1014 13:09:14.729252 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcvsl\" (UniqueName: \"kubernetes.io/projected/15a54d1d-6715-4afe-b6aa-8765dc254e96-kube-api-access-hcvsl\") pod \"packageserver-6f5778dccb-9sfms\" (UID: \"15a54d1d-6715-4afe-b6aa-8765dc254e96\") " pod="openshift-operator-lifecycle-manager/packageserver-6f5778dccb-9sfms" Oct 14 13:09:14.778285 master-2 kubenswrapper[4762]: I1014 13:09:14.778103 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-6f5778dccb-9sfms" Oct 14 13:09:14.974902 master-2 kubenswrapper[4762]: I1014 13:09:14.974852 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-6f5778dccb-9sfms"] Oct 14 13:09:15.144977 master-2 kubenswrapper[4762]: I1014 13:09:15.144794 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-frksz"] Oct 14 13:09:15.145553 master-2 kubenswrapper[4762]: I1014 13:09:15.145519 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-frksz" Oct 14 13:09:15.153609 master-2 kubenswrapper[4762]: I1014 13:09:15.153566 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-frksz"] Oct 14 13:09:15.208230 master-2 kubenswrapper[4762]: I1014 13:09:15.208131 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ww97\" (UniqueName: \"kubernetes.io/projected/1458907f-e285-4301-8542-0b46ac67b02d-kube-api-access-6ww97\") pod \"redhat-marketplace-frksz\" (UID: \"1458907f-e285-4301-8542-0b46ac67b02d\") " pod="openshift-marketplace/redhat-marketplace-frksz" Oct 14 13:09:15.208230 master-2 kubenswrapper[4762]: I1014 13:09:15.208213 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1458907f-e285-4301-8542-0b46ac67b02d-utilities\") pod \"redhat-marketplace-frksz\" (UID: \"1458907f-e285-4301-8542-0b46ac67b02d\") " pod="openshift-marketplace/redhat-marketplace-frksz" Oct 14 13:09:15.208230 master-2 kubenswrapper[4762]: I1014 13:09:15.208241 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1458907f-e285-4301-8542-0b46ac67b02d-catalog-content\") pod \"redhat-marketplace-frksz\" (UID: \"1458907f-e285-4301-8542-0b46ac67b02d\") " pod="openshift-marketplace/redhat-marketplace-frksz" Oct 14 13:09:15.308843 master-2 kubenswrapper[4762]: I1014 13:09:15.308763 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ww97\" (UniqueName: \"kubernetes.io/projected/1458907f-e285-4301-8542-0b46ac67b02d-kube-api-access-6ww97\") pod \"redhat-marketplace-frksz\" (UID: \"1458907f-e285-4301-8542-0b46ac67b02d\") " pod="openshift-marketplace/redhat-marketplace-frksz" Oct 14 13:09:15.308843 master-2 kubenswrapper[4762]: I1014 13:09:15.308841 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1458907f-e285-4301-8542-0b46ac67b02d-utilities\") pod \"redhat-marketplace-frksz\" (UID: \"1458907f-e285-4301-8542-0b46ac67b02d\") " pod="openshift-marketplace/redhat-marketplace-frksz" Oct 14 13:09:15.309417 master-2 kubenswrapper[4762]: I1014 13:09:15.308873 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1458907f-e285-4301-8542-0b46ac67b02d-catalog-content\") pod \"redhat-marketplace-frksz\" (UID: \"1458907f-e285-4301-8542-0b46ac67b02d\") " pod="openshift-marketplace/redhat-marketplace-frksz" Oct 14 13:09:15.309520 master-2 kubenswrapper[4762]: I1014 13:09:15.309488 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1458907f-e285-4301-8542-0b46ac67b02d-catalog-content\") pod \"redhat-marketplace-frksz\" (UID: \"1458907f-e285-4301-8542-0b46ac67b02d\") " pod="openshift-marketplace/redhat-marketplace-frksz" Oct 14 13:09:15.310258 master-2 kubenswrapper[4762]: I1014 13:09:15.310140 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1458907f-e285-4301-8542-0b46ac67b02d-utilities\") pod \"redhat-marketplace-frksz\" (UID: \"1458907f-e285-4301-8542-0b46ac67b02d\") " pod="openshift-marketplace/redhat-marketplace-frksz" Oct 14 
13:09:15.329567 master-2 kubenswrapper[4762]: I1014 13:09:15.329518 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ww97\" (UniqueName: \"kubernetes.io/projected/1458907f-e285-4301-8542-0b46ac67b02d-kube-api-access-6ww97\") pod \"redhat-marketplace-frksz\" (UID: \"1458907f-e285-4301-8542-0b46ac67b02d\") " pod="openshift-marketplace/redhat-marketplace-frksz" Oct 14 13:09:15.410610 master-2 kubenswrapper[4762]: I1014 13:09:15.407542 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-6f5778dccb-9sfms" event={"ID":"15a54d1d-6715-4afe-b6aa-8765dc254e96","Type":"ContainerStarted","Data":"b2335f6aa2e3342c560c98a52ae1a3976ef2fa50a9573acacf42b3a32db22258"} Oct 14 13:09:15.410610 master-2 kubenswrapper[4762]: I1014 13:09:15.410327 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cf69d" event={"ID":"4ffcca20-3c44-4e24-92c8-15d3dc0625e4","Type":"ContainerStarted","Data":"7cb0f127c4c2430366fcaf657ff99060aa4fdc3db081f400c75c2dce7b111bfc"} Oct 14 13:09:15.459119 master-2 kubenswrapper[4762]: I1014 13:09:15.458645 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-frksz" Oct 14 13:09:15.569864 master-2 kubenswrapper[4762]: I1014 13:09:15.569629 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5ddb89f76-887cs"] Oct 14 13:09:15.571444 master-2 kubenswrapper[4762]: I1014 13:09:15.570504 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5ddb89f76-887cs" Oct 14 13:09:15.572512 master-2 kubenswrapper[4762]: I1014 13:09:15.572491 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-79d5f95f5c-btmxj"] Oct 14 13:09:15.575226 master-2 kubenswrapper[4762]: I1014 13:09:15.574233 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Oct 14 13:09:15.575226 master-2 kubenswrapper[4762]: I1014 13:09:15.574888 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Oct 14 13:09:15.575226 master-2 kubenswrapper[4762]: I1014 13:09:15.575170 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Oct 14 13:09:15.575469 master-2 kubenswrapper[4762]: I1014 13:09:15.575294 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Oct 14 13:09:15.575469 master-2 kubenswrapper[4762]: I1014 13:09:15.575424 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Oct 14 13:09:15.575582 master-2 kubenswrapper[4762]: I1014 13:09:15.575565 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Oct 14 13:09:15.576004 master-2 kubenswrapper[4762]: I1014 13:09:15.575988 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-79d5f95f5c-btmxj"] Oct 14 13:09:15.576208 master-2 kubenswrapper[4762]: I1014 13:09:15.576196 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-79d5f95f5c-btmxj" Oct 14 13:09:15.583275 master-2 kubenswrapper[4762]: I1014 13:09:15.581830 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls" Oct 14 13:09:15.613558 master-2 kubenswrapper[4762]: I1014 13:09:15.613496 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f82e0c58-e2a3-491a-bf03-ad47b38c5833-service-ca-bundle\") pod \"router-default-5ddb89f76-887cs\" (UID: \"f82e0c58-e2a3-491a-bf03-ad47b38c5833\") " pod="openshift-ingress/router-default-5ddb89f76-887cs" Oct 14 13:09:15.613711 master-2 kubenswrapper[4762]: I1014 13:09:15.613613 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f82e0c58-e2a3-491a-bf03-ad47b38c5833-stats-auth\") pod \"router-default-5ddb89f76-887cs\" (UID: \"f82e0c58-e2a3-491a-bf03-ad47b38c5833\") " pod="openshift-ingress/router-default-5ddb89f76-887cs" Oct 14 13:09:15.613912 master-2 kubenswrapper[4762]: I1014 13:09:15.613860 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgnx2\" (UniqueName: \"kubernetes.io/projected/f82e0c58-e2a3-491a-bf03-ad47b38c5833-kube-api-access-kgnx2\") pod \"router-default-5ddb89f76-887cs\" (UID: \"f82e0c58-e2a3-491a-bf03-ad47b38c5833\") " pod="openshift-ingress/router-default-5ddb89f76-887cs" Oct 14 13:09:15.614005 master-2 kubenswrapper[4762]: I1014 13:09:15.613962 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f82e0c58-e2a3-491a-bf03-ad47b38c5833-default-certificate\") pod \"router-default-5ddb89f76-887cs\" (UID: \"f82e0c58-e2a3-491a-bf03-ad47b38c5833\") " pod="openshift-ingress/router-default-5ddb89f76-887cs" Oct 14 13:09:15.614052 master-2 kubenswrapper[4762]: I1014 13:09:15.614018 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f82e0c58-e2a3-491a-bf03-ad47b38c5833-metrics-certs\") pod \"router-default-5ddb89f76-887cs\" (UID: \"f82e0c58-e2a3-491a-bf03-ad47b38c5833\") " pod="openshift-ingress/router-default-5ddb89f76-887cs" Oct 14 13:09:15.614120 master-2 kubenswrapper[4762]: I1014 13:09:15.614085 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/abcbfb46-51e9-40ed-8f92-415d25d25b53-tls-certificates\") pod \"prometheus-operator-admission-webhook-79d5f95f5c-btmxj\" (UID: \"abcbfb46-51e9-40ed-8f92-415d25d25b53\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-79d5f95f5c-btmxj" Oct 14 13:09:15.715908 master-2 kubenswrapper[4762]: I1014 13:09:15.715841 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f82e0c58-e2a3-491a-bf03-ad47b38c5833-default-certificate\") pod \"router-default-5ddb89f76-887cs\" (UID: \"f82e0c58-e2a3-491a-bf03-ad47b38c5833\") " pod="openshift-ingress/router-default-5ddb89f76-887cs" Oct 14 13:09:15.716192 master-2 kubenswrapper[4762]: I1014 13:09:15.715964 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f82e0c58-e2a3-491a-bf03-ad47b38c5833-metrics-certs\") pod \"router-default-5ddb89f76-887cs\" (UID: \"f82e0c58-e2a3-491a-bf03-ad47b38c5833\") " pod="openshift-ingress/router-default-5ddb89f76-887cs" Oct 14 13:09:15.716192 master-2 kubenswrapper[4762]: I1014 13:09:15.716007 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/abcbfb46-51e9-40ed-8f92-415d25d25b53-tls-certificates\") pod \"prometheus-operator-admission-webhook-79d5f95f5c-btmxj\" (UID: \"abcbfb46-51e9-40ed-8f92-415d25d25b53\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-79d5f95f5c-btmxj" Oct 14 13:09:15.716192 master-2 kubenswrapper[4762]: I1014 13:09:15.716041 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f82e0c58-e2a3-491a-bf03-ad47b38c5833-service-ca-bundle\") pod \"router-default-5ddb89f76-887cs\" (UID: \"f82e0c58-e2a3-491a-bf03-ad47b38c5833\") " pod="openshift-ingress/router-default-5ddb89f76-887cs" Oct 14 13:09:15.716192 master-2 kubenswrapper[4762]: I1014 13:09:15.716080 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f82e0c58-e2a3-491a-bf03-ad47b38c5833-stats-auth\") pod \"router-default-5ddb89f76-887cs\" (UID: \"f82e0c58-e2a3-491a-bf03-ad47b38c5833\") " pod="openshift-ingress/router-default-5ddb89f76-887cs" Oct 14 13:09:15.716343 master-2 kubenswrapper[4762]: I1014 13:09:15.716235 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgnx2\" (UniqueName: \"kubernetes.io/projected/f82e0c58-e2a3-491a-bf03-ad47b38c5833-kube-api-access-kgnx2\") pod \"router-default-5ddb89f76-887cs\" (UID: \"f82e0c58-e2a3-491a-bf03-ad47b38c5833\") " pod="openshift-ingress/router-default-5ddb89f76-887cs" Oct 14 13:09:15.717437 master-2 kubenswrapper[4762]: I1014 13:09:15.717253 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f82e0c58-e2a3-491a-bf03-ad47b38c5833-service-ca-bundle\") pod \"router-default-5ddb89f76-887cs\" (UID: \"f82e0c58-e2a3-491a-bf03-ad47b38c5833\") " pod="openshift-ingress/router-default-5ddb89f76-887cs" Oct 14 13:09:15.733203 master-2 kubenswrapper[4762]: I1014 13:09:15.722186 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f82e0c58-e2a3-491a-bf03-ad47b38c5833-default-certificate\") pod \"router-default-5ddb89f76-887cs\" (UID: \"f82e0c58-e2a3-491a-bf03-ad47b38c5833\") " pod="openshift-ingress/router-default-5ddb89f76-887cs" Oct 14 13:09:15.733203 master-2 kubenswrapper[4762]: I1014 13:09:15.722540 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f82e0c58-e2a3-491a-bf03-ad47b38c5833-stats-auth\") pod \"router-default-5ddb89f76-887cs\" (UID: \"f82e0c58-e2a3-491a-bf03-ad47b38c5833\") " pod="openshift-ingress/router-default-5ddb89f76-887cs" Oct 14 13:09:15.733203 master-2 kubenswrapper[4762]: I1014 13:09:15.727173 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/abcbfb46-51e9-40ed-8f92-415d25d25b53-tls-certificates\") pod \"prometheus-operator-admission-webhook-79d5f95f5c-btmxj\" (UID: \"abcbfb46-51e9-40ed-8f92-415d25d25b53\") 
" pod="openshift-monitoring/prometheus-operator-admission-webhook-79d5f95f5c-btmxj" Oct 14 13:09:15.733203 master-2 kubenswrapper[4762]: I1014 13:09:15.733043 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f82e0c58-e2a3-491a-bf03-ad47b38c5833-metrics-certs\") pod \"router-default-5ddb89f76-887cs\" (UID: \"f82e0c58-e2a3-491a-bf03-ad47b38c5833\") " pod="openshift-ingress/router-default-5ddb89f76-887cs" Oct 14 13:09:15.749648 master-2 kubenswrapper[4762]: I1014 13:09:15.749351 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgnx2\" (UniqueName: \"kubernetes.io/projected/f82e0c58-e2a3-491a-bf03-ad47b38c5833-kube-api-access-kgnx2\") pod \"router-default-5ddb89f76-887cs\" (UID: \"f82e0c58-e2a3-491a-bf03-ad47b38c5833\") " pod="openshift-ingress/router-default-5ddb89f76-887cs" Oct 14 13:09:15.854406 master-2 kubenswrapper[4762]: I1014 13:09:15.854313 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-frksz"] Oct 14 13:09:15.859602 master-2 kubenswrapper[4762]: W1014 13:09:15.859562 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1458907f_e285_4301_8542_0b46ac67b02d.slice/crio-953d4365c58fad8ce4ced1af6306cf2b2fb1b3f6643be9c4a60973da8ff31def WatchSource:0}: Error finding container 953d4365c58fad8ce4ced1af6306cf2b2fb1b3f6643be9c4a60973da8ff31def: Status 404 returned error can't find the container with id 953d4365c58fad8ce4ced1af6306cf2b2fb1b3f6643be9c4a60973da8ff31def Oct 14 13:09:15.894078 master-2 kubenswrapper[4762]: I1014 13:09:15.894019 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5ddb89f76-887cs" Oct 14 13:09:15.900034 master-2 kubenswrapper[4762]: I1014 13:09:15.899975 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-79d5f95f5c-btmxj" Oct 14 13:09:15.907647 master-2 kubenswrapper[4762]: W1014 13:09:15.907604 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf82e0c58_e2a3_491a_bf03_ad47b38c5833.slice/crio-0807d1e761f8d955f6d2fdf7ef2392cbf14cb295b63cf5aefed266be7c08af7c WatchSource:0}: Error finding container 0807d1e761f8d955f6d2fdf7ef2392cbf14cb295b63cf5aefed266be7c08af7c: Status 404 returned error can't find the container with id 0807d1e761f8d955f6d2fdf7ef2392cbf14cb295b63cf5aefed266be7c08af7c Oct 14 13:09:16.282685 master-2 kubenswrapper[4762]: I1014 13:09:16.282482 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-79d5f95f5c-btmxj"] Oct 14 13:09:16.289503 master-2 kubenswrapper[4762]: W1014 13:09:16.289445 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabcbfb46_51e9_40ed_8f92_415d25d25b53.slice/crio-42a92faebd8fe46e8d3167c30f5fc0e52153aeb1b374b88926325175ee7e3400 WatchSource:0}: Error finding container 42a92faebd8fe46e8d3167c30f5fc0e52153aeb1b374b88926325175ee7e3400: Status 404 returned error can't find the container with id 42a92faebd8fe46e8d3167c30f5fc0e52153aeb1b374b88926325175ee7e3400 Oct 14 13:09:16.345102 master-2 kubenswrapper[4762]: I1014 13:09:16.345012 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xl9gv"] Oct 14 13:09:16.346371 master-2 kubenswrapper[4762]: I1014 13:09:16.346330 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xl9gv" Oct 14 13:09:16.356349 master-2 kubenswrapper[4762]: I1014 13:09:16.356251 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xl9gv"] Oct 14 13:09:16.419033 master-2 kubenswrapper[4762]: I1014 13:09:16.418976 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-79d5f95f5c-btmxj" event={"ID":"abcbfb46-51e9-40ed-8f92-415d25d25b53","Type":"ContainerStarted","Data":"42a92faebd8fe46e8d3167c30f5fc0e52153aeb1b374b88926325175ee7e3400"} Oct 14 13:09:16.421100 master-2 kubenswrapper[4762]: I1014 13:09:16.421069 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5ddb89f76-887cs" event={"ID":"f82e0c58-e2a3-491a-bf03-ad47b38c5833","Type":"ContainerStarted","Data":"0807d1e761f8d955f6d2fdf7ef2392cbf14cb295b63cf5aefed266be7c08af7c"} Oct 14 13:09:16.422406 master-2 kubenswrapper[4762]: I1014 13:09:16.422381 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c551a119-e58d-46c3-9f81-7c0400c70c27-catalog-content\") pod \"redhat-operators-xl9gv\" (UID: \"c551a119-e58d-46c3-9f81-7c0400c70c27\") " pod="openshift-marketplace/redhat-operators-xl9gv" Oct 14 13:09:16.422577 master-2 kubenswrapper[4762]: I1014 13:09:16.422441 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c551a119-e58d-46c3-9f81-7c0400c70c27-utilities\") pod \"redhat-operators-xl9gv\" (UID: \"c551a119-e58d-46c3-9f81-7c0400c70c27\") " pod="openshift-marketplace/redhat-operators-xl9gv" Oct 14 13:09:16.422577 master-2 
kubenswrapper[4762]: I1014 13:09:16.422459 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5f4n\" (UniqueName: \"kubernetes.io/projected/c551a119-e58d-46c3-9f81-7c0400c70c27-kube-api-access-g5f4n\") pod \"redhat-operators-xl9gv\" (UID: \"c551a119-e58d-46c3-9f81-7c0400c70c27\") " pod="openshift-marketplace/redhat-operators-xl9gv" Oct 14 13:09:16.422757 master-2 kubenswrapper[4762]: I1014 13:09:16.422727 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-frksz" event={"ID":"1458907f-e285-4301-8542-0b46ac67b02d","Type":"ContainerStarted","Data":"953d4365c58fad8ce4ced1af6306cf2b2fb1b3f6643be9c4a60973da8ff31def"} Oct 14 13:09:16.523598 master-2 kubenswrapper[4762]: I1014 13:09:16.523531 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c551a119-e58d-46c3-9f81-7c0400c70c27-catalog-content\") pod \"redhat-operators-xl9gv\" (UID: \"c551a119-e58d-46c3-9f81-7c0400c70c27\") " pod="openshift-marketplace/redhat-operators-xl9gv" Oct 14 13:09:16.523823 master-2 kubenswrapper[4762]: I1014 13:09:16.523654 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c551a119-e58d-46c3-9f81-7c0400c70c27-utilities\") pod \"redhat-operators-xl9gv\" (UID: \"c551a119-e58d-46c3-9f81-7c0400c70c27\") " pod="openshift-marketplace/redhat-operators-xl9gv" Oct 14 13:09:16.523823 master-2 kubenswrapper[4762]: I1014 13:09:16.523691 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5f4n\" (UniqueName: \"kubernetes.io/projected/c551a119-e58d-46c3-9f81-7c0400c70c27-kube-api-access-g5f4n\") pod \"redhat-operators-xl9gv\" (UID: \"c551a119-e58d-46c3-9f81-7c0400c70c27\") " pod="openshift-marketplace/redhat-operators-xl9gv" Oct 14 13:09:16.524456 master-2 kubenswrapper[4762]: I1014 13:09:16.524413 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c551a119-e58d-46c3-9f81-7c0400c70c27-catalog-content\") pod \"redhat-operators-xl9gv\" (UID: \"c551a119-e58d-46c3-9f81-7c0400c70c27\") " pod="openshift-marketplace/redhat-operators-xl9gv" Oct 14 13:09:16.524519 master-2 kubenswrapper[4762]: I1014 13:09:16.524463 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c551a119-e58d-46c3-9f81-7c0400c70c27-utilities\") pod \"redhat-operators-xl9gv\" (UID: \"c551a119-e58d-46c3-9f81-7c0400c70c27\") " pod="openshift-marketplace/redhat-operators-xl9gv" Oct 14 13:09:16.547715 master-2 kubenswrapper[4762]: I1014 13:09:16.547642 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5f4n\" (UniqueName: \"kubernetes.io/projected/c551a119-e58d-46c3-9f81-7c0400c70c27-kube-api-access-g5f4n\") pod \"redhat-operators-xl9gv\" (UID: \"c551a119-e58d-46c3-9f81-7c0400c70c27\") " pod="openshift-marketplace/redhat-operators-xl9gv" Oct 14 13:09:16.665437 master-2 kubenswrapper[4762]: I1014 13:09:16.665376 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xl9gv" Oct 14 13:09:16.838855 master-2 kubenswrapper[4762]: I1014 13:09:16.838744 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-zxkbj"] Oct 14 13:09:16.839397 master-2 kubenswrapper[4762]: I1014 13:09:16.839370 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-zxkbj" Oct 14 13:09:16.842730 master-2 kubenswrapper[4762]: I1014 13:09:16.842549 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Oct 14 13:09:16.843297 master-2 kubenswrapper[4762]: I1014 13:09:16.842749 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Oct 14 13:09:16.928852 master-2 kubenswrapper[4762]: I1014 13:09:16.928790 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drcms\" (UniqueName: \"kubernetes.io/projected/063758c3-98fe-4ac4-b1c2-beef7ef6dcdc-kube-api-access-drcms\") pod \"machine-config-server-zxkbj\" (UID: \"063758c3-98fe-4ac4-b1c2-beef7ef6dcdc\") " pod="openshift-machine-config-operator/machine-config-server-zxkbj" Oct 14 13:09:16.929058 master-2 kubenswrapper[4762]: I1014 13:09:16.928866 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/063758c3-98fe-4ac4-b1c2-beef7ef6dcdc-node-bootstrap-token\") pod \"machine-config-server-zxkbj\" (UID: \"063758c3-98fe-4ac4-b1c2-beef7ef6dcdc\") " pod="openshift-machine-config-operator/machine-config-server-zxkbj" Oct 14 13:09:16.929058 master-2 kubenswrapper[4762]: I1014 13:09:16.928896 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/063758c3-98fe-4ac4-b1c2-beef7ef6dcdc-certs\") pod \"machine-config-server-zxkbj\" (UID: \"063758c3-98fe-4ac4-b1c2-beef7ef6dcdc\") " pod="openshift-machine-config-operator/machine-config-server-zxkbj" Oct 14 13:09:17.030179 master-2 kubenswrapper[4762]: I1014 13:09:17.030104 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drcms\" (UniqueName: \"kubernetes.io/projected/063758c3-98fe-4ac4-b1c2-beef7ef6dcdc-kube-api-access-drcms\") pod \"machine-config-server-zxkbj\" (UID: \"063758c3-98fe-4ac4-b1c2-beef7ef6dcdc\") " pod="openshift-machine-config-operator/machine-config-server-zxkbj" Oct 14 13:09:17.030179 master-2 kubenswrapper[4762]: I1014 13:09:17.030192 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/063758c3-98fe-4ac4-b1c2-beef7ef6dcdc-node-bootstrap-token\") pod \"machine-config-server-zxkbj\" (UID: \"063758c3-98fe-4ac4-b1c2-beef7ef6dcdc\") " pod="openshift-machine-config-operator/machine-config-server-zxkbj" Oct 14 13:09:17.030528 master-2 kubenswrapper[4762]: I1014 13:09:17.030234 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/063758c3-98fe-4ac4-b1c2-beef7ef6dcdc-certs\") pod \"machine-config-server-zxkbj\" (UID: \"063758c3-98fe-4ac4-b1c2-beef7ef6dcdc\") " pod="openshift-machine-config-operator/machine-config-server-zxkbj" Oct 14 13:09:17.035607 master-2 kubenswrapper[4762]: 
I1014 13:09:17.035572 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/063758c3-98fe-4ac4-b1c2-beef7ef6dcdc-certs\") pod \"machine-config-server-zxkbj\" (UID: \"063758c3-98fe-4ac4-b1c2-beef7ef6dcdc\") " pod="openshift-machine-config-operator/machine-config-server-zxkbj" Oct 14 13:09:17.036410 master-2 kubenswrapper[4762]: I1014 13:09:17.036378 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/063758c3-98fe-4ac4-b1c2-beef7ef6dcdc-node-bootstrap-token\") pod \"machine-config-server-zxkbj\" (UID: \"063758c3-98fe-4ac4-b1c2-beef7ef6dcdc\") " pod="openshift-machine-config-operator/machine-config-server-zxkbj" Oct 14 13:09:17.060314 master-2 kubenswrapper[4762]: I1014 13:09:17.060268 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drcms\" (UniqueName: \"kubernetes.io/projected/063758c3-98fe-4ac4-b1c2-beef7ef6dcdc-kube-api-access-drcms\") pod \"machine-config-server-zxkbj\" (UID: \"063758c3-98fe-4ac4-b1c2-beef7ef6dcdc\") " pod="openshift-machine-config-operator/machine-config-server-zxkbj" Oct 14 13:09:17.114959 master-2 kubenswrapper[4762]: I1014 13:09:17.114805 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xl9gv"] Oct 14 13:09:17.153667 master-2 kubenswrapper[4762]: I1014 13:09:17.153611 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-zxkbj" Oct 14 13:09:17.299000 master-2 kubenswrapper[4762]: W1014 13:09:17.298928 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc551a119_e58d_46c3_9f81_7c0400c70c27.slice/crio-bf214f48ac34db90286eb9ca280e23dfb931c49f0756b8ace1917791aaf453a6 WatchSource:0}: Error finding container bf214f48ac34db90286eb9ca280e23dfb931c49f0756b8ace1917791aaf453a6: Status 404 returned error can't find the container with id bf214f48ac34db90286eb9ca280e23dfb931c49f0756b8ace1917791aaf453a6 Oct 14 13:09:17.431753 master-2 kubenswrapper[4762]: I1014 13:09:17.431699 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xl9gv" event={"ID":"c551a119-e58d-46c3-9f81-7c0400c70c27","Type":"ContainerStarted","Data":"bf214f48ac34db90286eb9ca280e23dfb931c49f0756b8ace1917791aaf453a6"} Oct 14 13:09:17.841357 master-2 kubenswrapper[4762]: I1014 13:09:17.841000 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c5e8cdcd-bc1f-4b38-a834-809b79de4fd9-metrics-certs\") pod \"network-metrics-daemon-b84p7\" (UID: \"c5e8cdcd-bc1f-4b38-a834-809b79de4fd9\") " pod="openshift-multus/network-metrics-daemon-b84p7" Oct 14 13:09:17.844650 master-2 kubenswrapper[4762]: I1014 13:09:17.844599 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c5e8cdcd-bc1f-4b38-a834-809b79de4fd9-metrics-certs\") pod \"network-metrics-daemon-b84p7\" (UID: \"c5e8cdcd-bc1f-4b38-a834-809b79de4fd9\") " pod="openshift-multus/network-metrics-daemon-b84p7" Oct 14 13:09:18.063708 master-2 kubenswrapper[4762]: I1014 13:09:18.063627 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-b84p7" Oct 14 13:09:18.106777 master-2 kubenswrapper[4762]: W1014 13:09:18.106565 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod063758c3_98fe_4ac4_b1c2_beef7ef6dcdc.slice/crio-725ef9561aeebea8c3c3c5d0d9400272ea344b5a1226f01e8c073ac3f01162c6 WatchSource:0}: Error finding container 725ef9561aeebea8c3c3c5d0d9400272ea344b5a1226f01e8c073ac3f01162c6: Status 404 returned error can't find the container with id 725ef9561aeebea8c3c3c5d0d9400272ea344b5a1226f01e8c073ac3f01162c6 Oct 14 13:09:18.308420 master-2 kubenswrapper[4762]: I1014 13:09:18.308369 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-b84p7"] Oct 14 13:09:18.315809 master-2 kubenswrapper[4762]: W1014 13:09:18.315756 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5e8cdcd_bc1f_4b38_a834_809b79de4fd9.slice/crio-ff4d93f7542e6ce8215770ef86728642ed84b257bc72eafa100640c23c9fa77f WatchSource:0}: Error finding container ff4d93f7542e6ce8215770ef86728642ed84b257bc72eafa100640c23c9fa77f: Status 404 returned error can't find the container with id ff4d93f7542e6ce8215770ef86728642ed84b257bc72eafa100640c23c9fa77f Oct 14 13:09:18.441452 master-2 kubenswrapper[4762]: I1014 13:09:18.441382 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5ddb89f76-887cs" event={"ID":"f82e0c58-e2a3-491a-bf03-ad47b38c5833","Type":"ContainerStarted","Data":"55da1c19b96c1c89292ad340fab59b6898e10e1d95dce9f948d8c32e32bcd047"} Oct 14 13:09:18.443679 master-2 kubenswrapper[4762]: I1014 13:09:18.443138 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-zxkbj" event={"ID":"063758c3-98fe-4ac4-b1c2-beef7ef6dcdc","Type":"ContainerStarted","Data":"72b1bce97ab52d06d3b7c159775e2357fbd174f8c0dc525150d312ab6c8efc8d"} Oct 14 13:09:18.443679 master-2 kubenswrapper[4762]: I1014 13:09:18.443191 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-zxkbj" event={"ID":"063758c3-98fe-4ac4-b1c2-beef7ef6dcdc","Type":"ContainerStarted","Data":"725ef9561aeebea8c3c3c5d0d9400272ea344b5a1226f01e8c073ac3f01162c6"} Oct 14 13:09:18.447791 master-2 kubenswrapper[4762]: I1014 13:09:18.444820 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-79d5f95f5c-btmxj" event={"ID":"abcbfb46-51e9-40ed-8f92-415d25d25b53","Type":"ContainerStarted","Data":"96bc5fd6e88248b75773754d6f9da4635ac86677921d880dbee52f06561dfb38"} Oct 14 13:09:18.447791 master-2 kubenswrapper[4762]: I1014 13:09:18.445580 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-79d5f95f5c-btmxj" Oct 14 13:09:18.447791 master-2 kubenswrapper[4762]: I1014 13:09:18.447226 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-b84p7" event={"ID":"c5e8cdcd-bc1f-4b38-a834-809b79de4fd9","Type":"ContainerStarted","Data":"ff4d93f7542e6ce8215770ef86728642ed84b257bc72eafa100640c23c9fa77f"} Oct 14 13:09:18.467652 master-2 kubenswrapper[4762]: I1014 13:09:18.467578 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-79d5f95f5c-btmxj" 
Oct 14 13:09:18.498849 master-2 kubenswrapper[4762]: I1014 13:09:18.498663 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-79d5f95f5c-btmxj" podStartSLOduration=8.688362195 podStartE2EDuration="10.498639572s" podCreationTimestamp="2025-10-14 13:09:08 +0000 UTC" firstStartedPulling="2025-10-14 13:09:16.293452813 +0000 UTC m=+185.537611972" lastFinishedPulling="2025-10-14 13:09:18.1037302 +0000 UTC m=+187.347889349" observedRunningTime="2025-10-14 13:09:18.49699591 +0000 UTC m=+187.741155089" watchObservedRunningTime="2025-10-14 13:09:18.498639572 +0000 UTC m=+187.742798741" Oct 14 13:09:18.504849 master-2 kubenswrapper[4762]: I1014 13:09:18.503855 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5ddb89f76-887cs" podStartSLOduration=29.28803642 podStartE2EDuration="31.503824657s" podCreationTimestamp="2025-10-14 13:08:47 +0000 UTC" firstStartedPulling="2025-10-14 13:09:15.91107426 +0000 UTC m=+185.155233429" lastFinishedPulling="2025-10-14 13:09:18.126862507 +0000 UTC m=+187.371021666" observedRunningTime="2025-10-14 13:09:18.470373681 +0000 UTC m=+187.714532840" watchObservedRunningTime="2025-10-14 13:09:18.503824657 +0000 UTC m=+187.747983816" Oct 14 13:09:18.532972 master-2 kubenswrapper[4762]: I1014 13:09:18.516752 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-zxkbj" podStartSLOduration=2.516709177 podStartE2EDuration="2.516709177s" podCreationTimestamp="2025-10-14 13:09:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:09:18.515371005 +0000 UTC m=+187.759530164" watchObservedRunningTime="2025-10-14 13:09:18.516709177 +0000 UTC m=+187.760868336" Oct 14 13:09:18.894945 master-2 kubenswrapper[4762]: I1014 13:09:18.894824 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5ddb89f76-887cs" Oct 14 13:09:18.897683 master-2 kubenswrapper[4762]: I1014 13:09:18.897649 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:09:18.897683 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:09:18.897683 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:09:18.897683 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:09:18.897816 master-2 kubenswrapper[4762]: I1014 13:09:18.897693 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:09:19.897007 master-2 kubenswrapper[4762]: I1014 13:09:19.896942 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:09:19.897007 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:09:19.897007 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:09:19.897007 master-2 kubenswrapper[4762]: 
healthz check failed Oct 14 13:09:19.897734 master-2 kubenswrapper[4762]: I1014 13:09:19.897031 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:09:20.897761 master-2 kubenswrapper[4762]: I1014 13:09:20.897566 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:09:20.897761 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:09:20.897761 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:09:20.897761 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:09:20.897761 master-2 kubenswrapper[4762]: I1014 13:09:20.897652 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:09:21.897413 master-2 kubenswrapper[4762]: I1014 13:09:21.897331 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:09:21.897413 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:09:21.897413 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:09:21.897413 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:09:21.898847 master-2 kubenswrapper[4762]: I1014 13:09:21.897421 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:09:22.898055 master-2 kubenswrapper[4762]: I1014 13:09:22.897907 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:09:22.898055 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:09:22.898055 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:09:22.898055 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:09:22.898055 master-2 kubenswrapper[4762]: I1014 13:09:22.897981 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:09:23.548735 master-2 kubenswrapper[4762]: I1014 13:09:23.548629 4762 scope.go:117] "RemoveContainer" containerID="0a39b10e2fb121ce0536d44cd9b7911391264299af43b6661cf503ece1eed26a" Oct 14 13:09:23.549253 master-2 kubenswrapper[4762]: E1014 13:09:23.549032 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-rbac-proxy 
pod=cluster-cloud-controller-manager-operator-779749f859-bscv5_openshift-cloud-controller-manager-operator(18346e46-a062-4e0d-b90a-c05646a46c7e)\"" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-779749f859-bscv5" podUID="18346e46-a062-4e0d-b90a-c05646a46c7e" Oct 14 13:09:23.897215 master-2 kubenswrapper[4762]: I1014 13:09:23.897056 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:09:23.897215 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:09:23.897215 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:09:23.897215 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:09:23.897715 master-2 kubenswrapper[4762]: I1014 13:09:23.897666 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:09:24.212998 master-2 kubenswrapper[4762]: I1014 13:09:24.212900 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/db8f34cd-ecf2-4682-b1fd-b8e335369cb9-client-ca\") pod \"route-controller-manager-6f6c689d49-xd4xv\" (UID: \"db8f34cd-ecf2-4682-b1fd-b8e335369cb9\") " pod="openshift-route-controller-manager/route-controller-manager-6f6c689d49-xd4xv" Oct 14 13:09:24.213827 master-2 kubenswrapper[4762]: E1014 13:09:24.213105 4762 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Oct 14 13:09:24.213827 master-2 kubenswrapper[4762]: E1014 13:09:24.213257 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/db8f34cd-ecf2-4682-b1fd-b8e335369cb9-client-ca podName:db8f34cd-ecf2-4682-b1fd-b8e335369cb9 nodeName:}" failed. No retries permitted until 2025-10-14 13:09:56.213229544 +0000 UTC m=+225.457388743 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/db8f34cd-ecf2-4682-b1fd-b8e335369cb9-client-ca") pod "route-controller-manager-6f6c689d49-xd4xv" (UID: "db8f34cd-ecf2-4682-b1fd-b8e335369cb9") : configmap "client-ca" not found Oct 14 13:09:24.898487 master-2 kubenswrapper[4762]: I1014 13:09:24.898374 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:09:24.898487 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:09:24.898487 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:09:24.898487 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:09:24.898894 master-2 kubenswrapper[4762]: I1014 13:09:24.898499 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:09:25.894797 master-2 kubenswrapper[4762]: I1014 13:09:25.894686 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5ddb89f76-887cs" Oct 14 13:09:25.897496 master-2 kubenswrapper[4762]: I1014 13:09:25.897408 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:09:25.897496 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:09:25.897496 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:09:25.897496 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:09:25.897496 master-2 kubenswrapper[4762]: I1014 13:09:25.897475 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:09:26.900116 master-2 kubenswrapper[4762]: I1014 13:09:26.900056 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:09:26.900116 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:09:26.900116 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:09:26.900116 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:09:26.913837 master-2 kubenswrapper[4762]: I1014 13:09:26.900143 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:09:27.490300 master-2 kubenswrapper[4762]: I1014 13:09:27.489860 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-b84p7" event={"ID":"c5e8cdcd-bc1f-4b38-a834-809b79de4fd9","Type":"ContainerStarted","Data":"2479e2ebadf4ed2bc76ba9117cbaa54c1673aa7525aacd4a10d7ad1a556b8352"} Oct 14 13:09:27.490300 master-2 kubenswrapper[4762]: I1014 13:09:27.489959 4762 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-b84p7" event={"ID":"c5e8cdcd-bc1f-4b38-a834-809b79de4fd9","Type":"ContainerStarted","Data":"9f1c1ccf611d06075ab7fe5c084fecbc134459c290521e809ade52bfc8f13192"} Oct 14 13:09:27.492516 master-2 kubenswrapper[4762]: I1014 13:09:27.492439 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-6f5778dccb-9sfms" event={"ID":"15a54d1d-6715-4afe-b6aa-8765dc254e96","Type":"ContainerStarted","Data":"73b6135907d3c83fe787d30e886e4d53aac19060092d92123613ab234c45550f"} Oct 14 13:09:27.492745 master-2 kubenswrapper[4762]: I1014 13:09:27.492677 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-6f5778dccb-9sfms" Oct 14 13:09:27.496062 master-2 kubenswrapper[4762]: I1014 13:09:27.495906 4762 generic.go:334] "Generic (PLEG): container finished" podID="de57a213-4820-46c7-9506-4c3ea762d75f" containerID="a1afc75f8784267a38b3ea0bb01ba75e05baaa4a0fd1249098b107d427546e91" exitCode=0 Oct 14 13:09:27.496062 master-2 kubenswrapper[4762]: I1014 13:09:27.495955 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kpbmd" event={"ID":"de57a213-4820-46c7-9506-4c3ea762d75f","Type":"ContainerDied","Data":"a1afc75f8784267a38b3ea0bb01ba75e05baaa4a0fd1249098b107d427546e91"} Oct 14 13:09:27.498304 master-2 kubenswrapper[4762]: I1014 13:09:27.498260 4762 generic.go:334] "Generic (PLEG): container finished" podID="4ffcca20-3c44-4e24-92c8-15d3dc0625e4" containerID="a6622de65bab02e973869481309ca216d3dc68faab59179b0e0b5cb3eaa9812b" exitCode=0 Oct 14 13:09:27.498583 master-2 kubenswrapper[4762]: I1014 13:09:27.498349 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cf69d" event={"ID":"4ffcca20-3c44-4e24-92c8-15d3dc0625e4","Type":"ContainerDied","Data":"a6622de65bab02e973869481309ca216d3dc68faab59179b0e0b5cb3eaa9812b"} Oct 14 13:09:27.500144 master-2 kubenswrapper[4762]: I1014 13:09:27.500042 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-6f5778dccb-9sfms" Oct 14 13:09:27.500726 master-2 kubenswrapper[4762]: I1014 13:09:27.500665 4762 generic.go:334] "Generic (PLEG): container finished" podID="1458907f-e285-4301-8542-0b46ac67b02d" containerID="a7b00be1b07c6e363ee5deeb0588993ca6554ece822cca2235a5c4481a84835f" exitCode=0 Oct 14 13:09:27.500789 master-2 kubenswrapper[4762]: I1014 13:09:27.500701 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-frksz" event={"ID":"1458907f-e285-4301-8542-0b46ac67b02d","Type":"ContainerDied","Data":"a7b00be1b07c6e363ee5deeb0588993ca6554ece822cca2235a5c4481a84835f"} Oct 14 13:09:27.503571 master-2 kubenswrapper[4762]: I1014 13:09:27.503541 4762 generic.go:334] "Generic (PLEG): container finished" podID="c551a119-e58d-46c3-9f81-7c0400c70c27" containerID="232ba9558afacb07a7fe8978115f3e82dfa3023983c71a558160c05a5e1df729" exitCode=0 Oct 14 13:09:27.503637 master-2 kubenswrapper[4762]: I1014 13:09:27.503574 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xl9gv" event={"ID":"c551a119-e58d-46c3-9f81-7c0400c70c27","Type":"ContainerDied","Data":"232ba9558afacb07a7fe8978115f3e82dfa3023983c71a558160c05a5e1df729"} Oct 14 13:09:27.513427 master-2 kubenswrapper[4762]: I1014 13:09:27.511636 4762 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-b84p7" podStartSLOduration=130.117593723 podStartE2EDuration="2m18.511618814s" podCreationTimestamp="2025-10-14 13:07:09 +0000 UTC" firstStartedPulling="2025-10-14 13:09:18.318461551 +0000 UTC m=+187.562620710" lastFinishedPulling="2025-10-14 13:09:26.712486602 +0000 UTC m=+195.956645801" observedRunningTime="2025-10-14 13:09:27.507692008 +0000 UTC m=+196.751851177" watchObservedRunningTime="2025-10-14 13:09:27.511618814 +0000 UTC m=+196.755777983" Oct 14 13:09:27.624488 master-2 kubenswrapper[4762]: I1014 13:09:27.624321 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-6f5778dccb-9sfms" podStartSLOduration=1.8427460930000001 podStartE2EDuration="13.624287222s" podCreationTimestamp="2025-10-14 13:09:14 +0000 UTC" firstStartedPulling="2025-10-14 13:09:14.986236784 +0000 UTC m=+184.230395933" lastFinishedPulling="2025-10-14 13:09:26.767777903 +0000 UTC m=+196.011937062" observedRunningTime="2025-10-14 13:09:27.587985596 +0000 UTC m=+196.832144785" watchObservedRunningTime="2025-10-14 13:09:27.624287222 +0000 UTC m=+196.868446411" Oct 14 13:09:27.898953 master-2 kubenswrapper[4762]: I1014 13:09:27.898745 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:09:27.898953 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:09:27.898953 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:09:27.898953 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:09:27.898953 master-2 kubenswrapper[4762]: I1014 13:09:27.898872 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:09:28.897465 master-2 kubenswrapper[4762]: I1014 13:09:28.897413 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:09:28.897465 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:09:28.897465 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:09:28.897465 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:09:28.898251 master-2 kubenswrapper[4762]: I1014 13:09:28.897490 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:09:29.897758 master-2 kubenswrapper[4762]: I1014 13:09:29.896786 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:09:29.897758 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:09:29.897758 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:09:29.897758 master-2 
kubenswrapper[4762]: healthz check failed Oct 14 13:09:29.897758 master-2 kubenswrapper[4762]: I1014 13:09:29.896844 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:09:30.611100 master-2 kubenswrapper[4762]: I1014 13:09:30.611030 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-574d7f8db8-gbr5b"] Oct 14 13:09:30.611828 master-2 kubenswrapper[4762]: I1014 13:09:30.611796 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-574d7f8db8-gbr5b" Oct 14 13:09:30.618122 master-2 kubenswrapper[4762]: I1014 13:09:30.618082 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config" Oct 14 13:09:30.621860 master-2 kubenswrapper[4762]: I1014 13:09:30.621776 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Oct 14 13:09:30.623315 master-2 kubenswrapper[4762]: I1014 13:09:30.622341 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca" Oct 14 13:09:30.623315 master-2 kubenswrapper[4762]: I1014 13:09:30.622800 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls" Oct 14 13:09:30.627777 master-2 kubenswrapper[4762]: I1014 13:09:30.627717 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-574d7f8db8-gbr5b"] Oct 14 13:09:30.638036 master-2 kubenswrapper[4762]: I1014 13:09:30.637975 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Oct 14 13:09:30.684682 master-2 kubenswrapper[4762]: I1014 13:09:30.684618 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/34749ef7-edb0-466b-a317-bb788dc5b851-prometheus-operator-tls\") pod \"prometheus-operator-574d7f8db8-gbr5b\" (UID: \"34749ef7-edb0-466b-a317-bb788dc5b851\") " pod="openshift-monitoring/prometheus-operator-574d7f8db8-gbr5b" Oct 14 13:09:30.684682 master-2 kubenswrapper[4762]: I1014 13:09:30.684686 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/34749ef7-edb0-466b-a317-bb788dc5b851-metrics-client-ca\") pod \"prometheus-operator-574d7f8db8-gbr5b\" (UID: \"34749ef7-edb0-466b-a317-bb788dc5b851\") " pod="openshift-monitoring/prometheus-operator-574d7f8db8-gbr5b" Oct 14 13:09:30.684986 master-2 kubenswrapper[4762]: I1014 13:09:30.684730 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/34749ef7-edb0-466b-a317-bb788dc5b851-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-574d7f8db8-gbr5b\" (UID: \"34749ef7-edb0-466b-a317-bb788dc5b851\") " pod="openshift-monitoring/prometheus-operator-574d7f8db8-gbr5b" Oct 14 13:09:30.684986 master-2 kubenswrapper[4762]: I1014 13:09:30.684785 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2rgr\" 
(UniqueName: \"kubernetes.io/projected/34749ef7-edb0-466b-a317-bb788dc5b851-kube-api-access-h2rgr\") pod \"prometheus-operator-574d7f8db8-gbr5b\" (UID: \"34749ef7-edb0-466b-a317-bb788dc5b851\") " pod="openshift-monitoring/prometheus-operator-574d7f8db8-gbr5b" Oct 14 13:09:30.785454 master-2 kubenswrapper[4762]: I1014 13:09:30.785387 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/34749ef7-edb0-466b-a317-bb788dc5b851-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-574d7f8db8-gbr5b\" (UID: \"34749ef7-edb0-466b-a317-bb788dc5b851\") " pod="openshift-monitoring/prometheus-operator-574d7f8db8-gbr5b" Oct 14 13:09:30.785454 master-2 kubenswrapper[4762]: I1014 13:09:30.785453 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2rgr\" (UniqueName: \"kubernetes.io/projected/34749ef7-edb0-466b-a317-bb788dc5b851-kube-api-access-h2rgr\") pod \"prometheus-operator-574d7f8db8-gbr5b\" (UID: \"34749ef7-edb0-466b-a317-bb788dc5b851\") " pod="openshift-monitoring/prometheus-operator-574d7f8db8-gbr5b" Oct 14 13:09:30.785796 master-2 kubenswrapper[4762]: I1014 13:09:30.785511 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/34749ef7-edb0-466b-a317-bb788dc5b851-prometheus-operator-tls\") pod \"prometheus-operator-574d7f8db8-gbr5b\" (UID: \"34749ef7-edb0-466b-a317-bb788dc5b851\") " pod="openshift-monitoring/prometheus-operator-574d7f8db8-gbr5b" Oct 14 13:09:30.785796 master-2 kubenswrapper[4762]: I1014 13:09:30.785531 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/34749ef7-edb0-466b-a317-bb788dc5b851-metrics-client-ca\") pod \"prometheus-operator-574d7f8db8-gbr5b\" (UID: \"34749ef7-edb0-466b-a317-bb788dc5b851\") " pod="openshift-monitoring/prometheus-operator-574d7f8db8-gbr5b" Oct 14 13:09:30.786417 master-2 kubenswrapper[4762]: I1014 13:09:30.786383 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/34749ef7-edb0-466b-a317-bb788dc5b851-metrics-client-ca\") pod \"prometheus-operator-574d7f8db8-gbr5b\" (UID: \"34749ef7-edb0-466b-a317-bb788dc5b851\") " pod="openshift-monitoring/prometheus-operator-574d7f8db8-gbr5b" Oct 14 13:09:30.789208 master-2 kubenswrapper[4762]: I1014 13:09:30.789147 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/34749ef7-edb0-466b-a317-bb788dc5b851-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-574d7f8db8-gbr5b\" (UID: \"34749ef7-edb0-466b-a317-bb788dc5b851\") " pod="openshift-monitoring/prometheus-operator-574d7f8db8-gbr5b" Oct 14 13:09:30.789352 master-2 kubenswrapper[4762]: I1014 13:09:30.789221 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/34749ef7-edb0-466b-a317-bb788dc5b851-prometheus-operator-tls\") pod \"prometheus-operator-574d7f8db8-gbr5b\" (UID: \"34749ef7-edb0-466b-a317-bb788dc5b851\") " pod="openshift-monitoring/prometheus-operator-574d7f8db8-gbr5b" Oct 14 13:09:30.812380 master-2 kubenswrapper[4762]: I1014 13:09:30.812313 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-h2rgr\" (UniqueName: \"kubernetes.io/projected/34749ef7-edb0-466b-a317-bb788dc5b851-kube-api-access-h2rgr\") pod \"prometheus-operator-574d7f8db8-gbr5b\" (UID: \"34749ef7-edb0-466b-a317-bb788dc5b851\") " pod="openshift-monitoring/prometheus-operator-574d7f8db8-gbr5b" Oct 14 13:09:30.897323 master-2 kubenswrapper[4762]: I1014 13:09:30.897124 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:09:30.897323 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:09:30.897323 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:09:30.897323 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:09:30.897323 master-2 kubenswrapper[4762]: I1014 13:09:30.897217 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:09:30.935745 master-2 kubenswrapper[4762]: I1014 13:09:30.935659 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-574d7f8db8-gbr5b" Oct 14 13:09:31.360004 master-2 kubenswrapper[4762]: I1014 13:09:31.359928 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-574d7f8db8-gbr5b"] Oct 14 13:09:31.368927 master-2 kubenswrapper[4762]: W1014 13:09:31.368855 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34749ef7_edb0_466b_a317_bb788dc5b851.slice/crio-82a97c2a1e1fea5f344b24a5e83842552aa89bf650799fc76a9790fa1b5f58f9 WatchSource:0}: Error finding container 82a97c2a1e1fea5f344b24a5e83842552aa89bf650799fc76a9790fa1b5f58f9: Status 404 returned error can't find the container with id 82a97c2a1e1fea5f344b24a5e83842552aa89bf650799fc76a9790fa1b5f58f9 Oct 14 13:09:31.521357 master-2 kubenswrapper[4762]: I1014 13:09:31.521293 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-574d7f8db8-gbr5b" event={"ID":"34749ef7-edb0-466b-a317-bb788dc5b851","Type":"ContainerStarted","Data":"82a97c2a1e1fea5f344b24a5e83842552aa89bf650799fc76a9790fa1b5f58f9"} Oct 14 13:09:31.896461 master-2 kubenswrapper[4762]: I1014 13:09:31.896402 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:09:31.896461 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:09:31.896461 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:09:31.896461 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:09:31.896802 master-2 kubenswrapper[4762]: I1014 13:09:31.896471 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:09:32.897581 master-2 kubenswrapper[4762]: I1014 13:09:32.897530 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router 
namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:09:32.897581 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:09:32.897581 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:09:32.897581 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:09:32.898089 master-2 kubenswrapper[4762]: I1014 13:09:32.897602 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:09:33.182536 master-2 kubenswrapper[4762]: I1014 13:09:33.182494 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-sg92v"] Oct 14 13:09:33.183292 master-2 kubenswrapper[4762]: I1014 13:09:33.183269 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-sg92v" Oct 14 13:09:33.186909 master-2 kubenswrapper[4762]: I1014 13:09:33.186866 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Oct 14 13:09:33.212568 master-2 kubenswrapper[4762]: I1014 13:09:33.212530 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/970e8ee9-e505-4a07-9662-362652cf6b3b-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-sg92v\" (UID: \"970e8ee9-e505-4a07-9662-362652cf6b3b\") " pod="openshift-multus/cni-sysctl-allowlist-ds-sg92v" Oct 14 13:09:33.212640 master-2 kubenswrapper[4762]: I1014 13:09:33.212585 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rf2n2\" (UniqueName: \"kubernetes.io/projected/970e8ee9-e505-4a07-9662-362652cf6b3b-kube-api-access-rf2n2\") pod \"cni-sysctl-allowlist-ds-sg92v\" (UID: \"970e8ee9-e505-4a07-9662-362652cf6b3b\") " pod="openshift-multus/cni-sysctl-allowlist-ds-sg92v" Oct 14 13:09:33.212640 master-2 kubenswrapper[4762]: I1014 13:09:33.212630 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/970e8ee9-e505-4a07-9662-362652cf6b3b-ready\") pod \"cni-sysctl-allowlist-ds-sg92v\" (UID: \"970e8ee9-e505-4a07-9662-362652cf6b3b\") " pod="openshift-multus/cni-sysctl-allowlist-ds-sg92v" Oct 14 13:09:33.212704 master-2 kubenswrapper[4762]: I1014 13:09:33.212687 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/970e8ee9-e505-4a07-9662-362652cf6b3b-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-sg92v\" (UID: \"970e8ee9-e505-4a07-9662-362652cf6b3b\") " pod="openshift-multus/cni-sysctl-allowlist-ds-sg92v" Oct 14 13:09:33.313946 master-2 kubenswrapper[4762]: I1014 13:09:33.313888 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/970e8ee9-e505-4a07-9662-362652cf6b3b-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-sg92v\" (UID: \"970e8ee9-e505-4a07-9662-362652cf6b3b\") " pod="openshift-multus/cni-sysctl-allowlist-ds-sg92v" Oct 14 13:09:33.313946 master-2 kubenswrapper[4762]: I1014 13:09:33.313957 4762 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/970e8ee9-e505-4a07-9662-362652cf6b3b-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-sg92v\" (UID: \"970e8ee9-e505-4a07-9662-362652cf6b3b\") " pod="openshift-multus/cni-sysctl-allowlist-ds-sg92v" Oct 14 13:09:33.314232 master-2 kubenswrapper[4762]: I1014 13:09:33.313990 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rf2n2\" (UniqueName: \"kubernetes.io/projected/970e8ee9-e505-4a07-9662-362652cf6b3b-kube-api-access-rf2n2\") pod \"cni-sysctl-allowlist-ds-sg92v\" (UID: \"970e8ee9-e505-4a07-9662-362652cf6b3b\") " pod="openshift-multus/cni-sysctl-allowlist-ds-sg92v" Oct 14 13:09:33.314232 master-2 kubenswrapper[4762]: I1014 13:09:33.314047 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/970e8ee9-e505-4a07-9662-362652cf6b3b-ready\") pod \"cni-sysctl-allowlist-ds-sg92v\" (UID: \"970e8ee9-e505-4a07-9662-362652cf6b3b\") " pod="openshift-multus/cni-sysctl-allowlist-ds-sg92v" Oct 14 13:09:33.314232 master-2 kubenswrapper[4762]: I1014 13:09:33.314166 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/970e8ee9-e505-4a07-9662-362652cf6b3b-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-sg92v\" (UID: \"970e8ee9-e505-4a07-9662-362652cf6b3b\") " pod="openshift-multus/cni-sysctl-allowlist-ds-sg92v" Oct 14 13:09:33.314705 master-2 kubenswrapper[4762]: I1014 13:09:33.314682 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/970e8ee9-e505-4a07-9662-362652cf6b3b-ready\") pod \"cni-sysctl-allowlist-ds-sg92v\" (UID: \"970e8ee9-e505-4a07-9662-362652cf6b3b\") " pod="openshift-multus/cni-sysctl-allowlist-ds-sg92v" Oct 14 13:09:33.314791 master-2 kubenswrapper[4762]: I1014 13:09:33.314754 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/970e8ee9-e505-4a07-9662-362652cf6b3b-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-sg92v\" (UID: \"970e8ee9-e505-4a07-9662-362652cf6b3b\") " pod="openshift-multus/cni-sysctl-allowlist-ds-sg92v" Oct 14 13:09:33.331535 master-2 kubenswrapper[4762]: I1014 13:09:33.331501 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rf2n2\" (UniqueName: \"kubernetes.io/projected/970e8ee9-e505-4a07-9662-362652cf6b3b-kube-api-access-rf2n2\") pod \"cni-sysctl-allowlist-ds-sg92v\" (UID: \"970e8ee9-e505-4a07-9662-362652cf6b3b\") " pod="openshift-multus/cni-sysctl-allowlist-ds-sg92v" Oct 14 13:09:33.496660 master-2 kubenswrapper[4762]: I1014 13:09:33.496600 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-sg92v" Oct 14 13:09:33.510222 master-2 kubenswrapper[4762]: W1014 13:09:33.510133 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod970e8ee9_e505_4a07_9662_362652cf6b3b.slice/crio-b2752f3c8de711a76ebcd5756bfc5b53dacbe058eb3bbbdf9186c5ff14187ae3 WatchSource:0}: Error finding container b2752f3c8de711a76ebcd5756bfc5b53dacbe058eb3bbbdf9186c5ff14187ae3: Status 404 returned error can't find the container with id b2752f3c8de711a76ebcd5756bfc5b53dacbe058eb3bbbdf9186c5ff14187ae3 Oct 14 13:09:33.529978 master-2 kubenswrapper[4762]: I1014 13:09:33.529932 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-sg92v" event={"ID":"970e8ee9-e505-4a07-9662-362652cf6b3b","Type":"ContainerStarted","Data":"b2752f3c8de711a76ebcd5756bfc5b53dacbe058eb3bbbdf9186c5ff14187ae3"} Oct 14 13:09:33.532319 master-2 kubenswrapper[4762]: I1014 13:09:33.532245 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-574d7f8db8-gbr5b" event={"ID":"34749ef7-edb0-466b-a317-bb788dc5b851","Type":"ContainerStarted","Data":"3646a87a3d036a8991fd56c519abddf638e5ead13a0b92a5a57c43cf5e46c485"} Oct 14 13:09:33.532319 master-2 kubenswrapper[4762]: I1014 13:09:33.532297 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-574d7f8db8-gbr5b" event={"ID":"34749ef7-edb0-466b-a317-bb788dc5b851","Type":"ContainerStarted","Data":"734037e5fda6f620c64b4c05b54aa689433ac444b8dd58fa343af045164a6405"} Oct 14 13:09:33.549627 master-2 kubenswrapper[4762]: I1014 13:09:33.549564 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-574d7f8db8-gbr5b" podStartSLOduration=1.913728219 podStartE2EDuration="3.549545337s" podCreationTimestamp="2025-10-14 13:09:30 +0000 UTC" firstStartedPulling="2025-10-14 13:09:31.374265401 +0000 UTC m=+200.618424560" lastFinishedPulling="2025-10-14 13:09:33.010082509 +0000 UTC m=+202.254241678" observedRunningTime="2025-10-14 13:09:33.549480255 +0000 UTC m=+202.793639434" watchObservedRunningTime="2025-10-14 13:09:33.549545337 +0000 UTC m=+202.793704496" Oct 14 13:09:33.897874 master-2 kubenswrapper[4762]: I1014 13:09:33.897738 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:09:33.897874 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:09:33.897874 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:09:33.897874 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:09:33.897874 master-2 kubenswrapper[4762]: I1014 13:09:33.897825 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:09:34.538226 master-2 kubenswrapper[4762]: I1014 13:09:34.538172 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-sg92v" event={"ID":"970e8ee9-e505-4a07-9662-362652cf6b3b","Type":"ContainerStarted","Data":"b371d25e7f0e819ea7586a72533618b13846c1d5a3fdddd549ef75da6a9613c2"} Oct 14 13:09:34.538521 
master-2 kubenswrapper[4762]: I1014 13:09:34.538496 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-sg92v" Oct 14 13:09:34.559048 master-2 kubenswrapper[4762]: I1014 13:09:34.558954 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-sg92v" podStartSLOduration=1.558925627 podStartE2EDuration="1.558925627s" podCreationTimestamp="2025-10-14 13:09:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:09:34.556881352 +0000 UTC m=+203.801040511" watchObservedRunningTime="2025-10-14 13:09:34.558925627 +0000 UTC m=+203.803084796" Oct 14 13:09:34.561936 master-2 kubenswrapper[4762]: I1014 13:09:34.561904 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-sg92v" Oct 14 13:09:34.897326 master-2 kubenswrapper[4762]: I1014 13:09:34.897044 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:09:34.897326 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:09:34.897326 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:09:34.897326 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:09:34.897326 master-2 kubenswrapper[4762]: I1014 13:09:34.897137 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:09:35.187353 master-2 kubenswrapper[4762]: I1014 13:09:35.187299 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-sg92v"] Oct 14 13:09:35.553988 master-2 kubenswrapper[4762]: I1014 13:09:35.548935 4762 scope.go:117] "RemoveContainer" containerID="0a39b10e2fb121ce0536d44cd9b7911391264299af43b6661cf503ece1eed26a" Oct 14 13:09:35.553988 master-2 kubenswrapper[4762]: E1014 13:09:35.549295 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-rbac-proxy pod=cluster-cloud-controller-manager-operator-779749f859-bscv5_openshift-cloud-controller-manager-operator(18346e46-a062-4e0d-b90a-c05646a46c7e)\"" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-779749f859-bscv5" podUID="18346e46-a062-4e0d-b90a-c05646a46c7e" Oct 14 13:09:35.897286 master-2 kubenswrapper[4762]: I1014 13:09:35.897103 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:09:35.897286 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:09:35.897286 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:09:35.897286 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:09:35.897286 master-2 kubenswrapper[4762]: I1014 13:09:35.897196 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" 
podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:09:36.546577 master-2 kubenswrapper[4762]: I1014 13:09:36.546512 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-sg92v" podUID="970e8ee9-e505-4a07-9662-362652cf6b3b" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://b371d25e7f0e819ea7586a72533618b13846c1d5a3fdddd549ef75da6a9613c2" gracePeriod=30 Oct 14 13:09:36.897044 master-2 kubenswrapper[4762]: I1014 13:09:36.896651 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:09:36.897044 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:09:36.897044 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:09:36.897044 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:09:36.897044 master-2 kubenswrapper[4762]: I1014 13:09:36.896726 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:09:37.254841 master-2 kubenswrapper[4762]: I1014 13:09:37.254685 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e3b0f97c-92e0-43c7-a72a-c003f0451347-client-ca\") pod \"controller-manager-55bcd8787f-4krnt\" (UID: \"e3b0f97c-92e0-43c7-a72a-c003f0451347\") " pod="openshift-controller-manager/controller-manager-55bcd8787f-4krnt" Oct 14 13:09:37.254841 master-2 kubenswrapper[4762]: E1014 13:09:37.254840 4762 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Oct 14 13:09:37.255246 master-2 kubenswrapper[4762]: E1014 13:09:37.254924 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e3b0f97c-92e0-43c7-a72a-c003f0451347-client-ca podName:e3b0f97c-92e0-43c7-a72a-c003f0451347 nodeName:}" failed. No retries permitted until 2025-10-14 13:10:41.254900933 +0000 UTC m=+270.499060112 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/e3b0f97c-92e0-43c7-a72a-c003f0451347-client-ca") pod "controller-manager-55bcd8787f-4krnt" (UID: "e3b0f97c-92e0-43c7-a72a-c003f0451347") : configmap "client-ca" not found Oct 14 13:09:37.897829 master-2 kubenswrapper[4762]: I1014 13:09:37.897739 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:09:37.897829 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:09:37.897829 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:09:37.897829 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:09:37.898812 master-2 kubenswrapper[4762]: I1014 13:09:37.898707 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:09:38.896658 master-2 kubenswrapper[4762]: I1014 13:09:38.896594 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:09:38.896658 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:09:38.896658 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:09:38.896658 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:09:38.897006 master-2 kubenswrapper[4762]: I1014 13:09:38.896683 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:09:39.898003 master-2 kubenswrapper[4762]: I1014 13:09:39.897910 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:09:39.898003 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:09:39.898003 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:09:39.898003 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:09:39.898754 master-2 kubenswrapper[4762]: I1014 13:09:39.898008 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:09:39.938017 master-2 kubenswrapper[4762]: I1014 13:09:39.937961 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-56d8dcb55c-h25c4"] Oct 14 13:09:39.939181 master-2 kubenswrapper[4762]: I1014 13:09:39.939123 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-56d8dcb55c-h25c4" Oct 14 13:09:39.942088 master-2 kubenswrapper[4762]: I1014 13:09:39.942032 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config" Oct 14 13:09:39.942308 master-2 kubenswrapper[4762]: I1014 13:09:39.942238 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls" Oct 14 13:09:39.949413 master-2 kubenswrapper[4762]: I1014 13:09:39.949342 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-56d8dcb55c-h25c4"] Oct 14 13:09:39.955003 master-2 kubenswrapper[4762]: I1014 13:09:39.954957 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-jc698"] Oct 14 13:09:39.956058 master-2 kubenswrapper[4762]: I1014 13:09:39.956021 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-jc698" Oct 14 13:09:39.956250 master-2 kubenswrapper[4762]: I1014 13:09:39.956137 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-57fbd47578-96mh2"] Oct 14 13:09:39.957245 master-2 kubenswrapper[4762]: I1014 13:09:39.956898 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-57fbd47578-96mh2" Oct 14 13:09:39.962113 master-2 kubenswrapper[4762]: I1014 13:09:39.961933 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Oct 14 13:09:39.962113 master-2 kubenswrapper[4762]: I1014 13:09:39.961965 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config" Oct 14 13:09:39.963809 master-2 kubenswrapper[4762]: I1014 13:09:39.962139 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls" Oct 14 13:09:39.963809 master-2 kubenswrapper[4762]: I1014 13:09:39.963553 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config" Oct 14 13:09:39.963809 master-2 kubenswrapper[4762]: I1014 13:09:39.963768 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap" Oct 14 13:09:39.965916 master-2 kubenswrapper[4762]: I1014 13:09:39.965875 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-57fbd47578-96mh2"] Oct 14 13:09:39.982739 master-2 kubenswrapper[4762]: I1014 13:09:39.982682 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c7f87a74-3d2e-4e1e-a564-4957c50f5b20-sys\") pod \"node-exporter-jc698\" (UID: \"c7f87a74-3d2e-4e1e-a564-4957c50f5b20\") " pod="openshift-monitoring/node-exporter-jc698" Oct 14 13:09:39.982858 master-2 kubenswrapper[4762]: I1014 13:09:39.982750 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/02c9abb4-532d-4831-b69c-7445bfe51494-kube-state-metrics-tls\") pod \"kube-state-metrics-57fbd47578-96mh2\" (UID: \"02c9abb4-532d-4831-b69c-7445bfe51494\") " pod="openshift-monitoring/kube-state-metrics-57fbd47578-96mh2" Oct 14 13:09:39.982858 
master-2 kubenswrapper[4762]: I1014 13:09:39.982790 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c7f87a74-3d2e-4e1e-a564-4957c50f5b20-node-exporter-wtmp\") pod \"node-exporter-jc698\" (UID: \"c7f87a74-3d2e-4e1e-a564-4957c50f5b20\") " pod="openshift-monitoring/node-exporter-jc698" Oct 14 13:09:39.982858 master-2 kubenswrapper[4762]: I1014 13:09:39.982828 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hmls\" (UniqueName: \"kubernetes.io/projected/da6c08c6-be65-4110-a7e7-b7d5477ae716-kube-api-access-2hmls\") pod \"openshift-state-metrics-56d8dcb55c-h25c4\" (UID: \"da6c08c6-be65-4110-a7e7-b7d5477ae716\") " pod="openshift-monitoring/openshift-state-metrics-56d8dcb55c-h25c4" Oct 14 13:09:39.982990 master-2 kubenswrapper[4762]: I1014 13:09:39.982857 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/02c9abb4-532d-4831-b69c-7445bfe51494-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-57fbd47578-96mh2\" (UID: \"02c9abb4-532d-4831-b69c-7445bfe51494\") " pod="openshift-monitoring/kube-state-metrics-57fbd47578-96mh2" Oct 14 13:09:39.982990 master-2 kubenswrapper[4762]: I1014 13:09:39.982903 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/02c9abb4-532d-4831-b69c-7445bfe51494-volume-directive-shadow\") pod \"kube-state-metrics-57fbd47578-96mh2\" (UID: \"02c9abb4-532d-4831-b69c-7445bfe51494\") " pod="openshift-monitoring/kube-state-metrics-57fbd47578-96mh2" Oct 14 13:09:39.982990 master-2 kubenswrapper[4762]: I1014 13:09:39.982941 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/da6c08c6-be65-4110-a7e7-b7d5477ae716-metrics-client-ca\") pod \"openshift-state-metrics-56d8dcb55c-h25c4\" (UID: \"da6c08c6-be65-4110-a7e7-b7d5477ae716\") " pod="openshift-monitoring/openshift-state-metrics-56d8dcb55c-h25c4" Oct 14 13:09:39.982990 master-2 kubenswrapper[4762]: I1014 13:09:39.982983 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/02c9abb4-532d-4831-b69c-7445bfe51494-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-57fbd47578-96mh2\" (UID: \"02c9abb4-532d-4831-b69c-7445bfe51494\") " pod="openshift-monitoring/kube-state-metrics-57fbd47578-96mh2" Oct 14 13:09:39.983169 master-2 kubenswrapper[4762]: I1014 13:09:39.983018 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c7f87a74-3d2e-4e1e-a564-4957c50f5b20-node-exporter-textfile\") pod \"node-exporter-jc698\" (UID: \"c7f87a74-3d2e-4e1e-a564-4957c50f5b20\") " pod="openshift-monitoring/node-exporter-jc698" Oct 14 13:09:39.983169 master-2 kubenswrapper[4762]: I1014 13:09:39.983057 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/da6c08c6-be65-4110-a7e7-b7d5477ae716-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-56d8dcb55c-h25c4\" (UID: \"da6c08c6-be65-4110-a7e7-b7d5477ae716\") " pod="openshift-monitoring/openshift-state-metrics-56d8dcb55c-h25c4" Oct 14 13:09:39.983169 master-2 kubenswrapper[4762]: I1014 13:09:39.983134 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c7f87a74-3d2e-4e1e-a564-4957c50f5b20-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jc698\" (UID: \"c7f87a74-3d2e-4e1e-a564-4957c50f5b20\") " pod="openshift-monitoring/node-exporter-jc698" Oct 14 13:09:39.983307 master-2 kubenswrapper[4762]: I1014 13:09:39.983229 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/da6c08c6-be65-4110-a7e7-b7d5477ae716-openshift-state-metrics-tls\") pod \"openshift-state-metrics-56d8dcb55c-h25c4\" (UID: \"da6c08c6-be65-4110-a7e7-b7d5477ae716\") " pod="openshift-monitoring/openshift-state-metrics-56d8dcb55c-h25c4" Oct 14 13:09:39.983307 master-2 kubenswrapper[4762]: I1014 13:09:39.983291 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hmtb\" (UniqueName: \"kubernetes.io/projected/02c9abb4-532d-4831-b69c-7445bfe51494-kube-api-access-7hmtb\") pod \"kube-state-metrics-57fbd47578-96mh2\" (UID: \"02c9abb4-532d-4831-b69c-7445bfe51494\") " pod="openshift-monitoring/kube-state-metrics-57fbd47578-96mh2" Oct 14 13:09:39.983391 master-2 kubenswrapper[4762]: I1014 13:09:39.983352 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c7f87a74-3d2e-4e1e-a564-4957c50f5b20-metrics-client-ca\") pod \"node-exporter-jc698\" (UID: \"c7f87a74-3d2e-4e1e-a564-4957c50f5b20\") " pod="openshift-monitoring/node-exporter-jc698" Oct 14 13:09:39.983435 master-2 kubenswrapper[4762]: I1014 13:09:39.983391 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7c742\" (UniqueName: \"kubernetes.io/projected/c7f87a74-3d2e-4e1e-a564-4957c50f5b20-kube-api-access-7c742\") pod \"node-exporter-jc698\" (UID: \"c7f87a74-3d2e-4e1e-a564-4957c50f5b20\") " pod="openshift-monitoring/node-exporter-jc698" Oct 14 13:09:39.983478 master-2 kubenswrapper[4762]: I1014 13:09:39.983435 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/02c9abb4-532d-4831-b69c-7445bfe51494-metrics-client-ca\") pod \"kube-state-metrics-57fbd47578-96mh2\" (UID: \"02c9abb4-532d-4831-b69c-7445bfe51494\") " pod="openshift-monitoring/kube-state-metrics-57fbd47578-96mh2" Oct 14 13:09:39.983478 master-2 kubenswrapper[4762]: I1014 13:09:39.983470 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c7f87a74-3d2e-4e1e-a564-4957c50f5b20-node-exporter-tls\") pod \"node-exporter-jc698\" (UID: \"c7f87a74-3d2e-4e1e-a564-4957c50f5b20\") " pod="openshift-monitoring/node-exporter-jc698" Oct 14 13:09:39.983560 master-2 kubenswrapper[4762]: I1014 13:09:39.983527 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"root\" (UniqueName: \"kubernetes.io/host-path/c7f87a74-3d2e-4e1e-a564-4957c50f5b20-root\") pod \"node-exporter-jc698\" (UID: \"c7f87a74-3d2e-4e1e-a564-4957c50f5b20\") " pod="openshift-monitoring/node-exporter-jc698" Oct 14 13:09:40.084168 master-2 kubenswrapper[4762]: I1014 13:09:40.084103 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c7f87a74-3d2e-4e1e-a564-4957c50f5b20-sys\") pod \"node-exporter-jc698\" (UID: \"c7f87a74-3d2e-4e1e-a564-4957c50f5b20\") " pod="openshift-monitoring/node-exporter-jc698" Oct 14 13:09:40.084168 master-2 kubenswrapper[4762]: I1014 13:09:40.084180 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/02c9abb4-532d-4831-b69c-7445bfe51494-kube-state-metrics-tls\") pod \"kube-state-metrics-57fbd47578-96mh2\" (UID: \"02c9abb4-532d-4831-b69c-7445bfe51494\") " pod="openshift-monitoring/kube-state-metrics-57fbd47578-96mh2" Oct 14 13:09:40.084493 master-2 kubenswrapper[4762]: I1014 13:09:40.084205 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c7f87a74-3d2e-4e1e-a564-4957c50f5b20-node-exporter-wtmp\") pod \"node-exporter-jc698\" (UID: \"c7f87a74-3d2e-4e1e-a564-4957c50f5b20\") " pod="openshift-monitoring/node-exporter-jc698" Oct 14 13:09:40.084493 master-2 kubenswrapper[4762]: I1014 13:09:40.084231 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hmls\" (UniqueName: \"kubernetes.io/projected/da6c08c6-be65-4110-a7e7-b7d5477ae716-kube-api-access-2hmls\") pod \"openshift-state-metrics-56d8dcb55c-h25c4\" (UID: \"da6c08c6-be65-4110-a7e7-b7d5477ae716\") " pod="openshift-monitoring/openshift-state-metrics-56d8dcb55c-h25c4" Oct 14 13:09:40.084493 master-2 kubenswrapper[4762]: I1014 13:09:40.084254 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/02c9abb4-532d-4831-b69c-7445bfe51494-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-57fbd47578-96mh2\" (UID: \"02c9abb4-532d-4831-b69c-7445bfe51494\") " pod="openshift-monitoring/kube-state-metrics-57fbd47578-96mh2" Oct 14 13:09:40.084493 master-2 kubenswrapper[4762]: I1014 13:09:40.084284 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/02c9abb4-532d-4831-b69c-7445bfe51494-volume-directive-shadow\") pod \"kube-state-metrics-57fbd47578-96mh2\" (UID: \"02c9abb4-532d-4831-b69c-7445bfe51494\") " pod="openshift-monitoring/kube-state-metrics-57fbd47578-96mh2" Oct 14 13:09:40.084493 master-2 kubenswrapper[4762]: I1014 13:09:40.084282 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c7f87a74-3d2e-4e1e-a564-4957c50f5b20-sys\") pod \"node-exporter-jc698\" (UID: \"c7f87a74-3d2e-4e1e-a564-4957c50f5b20\") " pod="openshift-monitoring/node-exporter-jc698" Oct 14 13:09:40.084493 master-2 kubenswrapper[4762]: E1014 13:09:40.084366 4762 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Oct 14 13:09:40.084493 master-2 kubenswrapper[4762]: I1014 13:09:40.084313 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" 
(UniqueName: \"kubernetes.io/configmap/da6c08c6-be65-4110-a7e7-b7d5477ae716-metrics-client-ca\") pod \"openshift-state-metrics-56d8dcb55c-h25c4\" (UID: \"da6c08c6-be65-4110-a7e7-b7d5477ae716\") " pod="openshift-monitoring/openshift-state-metrics-56d8dcb55c-h25c4" Oct 14 13:09:40.084493 master-2 kubenswrapper[4762]: E1014 13:09:40.084433 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02c9abb4-532d-4831-b69c-7445bfe51494-kube-state-metrics-tls podName:02c9abb4-532d-4831-b69c-7445bfe51494 nodeName:}" failed. No retries permitted until 2025-10-14 13:09:40.584414004 +0000 UTC m=+209.828573163 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/02c9abb4-532d-4831-b69c-7445bfe51494-kube-state-metrics-tls") pod "kube-state-metrics-57fbd47578-96mh2" (UID: "02c9abb4-532d-4831-b69c-7445bfe51494") : secret "kube-state-metrics-tls" not found Oct 14 13:09:40.084493 master-2 kubenswrapper[4762]: I1014 13:09:40.084458 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c7f87a74-3d2e-4e1e-a564-4957c50f5b20-node-exporter-textfile\") pod \"node-exporter-jc698\" (UID: \"c7f87a74-3d2e-4e1e-a564-4957c50f5b20\") " pod="openshift-monitoring/node-exporter-jc698" Oct 14 13:09:40.084493 master-2 kubenswrapper[4762]: I1014 13:09:40.084471 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c7f87a74-3d2e-4e1e-a564-4957c50f5b20-node-exporter-wtmp\") pod \"node-exporter-jc698\" (UID: \"c7f87a74-3d2e-4e1e-a564-4957c50f5b20\") " pod="openshift-monitoring/node-exporter-jc698" Oct 14 13:09:40.084493 master-2 kubenswrapper[4762]: I1014 13:09:40.084494 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/02c9abb4-532d-4831-b69c-7445bfe51494-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-57fbd47578-96mh2\" (UID: \"02c9abb4-532d-4831-b69c-7445bfe51494\") " pod="openshift-monitoring/kube-state-metrics-57fbd47578-96mh2" Oct 14 13:09:40.084941 master-2 kubenswrapper[4762]: I1014 13:09:40.084546 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/da6c08c6-be65-4110-a7e7-b7d5477ae716-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-56d8dcb55c-h25c4\" (UID: \"da6c08c6-be65-4110-a7e7-b7d5477ae716\") " pod="openshift-monitoring/openshift-state-metrics-56d8dcb55c-h25c4" Oct 14 13:09:40.084941 master-2 kubenswrapper[4762]: I1014 13:09:40.084571 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c7f87a74-3d2e-4e1e-a564-4957c50f5b20-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jc698\" (UID: \"c7f87a74-3d2e-4e1e-a564-4957c50f5b20\") " pod="openshift-monitoring/node-exporter-jc698" Oct 14 13:09:40.084941 master-2 kubenswrapper[4762]: I1014 13:09:40.084631 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/da6c08c6-be65-4110-a7e7-b7d5477ae716-openshift-state-metrics-tls\") pod \"openshift-state-metrics-56d8dcb55c-h25c4\" (UID: 
\"da6c08c6-be65-4110-a7e7-b7d5477ae716\") " pod="openshift-monitoring/openshift-state-metrics-56d8dcb55c-h25c4" Oct 14 13:09:40.084941 master-2 kubenswrapper[4762]: I1014 13:09:40.084655 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hmtb\" (UniqueName: \"kubernetes.io/projected/02c9abb4-532d-4831-b69c-7445bfe51494-kube-api-access-7hmtb\") pod \"kube-state-metrics-57fbd47578-96mh2\" (UID: \"02c9abb4-532d-4831-b69c-7445bfe51494\") " pod="openshift-monitoring/kube-state-metrics-57fbd47578-96mh2" Oct 14 13:09:40.084941 master-2 kubenswrapper[4762]: I1014 13:09:40.084712 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c7f87a74-3d2e-4e1e-a564-4957c50f5b20-metrics-client-ca\") pod \"node-exporter-jc698\" (UID: \"c7f87a74-3d2e-4e1e-a564-4957c50f5b20\") " pod="openshift-monitoring/node-exporter-jc698" Oct 14 13:09:40.084941 master-2 kubenswrapper[4762]: I1014 13:09:40.084753 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7c742\" (UniqueName: \"kubernetes.io/projected/c7f87a74-3d2e-4e1e-a564-4957c50f5b20-kube-api-access-7c742\") pod \"node-exporter-jc698\" (UID: \"c7f87a74-3d2e-4e1e-a564-4957c50f5b20\") " pod="openshift-monitoring/node-exporter-jc698" Oct 14 13:09:40.084941 master-2 kubenswrapper[4762]: I1014 13:09:40.084792 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/02c9abb4-532d-4831-b69c-7445bfe51494-metrics-client-ca\") pod \"kube-state-metrics-57fbd47578-96mh2\" (UID: \"02c9abb4-532d-4831-b69c-7445bfe51494\") " pod="openshift-monitoring/kube-state-metrics-57fbd47578-96mh2" Oct 14 13:09:40.084941 master-2 kubenswrapper[4762]: I1014 13:09:40.084823 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c7f87a74-3d2e-4e1e-a564-4957c50f5b20-node-exporter-tls\") pod \"node-exporter-jc698\" (UID: \"c7f87a74-3d2e-4e1e-a564-4957c50f5b20\") " pod="openshift-monitoring/node-exporter-jc698" Oct 14 13:09:40.084941 master-2 kubenswrapper[4762]: I1014 13:09:40.084855 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c7f87a74-3d2e-4e1e-a564-4957c50f5b20-root\") pod \"node-exporter-jc698\" (UID: \"c7f87a74-3d2e-4e1e-a564-4957c50f5b20\") " pod="openshift-monitoring/node-exporter-jc698" Oct 14 13:09:40.084941 master-2 kubenswrapper[4762]: I1014 13:09:40.084944 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c7f87a74-3d2e-4e1e-a564-4957c50f5b20-root\") pod \"node-exporter-jc698\" (UID: \"c7f87a74-3d2e-4e1e-a564-4957c50f5b20\") " pod="openshift-monitoring/node-exporter-jc698" Oct 14 13:09:40.086740 master-2 kubenswrapper[4762]: I1014 13:09:40.085031 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c7f87a74-3d2e-4e1e-a564-4957c50f5b20-node-exporter-textfile\") pod \"node-exporter-jc698\" (UID: \"c7f87a74-3d2e-4e1e-a564-4957c50f5b20\") " pod="openshift-monitoring/node-exporter-jc698" Oct 14 13:09:40.086740 master-2 kubenswrapper[4762]: E1014 13:09:40.085349 4762 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Oct 14 13:09:40.086740 master-2 
kubenswrapper[4762]: E1014 13:09:40.085417 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7f87a74-3d2e-4e1e-a564-4957c50f5b20-node-exporter-tls podName:c7f87a74-3d2e-4e1e-a564-4957c50f5b20 nodeName:}" failed. No retries permitted until 2025-10-14 13:09:40.585393235 +0000 UTC m=+209.829552384 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/c7f87a74-3d2e-4e1e-a564-4957c50f5b20-node-exporter-tls") pod "node-exporter-jc698" (UID: "c7f87a74-3d2e-4e1e-a564-4957c50f5b20") : secret "node-exporter-tls" not found Oct 14 13:09:40.086740 master-2 kubenswrapper[4762]: I1014 13:09:40.085561 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/da6c08c6-be65-4110-a7e7-b7d5477ae716-metrics-client-ca\") pod \"openshift-state-metrics-56d8dcb55c-h25c4\" (UID: \"da6c08c6-be65-4110-a7e7-b7d5477ae716\") " pod="openshift-monitoring/openshift-state-metrics-56d8dcb55c-h25c4" Oct 14 13:09:40.086740 master-2 kubenswrapper[4762]: I1014 13:09:40.085826 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/02c9abb4-532d-4831-b69c-7445bfe51494-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-57fbd47578-96mh2\" (UID: \"02c9abb4-532d-4831-b69c-7445bfe51494\") " pod="openshift-monitoring/kube-state-metrics-57fbd47578-96mh2" Oct 14 13:09:40.086740 master-2 kubenswrapper[4762]: I1014 13:09:40.085878 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/02c9abb4-532d-4831-b69c-7445bfe51494-volume-directive-shadow\") pod \"kube-state-metrics-57fbd47578-96mh2\" (UID: \"02c9abb4-532d-4831-b69c-7445bfe51494\") " pod="openshift-monitoring/kube-state-metrics-57fbd47578-96mh2" Oct 14 13:09:40.086740 master-2 kubenswrapper[4762]: I1014 13:09:40.086460 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c7f87a74-3d2e-4e1e-a564-4957c50f5b20-metrics-client-ca\") pod \"node-exporter-jc698\" (UID: \"c7f87a74-3d2e-4e1e-a564-4957c50f5b20\") " pod="openshift-monitoring/node-exporter-jc698" Oct 14 13:09:40.087024 master-2 kubenswrapper[4762]: I1014 13:09:40.086879 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/02c9abb4-532d-4831-b69c-7445bfe51494-metrics-client-ca\") pod \"kube-state-metrics-57fbd47578-96mh2\" (UID: \"02c9abb4-532d-4831-b69c-7445bfe51494\") " pod="openshift-monitoring/kube-state-metrics-57fbd47578-96mh2" Oct 14 13:09:40.089241 master-2 kubenswrapper[4762]: I1014 13:09:40.089209 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/02c9abb4-532d-4831-b69c-7445bfe51494-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-57fbd47578-96mh2\" (UID: \"02c9abb4-532d-4831-b69c-7445bfe51494\") " pod="openshift-monitoring/kube-state-metrics-57fbd47578-96mh2" Oct 14 13:09:40.090023 master-2 kubenswrapper[4762]: I1014 13:09:40.089944 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/c7f87a74-3d2e-4e1e-a564-4957c50f5b20-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jc698\" (UID: \"c7f87a74-3d2e-4e1e-a564-4957c50f5b20\") " pod="openshift-monitoring/node-exporter-jc698" Oct 14 13:09:40.090263 master-2 kubenswrapper[4762]: I1014 13:09:40.090193 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/da6c08c6-be65-4110-a7e7-b7d5477ae716-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-56d8dcb55c-h25c4\" (UID: \"da6c08c6-be65-4110-a7e7-b7d5477ae716\") " pod="openshift-monitoring/openshift-state-metrics-56d8dcb55c-h25c4" Oct 14 13:09:40.091084 master-2 kubenswrapper[4762]: I1014 13:09:40.090881 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/da6c08c6-be65-4110-a7e7-b7d5477ae716-openshift-state-metrics-tls\") pod \"openshift-state-metrics-56d8dcb55c-h25c4\" (UID: \"da6c08c6-be65-4110-a7e7-b7d5477ae716\") " pod="openshift-monitoring/openshift-state-metrics-56d8dcb55c-h25c4" Oct 14 13:09:40.096796 master-2 kubenswrapper[4762]: I1014 13:09:40.096733 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_machine-config-server-zxkbj_063758c3-98fe-4ac4-b1c2-beef7ef6dcdc/machine-config-server/0.log" Oct 14 13:09:40.101080 master-2 kubenswrapper[4762]: I1014 13:09:40.101027 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hmls\" (UniqueName: \"kubernetes.io/projected/da6c08c6-be65-4110-a7e7-b7d5477ae716-kube-api-access-2hmls\") pod \"openshift-state-metrics-56d8dcb55c-h25c4\" (UID: \"da6c08c6-be65-4110-a7e7-b7d5477ae716\") " pod="openshift-monitoring/openshift-state-metrics-56d8dcb55c-h25c4" Oct 14 13:09:40.106119 master-2 kubenswrapper[4762]: I1014 13:09:40.106092 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7c742\" (UniqueName: \"kubernetes.io/projected/c7f87a74-3d2e-4e1e-a564-4957c50f5b20-kube-api-access-7c742\") pod \"node-exporter-jc698\" (UID: \"c7f87a74-3d2e-4e1e-a564-4957c50f5b20\") " pod="openshift-monitoring/node-exporter-jc698" Oct 14 13:09:40.106479 master-2 kubenswrapper[4762]: I1014 13:09:40.106439 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hmtb\" (UniqueName: \"kubernetes.io/projected/02c9abb4-532d-4831-b69c-7445bfe51494-kube-api-access-7hmtb\") pod \"kube-state-metrics-57fbd47578-96mh2\" (UID: \"02c9abb4-532d-4831-b69c-7445bfe51494\") " pod="openshift-monitoring/kube-state-metrics-57fbd47578-96mh2" Oct 14 13:09:40.271146 master-2 kubenswrapper[4762]: I1014 13:09:40.271070 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-56d8dcb55c-h25c4" Oct 14 13:09:40.591707 master-2 kubenswrapper[4762]: I1014 13:09:40.591477 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c7f87a74-3d2e-4e1e-a564-4957c50f5b20-node-exporter-tls\") pod \"node-exporter-jc698\" (UID: \"c7f87a74-3d2e-4e1e-a564-4957c50f5b20\") " pod="openshift-monitoring/node-exporter-jc698" Oct 14 13:09:40.591707 master-2 kubenswrapper[4762]: I1014 13:09:40.591537 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/02c9abb4-532d-4831-b69c-7445bfe51494-kube-state-metrics-tls\") pod \"kube-state-metrics-57fbd47578-96mh2\" (UID: \"02c9abb4-532d-4831-b69c-7445bfe51494\") " pod="openshift-monitoring/kube-state-metrics-57fbd47578-96mh2" Oct 14 13:09:40.595555 master-2 kubenswrapper[4762]: I1014 13:09:40.595507 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c7f87a74-3d2e-4e1e-a564-4957c50f5b20-node-exporter-tls\") pod \"node-exporter-jc698\" (UID: \"c7f87a74-3d2e-4e1e-a564-4957c50f5b20\") " pod="openshift-monitoring/node-exporter-jc698" Oct 14 13:09:40.608928 master-2 kubenswrapper[4762]: I1014 13:09:40.608872 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/02c9abb4-532d-4831-b69c-7445bfe51494-kube-state-metrics-tls\") pod \"kube-state-metrics-57fbd47578-96mh2\" (UID: \"02c9abb4-532d-4831-b69c-7445bfe51494\") " pod="openshift-monitoring/kube-state-metrics-57fbd47578-96mh2" Oct 14 13:09:40.877247 master-2 kubenswrapper[4762]: I1014 13:09:40.877031 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-jc698" Oct 14 13:09:40.884481 master-2 kubenswrapper[4762]: I1014 13:09:40.884395 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-57fbd47578-96mh2" Oct 14 13:09:40.897577 master-2 kubenswrapper[4762]: I1014 13:09:40.897495 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:09:40.897577 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:09:40.897577 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:09:40.897577 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:09:40.897577 master-2 kubenswrapper[4762]: I1014 13:09:40.897537 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:09:41.387035 master-2 kubenswrapper[4762]: W1014 13:09:41.386988 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7f87a74_3d2e_4e1e_a564_4957c50f5b20.slice/crio-10fc3b5103da79e5b651df6faa464f64a839440e9c41690700e6be7c0cf56802 WatchSource:0}: Error finding container 10fc3b5103da79e5b651df6faa464f64a839440e9c41690700e6be7c0cf56802: Status 404 returned error can't find the container with id 10fc3b5103da79e5b651df6faa464f64a839440e9c41690700e6be7c0cf56802 Oct 14 13:09:41.572924 master-2 kubenswrapper[4762]: I1014 13:09:41.572874 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-frksz" event={"ID":"1458907f-e285-4301-8542-0b46ac67b02d","Type":"ContainerStarted","Data":"bc6a81da645ff98ca01338184fa1f436d66e46d114b78b1bda18bb8ff1b1d2f1"} Oct 14 13:09:41.574451 master-2 kubenswrapper[4762]: I1014 13:09:41.574418 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jc698" event={"ID":"c7f87a74-3d2e-4e1e-a564-4957c50f5b20","Type":"ContainerStarted","Data":"10fc3b5103da79e5b651df6faa464f64a839440e9c41690700e6be7c0cf56802"} Oct 14 13:09:41.786451 master-2 kubenswrapper[4762]: I1014 13:09:41.786395 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-56d8dcb55c-h25c4"] Oct 14 13:09:41.849429 master-2 kubenswrapper[4762]: I1014 13:09:41.849369 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-57fbd47578-96mh2"] Oct 14 13:09:41.897491 master-2 kubenswrapper[4762]: I1014 13:09:41.897431 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:09:41.897491 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:09:41.897491 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:09:41.897491 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:09:41.897635 master-2 kubenswrapper[4762]: I1014 13:09:41.897541 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:09:41.957634 master-2 kubenswrapper[4762]: W1014 13:09:41.957561 4762 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02c9abb4_532d_4831_b69c_7445bfe51494.slice/crio-09e2dd9ab729f6481a59c373007e6ad84eba1139a32925311e953c8f95b09ad1 WatchSource:0}: Error finding container 09e2dd9ab729f6481a59c373007e6ad84eba1139a32925311e953c8f95b09ad1: Status 404 returned error can't find the container with id 09e2dd9ab729f6481a59c373007e6ad84eba1139a32925311e953c8f95b09ad1 Oct 14 13:09:41.958454 master-2 kubenswrapper[4762]: W1014 13:09:41.958413 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda6c08c6_be65_4110_a7e7_b7d5477ae716.slice/crio-0882572fc1338e3d5d47f1cbaca3e989bb4c682d09a1daceb18af7b2d1e1b919 WatchSource:0}: Error finding container 0882572fc1338e3d5d47f1cbaca3e989bb4c682d09a1daceb18af7b2d1e1b919: Status 404 returned error can't find the container with id 0882572fc1338e3d5d47f1cbaca3e989bb4c682d09a1daceb18af7b2d1e1b919 Oct 14 13:09:42.588024 master-2 kubenswrapper[4762]: I1014 13:09:42.587930 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jc698" event={"ID":"c7f87a74-3d2e-4e1e-a564-4957c50f5b20","Type":"ContainerStarted","Data":"5d98e09e53b3efa86f469c7e4dbe8879d948ce73874a9ed7b92bfb29be11a7dd"} Oct 14 13:09:42.591049 master-2 kubenswrapper[4762]: I1014 13:09:42.590961 4762 generic.go:334] "Generic (PLEG): container finished" podID="de57a213-4820-46c7-9506-4c3ea762d75f" containerID="0369989a7890d7012a8fa4ace4982cbc10434e9d603a534c21ab06c2c323b241" exitCode=0 Oct 14 13:09:42.591049 master-2 kubenswrapper[4762]: I1014 13:09:42.591012 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kpbmd" event={"ID":"de57a213-4820-46c7-9506-4c3ea762d75f","Type":"ContainerDied","Data":"0369989a7890d7012a8fa4ace4982cbc10434e9d603a534c21ab06c2c323b241"} Oct 14 13:09:42.594270 master-2 kubenswrapper[4762]: I1014 13:09:42.594201 4762 generic.go:334] "Generic (PLEG): container finished" podID="4ffcca20-3c44-4e24-92c8-15d3dc0625e4" containerID="fd4f47c880f7459da4b54075d7d7b82efbefe1cf43c3d0c933cd64e3a0b2b32b" exitCode=0 Oct 14 13:09:42.594423 master-2 kubenswrapper[4762]: I1014 13:09:42.594297 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cf69d" event={"ID":"4ffcca20-3c44-4e24-92c8-15d3dc0625e4","Type":"ContainerDied","Data":"fd4f47c880f7459da4b54075d7d7b82efbefe1cf43c3d0c933cd64e3a0b2b32b"} Oct 14 13:09:42.597717 master-2 kubenswrapper[4762]: I1014 13:09:42.597543 4762 generic.go:334] "Generic (PLEG): container finished" podID="c551a119-e58d-46c3-9f81-7c0400c70c27" containerID="1f4e0dcb035cd290547efbf1e0284d8c2f5d9ee4ed21c117f62a6d8ab863d653" exitCode=0 Oct 14 13:09:42.597717 master-2 kubenswrapper[4762]: I1014 13:09:42.597596 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xl9gv" event={"ID":"c551a119-e58d-46c3-9f81-7c0400c70c27","Type":"ContainerDied","Data":"1f4e0dcb035cd290547efbf1e0284d8c2f5d9ee4ed21c117f62a6d8ab863d653"} Oct 14 13:09:42.600128 master-2 kubenswrapper[4762]: I1014 13:09:42.599946 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-57fbd47578-96mh2" event={"ID":"02c9abb4-532d-4831-b69c-7445bfe51494","Type":"ContainerStarted","Data":"09e2dd9ab729f6481a59c373007e6ad84eba1139a32925311e953c8f95b09ad1"} Oct 14 13:09:42.602567 master-2 kubenswrapper[4762]: I1014 
13:09:42.602483 4762 generic.go:334] "Generic (PLEG): container finished" podID="1458907f-e285-4301-8542-0b46ac67b02d" containerID="bc6a81da645ff98ca01338184fa1f436d66e46d114b78b1bda18bb8ff1b1d2f1" exitCode=0 Oct 14 13:09:42.602567 master-2 kubenswrapper[4762]: I1014 13:09:42.602555 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-frksz" event={"ID":"1458907f-e285-4301-8542-0b46ac67b02d","Type":"ContainerDied","Data":"bc6a81da645ff98ca01338184fa1f436d66e46d114b78b1bda18bb8ff1b1d2f1"} Oct 14 13:09:42.605533 master-2 kubenswrapper[4762]: I1014 13:09:42.605448 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-56d8dcb55c-h25c4" event={"ID":"da6c08c6-be65-4110-a7e7-b7d5477ae716","Type":"ContainerStarted","Data":"a99549f8bf86b78b03e9cef6f3962bfaac0fccf3a9e8b64c75b06b10bb30cd3d"} Oct 14 13:09:42.605645 master-2 kubenswrapper[4762]: I1014 13:09:42.605545 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-56d8dcb55c-h25c4" event={"ID":"da6c08c6-be65-4110-a7e7-b7d5477ae716","Type":"ContainerStarted","Data":"c725a1511aaba9af7d5a9138414e1dac4163cb305899f2eeb00437ef2d6cb3ec"} Oct 14 13:09:42.605645 master-2 kubenswrapper[4762]: I1014 13:09:42.605578 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-56d8dcb55c-h25c4" event={"ID":"da6c08c6-be65-4110-a7e7-b7d5477ae716","Type":"ContainerStarted","Data":"0882572fc1338e3d5d47f1cbaca3e989bb4c682d09a1daceb18af7b2d1e1b919"} Oct 14 13:09:42.897369 master-2 kubenswrapper[4762]: I1014 13:09:42.897148 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:09:42.897369 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:09:42.897369 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:09:42.897369 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:09:42.897369 master-2 kubenswrapper[4762]: I1014 13:09:42.897337 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:09:43.499545 master-2 kubenswrapper[4762]: E1014 13:09:43.499419 4762 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b371d25e7f0e819ea7586a72533618b13846c1d5a3fdddd549ef75da6a9613c2" cmd=["/bin/bash","-c","test -f /ready/ready"] Oct 14 13:09:43.501239 master-2 kubenswrapper[4762]: E1014 13:09:43.501176 4762 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b371d25e7f0e819ea7586a72533618b13846c1d5a3fdddd549ef75da6a9613c2" cmd=["/bin/bash","-c","test -f /ready/ready"] Oct 14 13:09:43.502843 master-2 kubenswrapper[4762]: E1014 13:09:43.502566 4762 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="b371d25e7f0e819ea7586a72533618b13846c1d5a3fdddd549ef75da6a9613c2" cmd=["/bin/bash","-c","test -f /ready/ready"] Oct 14 13:09:43.502843 master-2 kubenswrapper[4762]: E1014 13:09:43.502595 4762 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-sg92v" podUID="970e8ee9-e505-4a07-9662-362652cf6b3b" containerName="kube-multus-additional-cni-plugins" Oct 14 13:09:43.613223 master-2 kubenswrapper[4762]: I1014 13:09:43.613149 4762 generic.go:334] "Generic (PLEG): container finished" podID="c7f87a74-3d2e-4e1e-a564-4957c50f5b20" containerID="5d98e09e53b3efa86f469c7e4dbe8879d948ce73874a9ed7b92bfb29be11a7dd" exitCode=0 Oct 14 13:09:43.613934 master-2 kubenswrapper[4762]: I1014 13:09:43.613236 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jc698" event={"ID":"c7f87a74-3d2e-4e1e-a564-4957c50f5b20","Type":"ContainerDied","Data":"5d98e09e53b3efa86f469c7e4dbe8879d948ce73874a9ed7b92bfb29be11a7dd"} Oct 14 13:09:43.899326 master-2 kubenswrapper[4762]: I1014 13:09:43.899237 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:09:43.899326 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:09:43.899326 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:09:43.899326 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:09:43.899621 master-2 kubenswrapper[4762]: I1014 13:09:43.899362 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:09:44.533766 master-2 kubenswrapper[4762]: I1014 13:09:44.533657 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-7b6b7bb859-vrzvk"] Oct 14 13:09:44.534596 master-2 kubenswrapper[4762]: I1014 13:09:44.534391 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-7b6b7bb859-vrzvk" Oct 14 13:09:44.538084 master-2 kubenswrapper[4762]: I1014 13:09:44.538046 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Oct 14 13:09:44.561665 master-2 kubenswrapper[4762]: I1014 13:09:44.560316 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-7b6b7bb859-vrzvk"] Oct 14 13:09:44.644339 master-2 kubenswrapper[4762]: I1014 13:09:44.643019 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jc698" event={"ID":"c7f87a74-3d2e-4e1e-a564-4957c50f5b20","Type":"ContainerStarted","Data":"76e221035eb4fefe1511196d436318e31150e473207c77ec352aaaecc6af18fa"} Oct 14 13:09:44.644339 master-2 kubenswrapper[4762]: I1014 13:09:44.643080 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jc698" event={"ID":"c7f87a74-3d2e-4e1e-a564-4957c50f5b20","Type":"ContainerStarted","Data":"1b5d33d5111cf888e3ca7c1a4108f1acd175461a3746fb6f5cb78c180cf7ffb5"} Oct 14 13:09:44.646539 master-2 kubenswrapper[4762]: I1014 13:09:44.646482 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kpbmd" event={"ID":"de57a213-4820-46c7-9506-4c3ea762d75f","Type":"ContainerStarted","Data":"63bcefd8ebc425f66b4390b63162f41940c95f62da6bcf268fa6a40f5511559d"} Oct 14 13:09:44.649418 master-2 kubenswrapper[4762]: I1014 13:09:44.649379 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cf69d" event={"ID":"4ffcca20-3c44-4e24-92c8-15d3dc0625e4","Type":"ContainerStarted","Data":"81aec3b49a476404f7dc250f33c9194087a82b2bf5ea44822de93e2042a91529"} Oct 14 13:09:44.650939 master-2 kubenswrapper[4762]: I1014 13:09:44.650887 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e19cccdb-ac9b-4919-85d8-d7ae33d2d003-webhook-certs\") pod \"multus-admission-controller-7b6b7bb859-vrzvk\" (UID: \"e19cccdb-ac9b-4919-85d8-d7ae33d2d003\") " pod="openshift-multus/multus-admission-controller-7b6b7bb859-vrzvk" Oct 14 13:09:44.651022 master-2 kubenswrapper[4762]: I1014 13:09:44.650938 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9t24\" (UniqueName: \"kubernetes.io/projected/e19cccdb-ac9b-4919-85d8-d7ae33d2d003-kube-api-access-j9t24\") pod \"multus-admission-controller-7b6b7bb859-vrzvk\" (UID: \"e19cccdb-ac9b-4919-85d8-d7ae33d2d003\") " pod="openshift-multus/multus-admission-controller-7b6b7bb859-vrzvk" Oct 14 13:09:44.656085 master-2 kubenswrapper[4762]: I1014 13:09:44.656025 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-57fbd47578-96mh2" event={"ID":"02c9abb4-532d-4831-b69c-7445bfe51494","Type":"ContainerStarted","Data":"7f7d2a6cfc4e7baa75bef5e8d9719ce355330964b2873925745b2db25d77ad57"} Oct 14 13:09:44.656085 master-2 kubenswrapper[4762]: I1014 13:09:44.656071 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-57fbd47578-96mh2" event={"ID":"02c9abb4-532d-4831-b69c-7445bfe51494","Type":"ContainerStarted","Data":"cd7ebead1b07034c62524a94efacfd497ee0bd3e98eeb6351346e2507c5738f3"} Oct 14 13:09:44.656085 master-2 kubenswrapper[4762]: I1014 13:09:44.656085 4762 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-monitoring/kube-state-metrics-57fbd47578-96mh2" event={"ID":"02c9abb4-532d-4831-b69c-7445bfe51494","Type":"ContainerStarted","Data":"e0f91c6b8b8337578699c2355bab69c924a955adf24891e582db9508b2ba05db"} Oct 14 13:09:44.660828 master-2 kubenswrapper[4762]: I1014 13:09:44.660016 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-jc698" podStartSLOduration=4.698182841 podStartE2EDuration="5.660001706s" podCreationTimestamp="2025-10-14 13:09:39 +0000 UTC" firstStartedPulling="2025-10-14 13:09:41.390128535 +0000 UTC m=+210.634287694" lastFinishedPulling="2025-10-14 13:09:42.3519474 +0000 UTC m=+211.596106559" observedRunningTime="2025-10-14 13:09:44.657968382 +0000 UTC m=+213.902127551" watchObservedRunningTime="2025-10-14 13:09:44.660001706 +0000 UTC m=+213.904160865" Oct 14 13:09:44.662514 master-2 kubenswrapper[4762]: I1014 13:09:44.662226 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-frksz" event={"ID":"1458907f-e285-4301-8542-0b46ac67b02d","Type":"ContainerStarted","Data":"5b8b686446edf254dda47da390b1e2779b9622d1ec4218a592f3dc2901d90099"} Oct 14 13:09:44.668284 master-2 kubenswrapper[4762]: I1014 13:09:44.668249 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xl9gv" event={"ID":"c551a119-e58d-46c3-9f81-7c0400c70c27","Type":"ContainerStarted","Data":"644d8dce504e051b037ec15b3d58aa4f2ac49928a6e8ee3ab30a4fb802324290"} Oct 14 13:09:44.672938 master-2 kubenswrapper[4762]: I1014 13:09:44.672875 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-56d8dcb55c-h25c4" event={"ID":"da6c08c6-be65-4110-a7e7-b7d5477ae716","Type":"ContainerStarted","Data":"0ee56fca541fb382a88f784f8939cc9ee8779a695b7c8c14de13e25f8667179f"} Oct 14 13:09:44.678053 master-2 kubenswrapper[4762]: I1014 13:09:44.677939 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-57fbd47578-96mh2" podStartSLOduration=4.024343052 podStartE2EDuration="5.677913887s" podCreationTimestamp="2025-10-14 13:09:39 +0000 UTC" firstStartedPulling="2025-10-14 13:09:41.960878979 +0000 UTC m=+211.205038148" lastFinishedPulling="2025-10-14 13:09:43.614449794 +0000 UTC m=+212.858608983" observedRunningTime="2025-10-14 13:09:44.676530383 +0000 UTC m=+213.920689552" watchObservedRunningTime="2025-10-14 13:09:44.677913887 +0000 UTC m=+213.922073046" Oct 14 13:09:44.697863 master-2 kubenswrapper[4762]: I1014 13:09:44.697711 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kpbmd" podStartSLOduration=2.390806973 podStartE2EDuration="31.697685127s" podCreationTimestamp="2025-10-14 13:09:13 +0000 UTC" firstStartedPulling="2025-10-14 13:09:14.316142914 +0000 UTC m=+183.560302063" lastFinishedPulling="2025-10-14 13:09:43.623021058 +0000 UTC m=+212.867180217" observedRunningTime="2025-10-14 13:09:44.694383722 +0000 UTC m=+213.938542911" watchObservedRunningTime="2025-10-14 13:09:44.697685127 +0000 UTC m=+213.941844286" Oct 14 13:09:44.711256 master-2 kubenswrapper[4762]: I1014 13:09:44.711136 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cf69d" podStartSLOduration=2.565600522 podStartE2EDuration="31.711105325s" podCreationTimestamp="2025-10-14 13:09:13 +0000 UTC" firstStartedPulling="2025-10-14 13:09:14.530562795 
+0000 UTC m=+183.774721954" lastFinishedPulling="2025-10-14 13:09:43.676067598 +0000 UTC m=+212.920226757" observedRunningTime="2025-10-14 13:09:44.70875796 +0000 UTC m=+213.952917149" watchObservedRunningTime="2025-10-14 13:09:44.711105325 +0000 UTC m=+213.955264524" Oct 14 13:09:44.726780 master-2 kubenswrapper[4762]: I1014 13:09:44.726636 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xl9gv" podStartSLOduration=2.401402384 podStartE2EDuration="28.726607738s" podCreationTimestamp="2025-10-14 13:09:16 +0000 UTC" firstStartedPulling="2025-10-14 13:09:17.301091897 +0000 UTC m=+186.545251056" lastFinishedPulling="2025-10-14 13:09:43.626297211 +0000 UTC m=+212.870456410" observedRunningTime="2025-10-14 13:09:44.722599431 +0000 UTC m=+213.966758630" watchObservedRunningTime="2025-10-14 13:09:44.726607738 +0000 UTC m=+213.970766937" Oct 14 13:09:44.738756 master-2 kubenswrapper[4762]: I1014 13:09:44.738659 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-frksz" podStartSLOduration=1.977586939 podStartE2EDuration="29.738623671s" podCreationTimestamp="2025-10-14 13:09:15 +0000 UTC" firstStartedPulling="2025-10-14 13:09:15.861746948 +0000 UTC m=+185.105906107" lastFinishedPulling="2025-10-14 13:09:43.62278365 +0000 UTC m=+212.866942839" observedRunningTime="2025-10-14 13:09:44.735976977 +0000 UTC m=+213.980136176" watchObservedRunningTime="2025-10-14 13:09:44.738623671 +0000 UTC m=+213.982782870" Oct 14 13:09:44.752193 master-2 kubenswrapper[4762]: I1014 13:09:44.752082 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e19cccdb-ac9b-4919-85d8-d7ae33d2d003-webhook-certs\") pod \"multus-admission-controller-7b6b7bb859-vrzvk\" (UID: \"e19cccdb-ac9b-4919-85d8-d7ae33d2d003\") " pod="openshift-multus/multus-admission-controller-7b6b7bb859-vrzvk" Oct 14 13:09:44.752193 master-2 kubenswrapper[4762]: I1014 13:09:44.752149 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9t24\" (UniqueName: \"kubernetes.io/projected/e19cccdb-ac9b-4919-85d8-d7ae33d2d003-kube-api-access-j9t24\") pod \"multus-admission-controller-7b6b7bb859-vrzvk\" (UID: \"e19cccdb-ac9b-4919-85d8-d7ae33d2d003\") " pod="openshift-multus/multus-admission-controller-7b6b7bb859-vrzvk" Oct 14 13:09:44.758474 master-2 kubenswrapper[4762]: I1014 13:09:44.758423 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e19cccdb-ac9b-4919-85d8-d7ae33d2d003-webhook-certs\") pod \"multus-admission-controller-7b6b7bb859-vrzvk\" (UID: \"e19cccdb-ac9b-4919-85d8-d7ae33d2d003\") " pod="openshift-multus/multus-admission-controller-7b6b7bb859-vrzvk" Oct 14 13:09:44.758996 master-2 kubenswrapper[4762]: I1014 13:09:44.758931 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-56d8dcb55c-h25c4" podStartSLOduration=4.361714062 podStartE2EDuration="5.758915618s" podCreationTimestamp="2025-10-14 13:09:39 +0000 UTC" firstStartedPulling="2025-10-14 13:09:42.221779732 +0000 UTC m=+211.465938881" lastFinishedPulling="2025-10-14 13:09:43.618981248 +0000 UTC m=+212.863140437" observedRunningTime="2025-10-14 13:09:44.755872801 +0000 UTC m=+214.000031980" watchObservedRunningTime="2025-10-14 13:09:44.758915618 +0000 UTC m=+214.003074797" Oct 14 13:09:44.775164 
master-2 kubenswrapper[4762]: I1014 13:09:44.775114 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9t24\" (UniqueName: \"kubernetes.io/projected/e19cccdb-ac9b-4919-85d8-d7ae33d2d003-kube-api-access-j9t24\") pod \"multus-admission-controller-7b6b7bb859-vrzvk\" (UID: \"e19cccdb-ac9b-4919-85d8-d7ae33d2d003\") " pod="openshift-multus/multus-admission-controller-7b6b7bb859-vrzvk" Oct 14 13:09:44.866948 master-2 kubenswrapper[4762]: I1014 13:09:44.866801 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-7b6b7bb859-vrzvk" Oct 14 13:09:44.896471 master-2 kubenswrapper[4762]: I1014 13:09:44.896421 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:09:44.896471 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:09:44.896471 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:09:44.896471 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:09:44.896726 master-2 kubenswrapper[4762]: I1014 13:09:44.896474 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:09:45.130945 master-2 kubenswrapper[4762]: I1014 13:09:45.130795 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-56c4f9c4b6-s6gwn"] Oct 14 13:09:45.131745 master-2 kubenswrapper[4762]: I1014 13:09:45.131715 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-56c4f9c4b6-s6gwn" Oct 14 13:09:45.136881 master-2 kubenswrapper[4762]: I1014 13:09:45.136474 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-tls" Oct 14 13:09:45.136881 master-2 kubenswrapper[4762]: I1014 13:09:45.136551 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"federate-client-certs" Oct 14 13:09:45.140479 master-2 kubenswrapper[4762]: I1014 13:09:45.139543 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client" Oct 14 13:09:45.140479 master-2 kubenswrapper[4762]: I1014 13:09:45.140126 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-kube-rbac-proxy-config" Oct 14 13:09:45.140479 master-2 kubenswrapper[4762]: I1014 13:09:45.140268 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-client-serving-certs-ca-bundle" Oct 14 13:09:45.141751 master-2 kubenswrapper[4762]: I1014 13:09:45.141702 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-56c4f9c4b6-s6gwn"] Oct 14 13:09:45.143349 master-2 kubenswrapper[4762]: I1014 13:09:45.143302 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-trusted-ca-bundle-56c9b9fa8d9gs" Oct 14 13:09:45.266000 master-2 kubenswrapper[4762]: I1014 13:09:45.265911 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-8475fbcb68-8dq9n"] Oct 14 13:09:45.266610 master-2 kubenswrapper[4762]: I1014 13:09:45.266571 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-8475fbcb68-8dq9n" Oct 14 13:09:45.269715 master-2 kubenswrapper[4762]: I1014 13:09:45.269390 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles" Oct 14 13:09:45.272446 master-2 kubenswrapper[4762]: I1014 13:09:45.269711 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs" Oct 14 13:09:45.272446 master-2 kubenswrapper[4762]: I1014 13:09:45.270915 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97348241-e1d9-4dd5-bcaa-762088570022-telemeter-trusted-ca-bundle\") pod \"telemeter-client-56c4f9c4b6-s6gwn\" (UID: \"97348241-e1d9-4dd5-bcaa-762088570022\") " pod="openshift-monitoring/telemeter-client-56c4f9c4b6-s6gwn" Oct 14 13:09:45.272446 master-2 kubenswrapper[4762]: I1014 13:09:45.270971 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/97348241-e1d9-4dd5-bcaa-762088570022-metrics-client-ca\") pod \"telemeter-client-56c4f9c4b6-s6gwn\" (UID: \"97348241-e1d9-4dd5-bcaa-762088570022\") " pod="openshift-monitoring/telemeter-client-56c4f9c4b6-s6gwn" Oct 14 13:09:45.272446 master-2 kubenswrapper[4762]: I1014 13:09:45.271023 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-555ls\" (UniqueName: \"kubernetes.io/projected/97348241-e1d9-4dd5-bcaa-762088570022-kube-api-access-555ls\") pod \"telemeter-client-56c4f9c4b6-s6gwn\" (UID: \"97348241-e1d9-4dd5-bcaa-762088570022\") " pod="openshift-monitoring/telemeter-client-56c4f9c4b6-s6gwn" Oct 14 13:09:45.272446 master-2 kubenswrapper[4762]: I1014 13:09:45.271271 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle" Oct 14 13:09:45.272446 master-2 kubenswrapper[4762]: I1014 13:09:45.271318 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls" Oct 14 13:09:45.272446 master-2 kubenswrapper[4762]: I1014 13:09:45.271266 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/97348241-e1d9-4dd5-bcaa-762088570022-telemeter-client-tls\") pod \"telemeter-client-56c4f9c4b6-s6gwn\" (UID: \"97348241-e1d9-4dd5-bcaa-762088570022\") " pod="openshift-monitoring/telemeter-client-56c4f9c4b6-s6gwn" Oct 14 13:09:45.272446 master-2 kubenswrapper[4762]: I1014 13:09:45.271531 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-2hutru8havafv" Oct 14 13:09:45.272446 master-2 kubenswrapper[4762]: I1014 13:09:45.271602 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/97348241-e1d9-4dd5-bcaa-762088570022-federate-client-tls\") pod \"telemeter-client-56c4f9c4b6-s6gwn\" (UID: \"97348241-e1d9-4dd5-bcaa-762088570022\") " pod="openshift-monitoring/telemeter-client-56c4f9c4b6-s6gwn" Oct 14 13:09:45.272446 master-2 kubenswrapper[4762]: I1014 13:09:45.271718 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" 
(UniqueName: \"kubernetes.io/secret/97348241-e1d9-4dd5-bcaa-762088570022-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-56c4f9c4b6-s6gwn\" (UID: \"97348241-e1d9-4dd5-bcaa-762088570022\") " pod="openshift-monitoring/telemeter-client-56c4f9c4b6-s6gwn" Oct 14 13:09:45.272446 master-2 kubenswrapper[4762]: I1014 13:09:45.271825 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97348241-e1d9-4dd5-bcaa-762088570022-serving-certs-ca-bundle\") pod \"telemeter-client-56c4f9c4b6-s6gwn\" (UID: \"97348241-e1d9-4dd5-bcaa-762088570022\") " pod="openshift-monitoring/telemeter-client-56c4f9c4b6-s6gwn" Oct 14 13:09:45.272446 master-2 kubenswrapper[4762]: I1014 13:09:45.271899 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/97348241-e1d9-4dd5-bcaa-762088570022-secret-telemeter-client\") pod \"telemeter-client-56c4f9c4b6-s6gwn\" (UID: \"97348241-e1d9-4dd5-bcaa-762088570022\") " pod="openshift-monitoring/telemeter-client-56c4f9c4b6-s6gwn" Oct 14 13:09:45.305595 master-2 kubenswrapper[4762]: I1014 13:09:45.275172 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-8475fbcb68-8dq9n"] Oct 14 13:09:45.348233 master-2 kubenswrapper[4762]: I1014 13:09:45.347825 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-7b6b7bb859-vrzvk"] Oct 14 13:09:45.356953 master-2 kubenswrapper[4762]: W1014 13:09:45.356887 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode19cccdb_ac9b_4919_85d8_d7ae33d2d003.slice/crio-830f07bc415dab46f5d47ff4e752b5723c9ad2b1073b43f1e1ddd13a0813a515 WatchSource:0}: Error finding container 830f07bc415dab46f5d47ff4e752b5723c9ad2b1073b43f1e1ddd13a0813a515: Status 404 returned error can't find the container with id 830f07bc415dab46f5d47ff4e752b5723c9ad2b1073b43f1e1ddd13a0813a515 Oct 14 13:09:45.374693 master-2 kubenswrapper[4762]: I1014 13:09:45.374569 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97348241-e1d9-4dd5-bcaa-762088570022-telemeter-trusted-ca-bundle\") pod \"telemeter-client-56c4f9c4b6-s6gwn\" (UID: \"97348241-e1d9-4dd5-bcaa-762088570022\") " pod="openshift-monitoring/telemeter-client-56c4f9c4b6-s6gwn" Oct 14 13:09:45.374955 master-2 kubenswrapper[4762]: I1014 13:09:45.374803 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/949ffee6-8997-4b92-84c3-4aeb1121bbe1-secret-metrics-server-tls\") pod \"metrics-server-8475fbcb68-8dq9n\" (UID: \"949ffee6-8997-4b92-84c3-4aeb1121bbe1\") " pod="openshift-monitoring/metrics-server-8475fbcb68-8dq9n" Oct 14 13:09:45.374955 master-2 kubenswrapper[4762]: I1014 13:09:45.374853 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/949ffee6-8997-4b92-84c3-4aeb1121bbe1-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-8475fbcb68-8dq9n\" (UID: \"949ffee6-8997-4b92-84c3-4aeb1121bbe1\") " pod="openshift-monitoring/metrics-server-8475fbcb68-8dq9n" Oct 14 13:09:45.375146 master-2 
kubenswrapper[4762]: I1014 13:09:45.375091 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tx5d\" (UniqueName: \"kubernetes.io/projected/949ffee6-8997-4b92-84c3-4aeb1121bbe1-kube-api-access-9tx5d\") pod \"metrics-server-8475fbcb68-8dq9n\" (UID: \"949ffee6-8997-4b92-84c3-4aeb1121bbe1\") " pod="openshift-monitoring/metrics-server-8475fbcb68-8dq9n" Oct 14 13:09:45.375245 master-2 kubenswrapper[4762]: I1014 13:09:45.375212 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/97348241-e1d9-4dd5-bcaa-762088570022-metrics-client-ca\") pod \"telemeter-client-56c4f9c4b6-s6gwn\" (UID: \"97348241-e1d9-4dd5-bcaa-762088570022\") " pod="openshift-monitoring/telemeter-client-56c4f9c4b6-s6gwn" Oct 14 13:09:45.375361 master-2 kubenswrapper[4762]: I1014 13:09:45.375330 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-555ls\" (UniqueName: \"kubernetes.io/projected/97348241-e1d9-4dd5-bcaa-762088570022-kube-api-access-555ls\") pod \"telemeter-client-56c4f9c4b6-s6gwn\" (UID: \"97348241-e1d9-4dd5-bcaa-762088570022\") " pod="openshift-monitoring/telemeter-client-56c4f9c4b6-s6gwn" Oct 14 13:09:45.375410 master-2 kubenswrapper[4762]: I1014 13:09:45.375390 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/949ffee6-8997-4b92-84c3-4aeb1121bbe1-audit-log\") pod \"metrics-server-8475fbcb68-8dq9n\" (UID: \"949ffee6-8997-4b92-84c3-4aeb1121bbe1\") " pod="openshift-monitoring/metrics-server-8475fbcb68-8dq9n" Oct 14 13:09:45.375443 master-2 kubenswrapper[4762]: I1014 13:09:45.375421 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/97348241-e1d9-4dd5-bcaa-762088570022-telemeter-client-tls\") pod \"telemeter-client-56c4f9c4b6-s6gwn\" (UID: \"97348241-e1d9-4dd5-bcaa-762088570022\") " pod="openshift-monitoring/telemeter-client-56c4f9c4b6-s6gwn" Oct 14 13:09:45.375477 master-2 kubenswrapper[4762]: I1014 13:09:45.375447 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/949ffee6-8997-4b92-84c3-4aeb1121bbe1-secret-metrics-client-certs\") pod \"metrics-server-8475fbcb68-8dq9n\" (UID: \"949ffee6-8997-4b92-84c3-4aeb1121bbe1\") " pod="openshift-monitoring/metrics-server-8475fbcb68-8dq9n" Oct 14 13:09:45.375477 master-2 kubenswrapper[4762]: I1014 13:09:45.375470 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949ffee6-8997-4b92-84c3-4aeb1121bbe1-client-ca-bundle\") pod \"metrics-server-8475fbcb68-8dq9n\" (UID: \"949ffee6-8997-4b92-84c3-4aeb1121bbe1\") " pod="openshift-monitoring/metrics-server-8475fbcb68-8dq9n" Oct 14 13:09:45.375632 master-2 kubenswrapper[4762]: I1014 13:09:45.375581 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/97348241-e1d9-4dd5-bcaa-762088570022-federate-client-tls\") pod \"telemeter-client-56c4f9c4b6-s6gwn\" (UID: \"97348241-e1d9-4dd5-bcaa-762088570022\") " pod="openshift-monitoring/telemeter-client-56c4f9c4b6-s6gwn" Oct 14 13:09:45.375705 master-2 kubenswrapper[4762]: I1014 13:09:45.375679 4762 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/97348241-e1d9-4dd5-bcaa-762088570022-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-56c4f9c4b6-s6gwn\" (UID: \"97348241-e1d9-4dd5-bcaa-762088570022\") " pod="openshift-monitoring/telemeter-client-56c4f9c4b6-s6gwn" Oct 14 13:09:45.375775 master-2 kubenswrapper[4762]: I1014 13:09:45.375750 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/949ffee6-8997-4b92-84c3-4aeb1121bbe1-metrics-server-audit-profiles\") pod \"metrics-server-8475fbcb68-8dq9n\" (UID: \"949ffee6-8997-4b92-84c3-4aeb1121bbe1\") " pod="openshift-monitoring/metrics-server-8475fbcb68-8dq9n" Oct 14 13:09:45.375837 master-2 kubenswrapper[4762]: I1014 13:09:45.375809 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97348241-e1d9-4dd5-bcaa-762088570022-serving-certs-ca-bundle\") pod \"telemeter-client-56c4f9c4b6-s6gwn\" (UID: \"97348241-e1d9-4dd5-bcaa-762088570022\") " pod="openshift-monitoring/telemeter-client-56c4f9c4b6-s6gwn" Oct 14 13:09:45.375966 master-2 kubenswrapper[4762]: I1014 13:09:45.375852 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/97348241-e1d9-4dd5-bcaa-762088570022-secret-telemeter-client\") pod \"telemeter-client-56c4f9c4b6-s6gwn\" (UID: \"97348241-e1d9-4dd5-bcaa-762088570022\") " pod="openshift-monitoring/telemeter-client-56c4f9c4b6-s6gwn" Oct 14 13:09:45.377234 master-2 kubenswrapper[4762]: I1014 13:09:45.376913 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/97348241-e1d9-4dd5-bcaa-762088570022-metrics-client-ca\") pod \"telemeter-client-56c4f9c4b6-s6gwn\" (UID: \"97348241-e1d9-4dd5-bcaa-762088570022\") " pod="openshift-monitoring/telemeter-client-56c4f9c4b6-s6gwn" Oct 14 13:09:45.377898 master-2 kubenswrapper[4762]: I1014 13:09:45.377858 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97348241-e1d9-4dd5-bcaa-762088570022-telemeter-trusted-ca-bundle\") pod \"telemeter-client-56c4f9c4b6-s6gwn\" (UID: \"97348241-e1d9-4dd5-bcaa-762088570022\") " pod="openshift-monitoring/telemeter-client-56c4f9c4b6-s6gwn" Oct 14 13:09:45.379829 master-2 kubenswrapper[4762]: I1014 13:09:45.379789 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97348241-e1d9-4dd5-bcaa-762088570022-serving-certs-ca-bundle\") pod \"telemeter-client-56c4f9c4b6-s6gwn\" (UID: \"97348241-e1d9-4dd5-bcaa-762088570022\") " pod="openshift-monitoring/telemeter-client-56c4f9c4b6-s6gwn" Oct 14 13:09:45.381508 master-2 kubenswrapper[4762]: I1014 13:09:45.381414 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/97348241-e1d9-4dd5-bcaa-762088570022-federate-client-tls\") pod \"telemeter-client-56c4f9c4b6-s6gwn\" (UID: \"97348241-e1d9-4dd5-bcaa-762088570022\") " pod="openshift-monitoring/telemeter-client-56c4f9c4b6-s6gwn" Oct 14 13:09:45.381605 master-2 kubenswrapper[4762]: I1014 13:09:45.381558 4762 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/97348241-e1d9-4dd5-bcaa-762088570022-secret-telemeter-client\") pod \"telemeter-client-56c4f9c4b6-s6gwn\" (UID: \"97348241-e1d9-4dd5-bcaa-762088570022\") " pod="openshift-monitoring/telemeter-client-56c4f9c4b6-s6gwn" Oct 14 13:09:45.382012 master-2 kubenswrapper[4762]: I1014 13:09:45.381975 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/97348241-e1d9-4dd5-bcaa-762088570022-telemeter-client-tls\") pod \"telemeter-client-56c4f9c4b6-s6gwn\" (UID: \"97348241-e1d9-4dd5-bcaa-762088570022\") " pod="openshift-monitoring/telemeter-client-56c4f9c4b6-s6gwn" Oct 14 13:09:45.382881 master-2 kubenswrapper[4762]: I1014 13:09:45.382835 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/97348241-e1d9-4dd5-bcaa-762088570022-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-56c4f9c4b6-s6gwn\" (UID: \"97348241-e1d9-4dd5-bcaa-762088570022\") " pod="openshift-monitoring/telemeter-client-56c4f9c4b6-s6gwn" Oct 14 13:09:45.407043 master-2 kubenswrapper[4762]: I1014 13:09:45.405433 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-555ls\" (UniqueName: \"kubernetes.io/projected/97348241-e1d9-4dd5-bcaa-762088570022-kube-api-access-555ls\") pod \"telemeter-client-56c4f9c4b6-s6gwn\" (UID: \"97348241-e1d9-4dd5-bcaa-762088570022\") " pod="openshift-monitoring/telemeter-client-56c4f9c4b6-s6gwn" Oct 14 13:09:45.460896 master-2 kubenswrapper[4762]: I1014 13:09:45.455630 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-56c4f9c4b6-s6gwn" Oct 14 13:09:45.460896 master-2 kubenswrapper[4762]: I1014 13:09:45.460468 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-frksz" Oct 14 13:09:45.460896 master-2 kubenswrapper[4762]: I1014 13:09:45.460500 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-frksz" Oct 14 13:09:45.477041 master-2 kubenswrapper[4762]: I1014 13:09:45.476995 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/949ffee6-8997-4b92-84c3-4aeb1121bbe1-audit-log\") pod \"metrics-server-8475fbcb68-8dq9n\" (UID: \"949ffee6-8997-4b92-84c3-4aeb1121bbe1\") " pod="openshift-monitoring/metrics-server-8475fbcb68-8dq9n" Oct 14 13:09:45.477136 master-2 kubenswrapper[4762]: I1014 13:09:45.477062 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/949ffee6-8997-4b92-84c3-4aeb1121bbe1-secret-metrics-client-certs\") pod \"metrics-server-8475fbcb68-8dq9n\" (UID: \"949ffee6-8997-4b92-84c3-4aeb1121bbe1\") " pod="openshift-monitoring/metrics-server-8475fbcb68-8dq9n" Oct 14 13:09:45.477136 master-2 kubenswrapper[4762]: I1014 13:09:45.477087 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949ffee6-8997-4b92-84c3-4aeb1121bbe1-client-ca-bundle\") pod \"metrics-server-8475fbcb68-8dq9n\" (UID: \"949ffee6-8997-4b92-84c3-4aeb1121bbe1\") " pod="openshift-monitoring/metrics-server-8475fbcb68-8dq9n" Oct 14 13:09:45.477268 master-2 kubenswrapper[4762]: I1014 13:09:45.477170 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/949ffee6-8997-4b92-84c3-4aeb1121bbe1-metrics-server-audit-profiles\") pod \"metrics-server-8475fbcb68-8dq9n\" (UID: \"949ffee6-8997-4b92-84c3-4aeb1121bbe1\") " pod="openshift-monitoring/metrics-server-8475fbcb68-8dq9n" Oct 14 13:09:45.477268 master-2 kubenswrapper[4762]: I1014 13:09:45.477230 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/949ffee6-8997-4b92-84c3-4aeb1121bbe1-secret-metrics-server-tls\") pod \"metrics-server-8475fbcb68-8dq9n\" (UID: \"949ffee6-8997-4b92-84c3-4aeb1121bbe1\") " pod="openshift-monitoring/metrics-server-8475fbcb68-8dq9n" Oct 14 13:09:45.477362 master-2 kubenswrapper[4762]: I1014 13:09:45.477266 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/949ffee6-8997-4b92-84c3-4aeb1121bbe1-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-8475fbcb68-8dq9n\" (UID: \"949ffee6-8997-4b92-84c3-4aeb1121bbe1\") " pod="openshift-monitoring/metrics-server-8475fbcb68-8dq9n" Oct 14 13:09:45.477713 master-2 kubenswrapper[4762]: I1014 13:09:45.477659 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/949ffee6-8997-4b92-84c3-4aeb1121bbe1-audit-log\") pod \"metrics-server-8475fbcb68-8dq9n\" (UID: \"949ffee6-8997-4b92-84c3-4aeb1121bbe1\") " pod="openshift-monitoring/metrics-server-8475fbcb68-8dq9n" Oct 14 13:09:45.478928 master-2 kubenswrapper[4762]: 
I1014 13:09:45.478890 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/949ffee6-8997-4b92-84c3-4aeb1121bbe1-metrics-server-audit-profiles\") pod \"metrics-server-8475fbcb68-8dq9n\" (UID: \"949ffee6-8997-4b92-84c3-4aeb1121bbe1\") " pod="openshift-monitoring/metrics-server-8475fbcb68-8dq9n" Oct 14 13:09:45.479032 master-2 kubenswrapper[4762]: I1014 13:09:45.478949 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/949ffee6-8997-4b92-84c3-4aeb1121bbe1-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-8475fbcb68-8dq9n\" (UID: \"949ffee6-8997-4b92-84c3-4aeb1121bbe1\") " pod="openshift-monitoring/metrics-server-8475fbcb68-8dq9n" Oct 14 13:09:45.479032 master-2 kubenswrapper[4762]: I1014 13:09:45.479015 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tx5d\" (UniqueName: \"kubernetes.io/projected/949ffee6-8997-4b92-84c3-4aeb1121bbe1-kube-api-access-9tx5d\") pod \"metrics-server-8475fbcb68-8dq9n\" (UID: \"949ffee6-8997-4b92-84c3-4aeb1121bbe1\") " pod="openshift-monitoring/metrics-server-8475fbcb68-8dq9n" Oct 14 13:09:45.481137 master-2 kubenswrapper[4762]: I1014 13:09:45.480924 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/949ffee6-8997-4b92-84c3-4aeb1121bbe1-secret-metrics-server-tls\") pod \"metrics-server-8475fbcb68-8dq9n\" (UID: \"949ffee6-8997-4b92-84c3-4aeb1121bbe1\") " pod="openshift-monitoring/metrics-server-8475fbcb68-8dq9n" Oct 14 13:09:45.483232 master-2 kubenswrapper[4762]: I1014 13:09:45.483184 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/949ffee6-8997-4b92-84c3-4aeb1121bbe1-secret-metrics-client-certs\") pod \"metrics-server-8475fbcb68-8dq9n\" (UID: \"949ffee6-8997-4b92-84c3-4aeb1121bbe1\") " pod="openshift-monitoring/metrics-server-8475fbcb68-8dq9n" Oct 14 13:09:45.493929 master-2 kubenswrapper[4762]: I1014 13:09:45.493880 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949ffee6-8997-4b92-84c3-4aeb1121bbe1-client-ca-bundle\") pod \"metrics-server-8475fbcb68-8dq9n\" (UID: \"949ffee6-8997-4b92-84c3-4aeb1121bbe1\") " pod="openshift-monitoring/metrics-server-8475fbcb68-8dq9n" Oct 14 13:09:45.499416 master-2 kubenswrapper[4762]: I1014 13:09:45.499362 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tx5d\" (UniqueName: \"kubernetes.io/projected/949ffee6-8997-4b92-84c3-4aeb1121bbe1-kube-api-access-9tx5d\") pod \"metrics-server-8475fbcb68-8dq9n\" (UID: \"949ffee6-8997-4b92-84c3-4aeb1121bbe1\") " pod="openshift-monitoring/metrics-server-8475fbcb68-8dq9n" Oct 14 13:09:45.513828 master-2 kubenswrapper[4762]: I1014 13:09:45.513770 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-frksz" Oct 14 13:09:45.627182 master-2 kubenswrapper[4762]: I1014 13:09:45.626601 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-8475fbcb68-8dq9n" Oct 14 13:09:45.680702 master-2 kubenswrapper[4762]: I1014 13:09:45.680623 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-7b6b7bb859-vrzvk" event={"ID":"e19cccdb-ac9b-4919-85d8-d7ae33d2d003","Type":"ContainerStarted","Data":"830f07bc415dab46f5d47ff4e752b5723c9ad2b1073b43f1e1ddd13a0813a515"} Oct 14 13:09:45.853994 master-2 kubenswrapper[4762]: I1014 13:09:45.853911 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-56c4f9c4b6-s6gwn"] Oct 14 13:09:45.896529 master-2 kubenswrapper[4762]: I1014 13:09:45.896362 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:09:45.896529 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:09:45.896529 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:09:45.896529 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:09:45.896529 master-2 kubenswrapper[4762]: I1014 13:09:45.896447 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:09:46.036260 master-2 kubenswrapper[4762]: I1014 13:09:46.032776 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-8475fbcb68-8dq9n"] Oct 14 13:09:46.667178 master-2 kubenswrapper[4762]: I1014 13:09:46.667076 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xl9gv" Oct 14 13:09:46.667178 master-2 kubenswrapper[4762]: I1014 13:09:46.667138 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xl9gv" Oct 14 13:09:46.690805 master-2 kubenswrapper[4762]: I1014 13:09:46.690683 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-8475fbcb68-8dq9n" event={"ID":"949ffee6-8997-4b92-84c3-4aeb1121bbe1","Type":"ContainerStarted","Data":"7b246d3ce744eca8eaaea6e71479fe531a610918f55baa77f52873670b4b79e9"} Oct 14 13:09:46.692195 master-2 kubenswrapper[4762]: I1014 13:09:46.692100 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-56c4f9c4b6-s6gwn" event={"ID":"97348241-e1d9-4dd5-bcaa-762088570022","Type":"ContainerStarted","Data":"d4176aad017a0e5b97410b3d756eaca85f23bf796d9dffc3b17f5e61ba2b7db5"} Oct 14 13:09:46.898421 master-2 kubenswrapper[4762]: I1014 13:09:46.898345 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:09:46.898421 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:09:46.898421 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:09:46.898421 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:09:46.898421 master-2 kubenswrapper[4762]: I1014 13:09:46.898432 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" 
podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:09:47.175537 master-2 kubenswrapper[4762]: I1014 13:09:47.175494 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5ddb89f76-887cs_f82e0c58-e2a3-491a-bf03-ad47b38c5833/router/0.log" Oct 14 13:09:47.552207 master-2 kubenswrapper[4762]: I1014 13:09:47.552133 4762 scope.go:117] "RemoveContainer" containerID="0a39b10e2fb121ce0536d44cd9b7911391264299af43b6661cf503ece1eed26a" Oct 14 13:09:47.571215 master-2 kubenswrapper[4762]: I1014 13:09:47.571048 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-oauth-apiserver_apiserver-c57444595-mj7cx_2b69dba3-5ac1-4eb9-bba6-0d0662ab8544/fix-audit-permissions/0.log" Oct 14 13:09:47.702068 master-2 kubenswrapper[4762]: I1014 13:09:47.701929 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-7b6b7bb859-vrzvk" event={"ID":"e19cccdb-ac9b-4919-85d8-d7ae33d2d003","Type":"ContainerStarted","Data":"c8dfd15a32985bd086d3b821b1137ea95df04ba46fa985bc8180bf62869ea7f9"} Oct 14 13:09:47.702068 master-2 kubenswrapper[4762]: I1014 13:09:47.701988 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-7b6b7bb859-vrzvk" event={"ID":"e19cccdb-ac9b-4919-85d8-d7ae33d2d003","Type":"ContainerStarted","Data":"b4d790b1636493087694498f992b9121f3a37a50f6ab46979d6eb4f576d882ee"} Oct 14 13:09:47.715343 master-2 kubenswrapper[4762]: I1014 13:09:47.715295 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-7b6b7bb859-vrzvk" podStartSLOduration=1.8851686600000002 podStartE2EDuration="3.71528262s" podCreationTimestamp="2025-10-14 13:09:44 +0000 UTC" firstStartedPulling="2025-10-14 13:09:45.358981706 +0000 UTC m=+214.603140865" lastFinishedPulling="2025-10-14 13:09:47.189095656 +0000 UTC m=+216.433254825" observedRunningTime="2025-10-14 13:09:47.712699068 +0000 UTC m=+216.956858237" watchObservedRunningTime="2025-10-14 13:09:47.71528262 +0000 UTC m=+216.959441779" Oct 14 13:09:47.726793 master-2 kubenswrapper[4762]: I1014 13:09:47.726735 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xl9gv" podUID="c551a119-e58d-46c3-9f81-7c0400c70c27" containerName="registry-server" probeResult="failure" output=< Oct 14 13:09:47.726793 master-2 kubenswrapper[4762]: timeout: failed to connect service ":50051" within 1s Oct 14 13:09:47.726793 master-2 kubenswrapper[4762]: > Oct 14 13:09:47.779789 master-2 kubenswrapper[4762]: I1014 13:09:47.779711 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-oauth-apiserver_apiserver-c57444595-mj7cx_2b69dba3-5ac1-4eb9-bba6-0d0662ab8544/oauth-apiserver/0.log" Oct 14 13:09:47.897742 master-2 kubenswrapper[4762]: I1014 13:09:47.897641 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:09:47.897742 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:09:47.897742 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:09:47.897742 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:09:47.899280 master-2 kubenswrapper[4762]: I1014 13:09:47.897765 4762 prober.go:107] 
"Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:09:48.897873 master-2 kubenswrapper[4762]: I1014 13:09:48.897775 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:09:48.897873 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:09:48.897873 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:09:48.897873 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:09:48.897873 master-2 kubenswrapper[4762]: I1014 13:09:48.897845 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:09:49.719837 master-2 kubenswrapper[4762]: I1014 13:09:49.719748 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-8475fbcb68-8dq9n" event={"ID":"949ffee6-8997-4b92-84c3-4aeb1121bbe1","Type":"ContainerStarted","Data":"bbd14ec96da76e6b6b207839405f5858f9bdefe7cec9b0ffa533a5f314702f25"} Oct 14 13:09:49.720237 master-2 kubenswrapper[4762]: I1014 13:09:49.720066 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-8475fbcb68-8dq9n" Oct 14 13:09:49.722310 master-2 kubenswrapper[4762]: I1014 13:09:49.722282 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-779749f859-bscv5_18346e46-a062-4e0d-b90a-c05646a46c7e/kube-rbac-proxy/4.log" Oct 14 13:09:49.723625 master-2 kubenswrapper[4762]: I1014 13:09:49.723571 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-779749f859-bscv5" event={"ID":"18346e46-a062-4e0d-b90a-c05646a46c7e","Type":"ContainerStarted","Data":"c9ae0e5b574ac8a99ca0949d5b2df3ccae6fbedffdb7caebf8001e57fc28991f"} Oct 14 13:09:49.726561 master-2 kubenswrapper[4762]: I1014 13:09:49.726514 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-56c4f9c4b6-s6gwn" event={"ID":"97348241-e1d9-4dd5-bcaa-762088570022","Type":"ContainerStarted","Data":"8430d84849e76014bc106feba0b9fe13369309504b226957f5e457b302d616e1"} Oct 14 13:09:49.741813 master-2 kubenswrapper[4762]: I1014 13:09:49.741711 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-8475fbcb68-8dq9n" podStartSLOduration=2.137625966 podStartE2EDuration="4.741692674s" podCreationTimestamp="2025-10-14 13:09:45 +0000 UTC" firstStartedPulling="2025-10-14 13:09:46.045857341 +0000 UTC m=+215.290016540" lastFinishedPulling="2025-10-14 13:09:48.649924079 +0000 UTC m=+217.894083248" observedRunningTime="2025-10-14 13:09:49.739807514 +0000 UTC m=+218.983966743" watchObservedRunningTime="2025-10-14 13:09:49.741692674 +0000 UTC m=+218.985851833" Oct 14 13:09:49.758895 master-2 kubenswrapper[4762]: I1014 13:09:49.758811 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-779749f859-bscv5" podStartSLOduration=176.758780098 podStartE2EDuration="2m56.758780098s" podCreationTimestamp="2025-10-14 13:06:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:09:49.756357371 +0000 UTC m=+219.000516590" watchObservedRunningTime="2025-10-14 13:09:49.758780098 +0000 UTC m=+219.002939287" Oct 14 13:09:49.897723 master-2 kubenswrapper[4762]: I1014 13:09:49.897628 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:09:49.897723 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:09:49.897723 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:09:49.897723 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:09:49.897723 master-2 kubenswrapper[4762]: I1014 13:09:49.897722 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:09:50.736819 master-2 kubenswrapper[4762]: I1014 13:09:50.736680 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-56c4f9c4b6-s6gwn" event={"ID":"97348241-e1d9-4dd5-bcaa-762088570022","Type":"ContainerStarted","Data":"a0b97ffaf02ca4c84099de34760deddff8d0701f2f0d92823405d8ef28a95506"} Oct 14 13:09:50.736819 master-2 kubenswrapper[4762]: I1014 13:09:50.736802 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-56c4f9c4b6-s6gwn" event={"ID":"97348241-e1d9-4dd5-bcaa-762088570022","Type":"ContainerStarted","Data":"69991d183090e5cc61f089142cc9dc25bee6e12391de6ceb0baeafde17d487e9"} Oct 14 13:09:50.761750 master-2 kubenswrapper[4762]: I1014 13:09:50.761650 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-56c4f9c4b6-s6gwn" podStartSLOduration=1.287118068 podStartE2EDuration="5.76162396s" podCreationTimestamp="2025-10-14 13:09:45 +0000 UTC" firstStartedPulling="2025-10-14 13:09:45.867468577 +0000 UTC m=+215.111627756" lastFinishedPulling="2025-10-14 13:09:50.341974489 +0000 UTC m=+219.586133648" observedRunningTime="2025-10-14 13:09:50.758583563 +0000 UTC m=+220.002742812" watchObservedRunningTime="2025-10-14 13:09:50.76162396 +0000 UTC m=+220.005783129" Oct 14 13:09:50.897822 master-2 kubenswrapper[4762]: I1014 13:09:50.897754 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:09:50.897822 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:09:50.897822 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:09:50.897822 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:09:50.898874 master-2 kubenswrapper[4762]: I1014 13:09:50.897841 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:09:50.975288 master-2 kubenswrapper[4762]: I1014 13:09:50.975135 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5ddb89f76-887cs_f82e0c58-e2a3-491a-bf03-ad47b38c5833/router/0.log" Oct 14 13:09:51.898447 master-2 kubenswrapper[4762]: I1014 13:09:51.898302 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:09:51.898447 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:09:51.898447 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:09:51.898447 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:09:51.899735 master-2 kubenswrapper[4762]: I1014 13:09:51.898449 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:09:52.898592 master-2 kubenswrapper[4762]: I1014 13:09:52.898474 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:09:52.898592 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:09:52.898592 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:09:52.898592 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:09:52.898592 master-2 kubenswrapper[4762]: I1014 13:09:52.898579 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:09:53.499672 master-2 kubenswrapper[4762]: E1014 13:09:53.499601 4762 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b371d25e7f0e819ea7586a72533618b13846c1d5a3fdddd549ef75da6a9613c2" cmd=["/bin/bash","-c","test -f /ready/ready"] Oct 14 13:09:53.501290 master-2 kubenswrapper[4762]: E1014 13:09:53.501220 4762 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b371d25e7f0e819ea7586a72533618b13846c1d5a3fdddd549ef75da6a9613c2" cmd=["/bin/bash","-c","test -f /ready/ready"] Oct 14 13:09:53.502465 master-2 kubenswrapper[4762]: E1014 13:09:53.502404 4762 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b371d25e7f0e819ea7586a72533618b13846c1d5a3fdddd549ef75da6a9613c2" cmd=["/bin/bash","-c","test -f /ready/ready"] Oct 14 13:09:53.502465 master-2 kubenswrapper[4762]: E1014 13:09:53.502451 4762 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" 
pod="openshift-multus/cni-sysctl-allowlist-ds-sg92v" podUID="970e8ee9-e505-4a07-9662-362652cf6b3b" containerName="kube-multus-additional-cni-plugins" Oct 14 13:09:53.866699 master-2 kubenswrapper[4762]: I1014 13:09:53.866489 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kpbmd" Oct 14 13:09:53.866699 master-2 kubenswrapper[4762]: I1014 13:09:53.866594 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kpbmd" Oct 14 13:09:53.897979 master-2 kubenswrapper[4762]: I1014 13:09:53.897870 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:09:53.897979 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:09:53.897979 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:09:53.897979 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:09:53.897979 master-2 kubenswrapper[4762]: I1014 13:09:53.897973 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:09:53.930186 master-2 kubenswrapper[4762]: I1014 13:09:53.930103 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kpbmd" Oct 14 13:09:54.065603 master-2 kubenswrapper[4762]: I1014 13:09:54.065514 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cf69d" Oct 14 13:09:54.066104 master-2 kubenswrapper[4762]: I1014 13:09:54.066058 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cf69d" Oct 14 13:09:54.132702 master-2 kubenswrapper[4762]: I1014 13:09:54.132459 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cf69d" Oct 14 13:09:54.818271 master-2 kubenswrapper[4762]: I1014 13:09:54.818204 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kpbmd" Oct 14 13:09:54.819387 master-2 kubenswrapper[4762]: I1014 13:09:54.819351 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cf69d" Oct 14 13:09:54.897858 master-2 kubenswrapper[4762]: I1014 13:09:54.897781 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:09:54.897858 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:09:54.897858 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:09:54.897858 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:09:54.898210 master-2 kubenswrapper[4762]: I1014 13:09:54.897882 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 
13:09:55.168240 master-2 kubenswrapper[4762]: I1014 13:09:55.168139 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver_apiserver-6576f6bc9d-r2fhv_3964407d-3235-4331-bee0-0188f908f6c8/fix-audit-permissions/0.log" Oct 14 13:09:55.376235 master-2 kubenswrapper[4762]: I1014 13:09:55.376097 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver_apiserver-6576f6bc9d-r2fhv_3964407d-3235-4331-bee0-0188f908f6c8/openshift-apiserver/0.log" Oct 14 13:09:55.515249 master-2 kubenswrapper[4762]: I1014 13:09:55.515007 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-frksz" Oct 14 13:09:55.573307 master-2 kubenswrapper[4762]: I1014 13:09:55.573196 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver_apiserver-6576f6bc9d-r2fhv_3964407d-3235-4331-bee0-0188f908f6c8/openshift-apiserver-check-endpoints/0.log" Oct 14 13:09:55.897753 master-2 kubenswrapper[4762]: I1014 13:09:55.897476 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:09:55.897753 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:09:55.897753 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:09:55.897753 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:09:55.897753 master-2 kubenswrapper[4762]: I1014 13:09:55.897599 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:09:56.230404 master-2 kubenswrapper[4762]: I1014 13:09:56.230328 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/db8f34cd-ecf2-4682-b1fd-b8e335369cb9-client-ca\") pod \"route-controller-manager-6f6c689d49-xd4xv\" (UID: \"db8f34cd-ecf2-4682-b1fd-b8e335369cb9\") " pod="openshift-route-controller-manager/route-controller-manager-6f6c689d49-xd4xv" Oct 14 13:09:56.231510 master-2 kubenswrapper[4762]: E1014 13:09:56.230516 4762 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Oct 14 13:09:56.231632 master-2 kubenswrapper[4762]: E1014 13:09:56.231598 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/db8f34cd-ecf2-4682-b1fd-b8e335369cb9-client-ca podName:db8f34cd-ecf2-4682-b1fd-b8e335369cb9 nodeName:}" failed. No retries permitted until 2025-10-14 13:11:00.231558667 +0000 UTC m=+289.475717866 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/db8f34cd-ecf2-4682-b1fd-b8e335369cb9-client-ca") pod "route-controller-manager-6f6c689d49-xd4xv" (UID: "db8f34cd-ecf2-4682-b1fd-b8e335369cb9") : configmap "client-ca" not found Oct 14 13:09:56.716118 master-2 kubenswrapper[4762]: I1014 13:09:56.716042 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xl9gv" Oct 14 13:09:56.762277 master-2 kubenswrapper[4762]: I1014 13:09:56.762219 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xl9gv" Oct 14 13:09:56.897487 master-2 kubenswrapper[4762]: I1014 13:09:56.897359 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:09:56.897487 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:09:56.897487 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:09:56.897487 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:09:56.897832 master-2 kubenswrapper[4762]: I1014 13:09:56.897500 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:09:57.899993 master-2 kubenswrapper[4762]: I1014 13:09:57.899902 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:09:57.899993 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:09:57.899993 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:09:57.899993 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:09:57.899993 master-2 kubenswrapper[4762]: I1014 13:09:57.899976 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:09:58.772852 master-2 kubenswrapper[4762]: I1014 13:09:58.772763 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_packageserver-6f5778dccb-9sfms_15a54d1d-6715-4afe-b6aa-8765dc254e96/packageserver/0.log" Oct 14 13:09:58.897050 master-2 kubenswrapper[4762]: I1014 13:09:58.896980 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:09:58.897050 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:09:58.897050 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:09:58.897050 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:09:58.897432 master-2 kubenswrapper[4762]: I1014 13:09:58.897068 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" 
output="HTTP probe failed with statuscode: 500" Oct 14 13:09:59.897842 master-2 kubenswrapper[4762]: I1014 13:09:59.897766 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:09:59.897842 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:09:59.897842 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:09:59.897842 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:09:59.898926 master-2 kubenswrapper[4762]: I1014 13:09:59.897870 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:00.898343 master-2 kubenswrapper[4762]: I1014 13:10:00.898199 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:10:00.898343 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:10:00.898343 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:10:00.898343 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:10:00.899586 master-2 kubenswrapper[4762]: I1014 13:10:00.898379 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:01.897269 master-2 kubenswrapper[4762]: I1014 13:10:01.897197 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:10:01.897269 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:10:01.897269 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:10:01.897269 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:10:01.897269 master-2 kubenswrapper[4762]: I1014 13:10:01.897261 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:02.898632 master-2 kubenswrapper[4762]: I1014 13:10:02.898542 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:10:02.898632 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:10:02.898632 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:10:02.898632 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:10:02.898632 master-2 kubenswrapper[4762]: I1014 13:10:02.898631 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" 
output="HTTP probe failed with statuscode: 500" Oct 14 13:10:03.500459 master-2 kubenswrapper[4762]: E1014 13:10:03.500356 4762 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b371d25e7f0e819ea7586a72533618b13846c1d5a3fdddd549ef75da6a9613c2" cmd=["/bin/bash","-c","test -f /ready/ready"] Oct 14 13:10:03.501914 master-2 kubenswrapper[4762]: E1014 13:10:03.501859 4762 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b371d25e7f0e819ea7586a72533618b13846c1d5a3fdddd549ef75da6a9613c2" cmd=["/bin/bash","-c","test -f /ready/ready"] Oct 14 13:10:03.503693 master-2 kubenswrapper[4762]: E1014 13:10:03.503640 4762 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b371d25e7f0e819ea7586a72533618b13846c1d5a3fdddd549ef75da6a9613c2" cmd=["/bin/bash","-c","test -f /ready/ready"] Oct 14 13:10:03.503817 master-2 kubenswrapper[4762]: E1014 13:10:03.503715 4762 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-sg92v" podUID="970e8ee9-e505-4a07-9662-362652cf6b3b" containerName="kube-multus-additional-cni-plugins" Oct 14 13:10:03.896661 master-2 kubenswrapper[4762]: I1014 13:10:03.896591 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:10:03.896661 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:10:03.896661 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:10:03.896661 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:10:03.897076 master-2 kubenswrapper[4762]: I1014 13:10:03.896681 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:04.897735 master-2 kubenswrapper[4762]: I1014 13:10:04.897646 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:10:04.897735 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:10:04.897735 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:10:04.897735 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:10:04.897735 master-2 kubenswrapper[4762]: I1014 13:10:04.897715 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:05.896804 master-2 kubenswrapper[4762]: I1014 13:10:05.896729 4762 patch_prober.go:28] interesting 
pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:10:05.896804 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:10:05.896804 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:10:05.896804 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:10:05.896804 master-2 kubenswrapper[4762]: I1014 13:10:05.896800 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:06.696077 master-2 kubenswrapper[4762]: I1014 13:10:06.696032 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-sg92v_970e8ee9-e505-4a07-9662-362652cf6b3b/kube-multus-additional-cni-plugins/0.log" Oct 14 13:10:06.696600 master-2 kubenswrapper[4762]: I1014 13:10:06.696125 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-sg92v" Oct 14 13:10:06.825123 master-2 kubenswrapper[4762]: I1014 13:10:06.824995 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-sg92v_970e8ee9-e505-4a07-9662-362652cf6b3b/kube-multus-additional-cni-plugins/0.log" Oct 14 13:10:06.825412 master-2 kubenswrapper[4762]: I1014 13:10:06.825386 4762 generic.go:334] "Generic (PLEG): container finished" podID="970e8ee9-e505-4a07-9662-362652cf6b3b" containerID="b371d25e7f0e819ea7586a72533618b13846c1d5a3fdddd549ef75da6a9613c2" exitCode=137 Oct 14 13:10:06.825514 master-2 kubenswrapper[4762]: I1014 13:10:06.825479 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-sg92v" Oct 14 13:10:06.826131 master-2 kubenswrapper[4762]: I1014 13:10:06.825494 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-sg92v" event={"ID":"970e8ee9-e505-4a07-9662-362652cf6b3b","Type":"ContainerDied","Data":"b371d25e7f0e819ea7586a72533618b13846c1d5a3fdddd549ef75da6a9613c2"} Oct 14 13:10:06.826276 master-2 kubenswrapper[4762]: I1014 13:10:06.826256 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-sg92v" event={"ID":"970e8ee9-e505-4a07-9662-362652cf6b3b","Type":"ContainerDied","Data":"b2752f3c8de711a76ebcd5756bfc5b53dacbe058eb3bbbdf9186c5ff14187ae3"} Oct 14 13:10:06.826391 master-2 kubenswrapper[4762]: I1014 13:10:06.826360 4762 scope.go:117] "RemoveContainer" containerID="b371d25e7f0e819ea7586a72533618b13846c1d5a3fdddd549ef75da6a9613c2" Oct 14 13:10:06.841715 master-2 kubenswrapper[4762]: I1014 13:10:06.841631 4762 scope.go:117] "RemoveContainer" containerID="b371d25e7f0e819ea7586a72533618b13846c1d5a3fdddd549ef75da6a9613c2" Oct 14 13:10:06.842342 master-2 kubenswrapper[4762]: E1014 13:10:06.842297 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b371d25e7f0e819ea7586a72533618b13846c1d5a3fdddd549ef75da6a9613c2\": container with ID starting with b371d25e7f0e819ea7586a72533618b13846c1d5a3fdddd549ef75da6a9613c2 not found: ID does not exist" containerID="b371d25e7f0e819ea7586a72533618b13846c1d5a3fdddd549ef75da6a9613c2" Oct 14 13:10:06.842503 master-2 kubenswrapper[4762]: I1014 13:10:06.842450 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b371d25e7f0e819ea7586a72533618b13846c1d5a3fdddd549ef75da6a9613c2"} err="failed to get container status \"b371d25e7f0e819ea7586a72533618b13846c1d5a3fdddd549ef75da6a9613c2\": rpc error: code = NotFound desc = could not find container \"b371d25e7f0e819ea7586a72533618b13846c1d5a3fdddd549ef75da6a9613c2\": container with ID starting with b371d25e7f0e819ea7586a72533618b13846c1d5a3fdddd549ef75da6a9613c2 not found: ID does not exist" Oct 14 13:10:06.863907 master-2 kubenswrapper[4762]: I1014 13:10:06.863855 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/970e8ee9-e505-4a07-9662-362652cf6b3b-ready\") pod \"970e8ee9-e505-4a07-9662-362652cf6b3b\" (UID: \"970e8ee9-e505-4a07-9662-362652cf6b3b\") " Oct 14 13:10:06.864268 master-2 kubenswrapper[4762]: I1014 13:10:06.864248 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/970e8ee9-e505-4a07-9662-362652cf6b3b-tuning-conf-dir\") pod \"970e8ee9-e505-4a07-9662-362652cf6b3b\" (UID: \"970e8ee9-e505-4a07-9662-362652cf6b3b\") " Oct 14 13:10:06.864466 master-2 kubenswrapper[4762]: I1014 13:10:06.864450 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/970e8ee9-e505-4a07-9662-362652cf6b3b-cni-sysctl-allowlist\") pod \"970e8ee9-e505-4a07-9662-362652cf6b3b\" (UID: \"970e8ee9-e505-4a07-9662-362652cf6b3b\") " Oct 14 13:10:06.864639 master-2 kubenswrapper[4762]: I1014 13:10:06.864386 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/970e8ee9-e505-4a07-9662-362652cf6b3b-tuning-conf-dir" (OuterVolumeSpecName: 
"tuning-conf-dir") pod "970e8ee9-e505-4a07-9662-362652cf6b3b" (UID: "970e8ee9-e505-4a07-9662-362652cf6b3b"). InnerVolumeSpecName "tuning-conf-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:10:06.864751 master-2 kubenswrapper[4762]: I1014 13:10:06.864425 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/970e8ee9-e505-4a07-9662-362652cf6b3b-ready" (OuterVolumeSpecName: "ready") pod "970e8ee9-e505-4a07-9662-362652cf6b3b" (UID: "970e8ee9-e505-4a07-9662-362652cf6b3b"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:10:06.864842 master-2 kubenswrapper[4762]: I1014 13:10:06.864740 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rf2n2\" (UniqueName: \"kubernetes.io/projected/970e8ee9-e505-4a07-9662-362652cf6b3b-kube-api-access-rf2n2\") pod \"970e8ee9-e505-4a07-9662-362652cf6b3b\" (UID: \"970e8ee9-e505-4a07-9662-362652cf6b3b\") " Oct 14 13:10:06.865083 master-2 kubenswrapper[4762]: I1014 13:10:06.865035 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/970e8ee9-e505-4a07-9662-362652cf6b3b-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "970e8ee9-e505-4a07-9662-362652cf6b3b" (UID: "970e8ee9-e505-4a07-9662-362652cf6b3b"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:10:06.865297 master-2 kubenswrapper[4762]: I1014 13:10:06.865278 4762 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/970e8ee9-e505-4a07-9662-362652cf6b3b-ready\") on node \"master-2\" DevicePath \"\"" Oct 14 13:10:06.865384 master-2 kubenswrapper[4762]: I1014 13:10:06.865373 4762 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/970e8ee9-e505-4a07-9662-362652cf6b3b-tuning-conf-dir\") on node \"master-2\" DevicePath \"\"" Oct 14 13:10:06.865446 master-2 kubenswrapper[4762]: I1014 13:10:06.865437 4762 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/970e8ee9-e505-4a07-9662-362652cf6b3b-cni-sysctl-allowlist\") on node \"master-2\" DevicePath \"\"" Oct 14 13:10:06.867923 master-2 kubenswrapper[4762]: I1014 13:10:06.867887 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/970e8ee9-e505-4a07-9662-362652cf6b3b-kube-api-access-rf2n2" (OuterVolumeSpecName: "kube-api-access-rf2n2") pod "970e8ee9-e505-4a07-9662-362652cf6b3b" (UID: "970e8ee9-e505-4a07-9662-362652cf6b3b"). InnerVolumeSpecName "kube-api-access-rf2n2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:10:06.897681 master-2 kubenswrapper[4762]: I1014 13:10:06.897615 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:10:06.897681 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:10:06.897681 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:10:06.897681 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:10:06.898247 master-2 kubenswrapper[4762]: I1014 13:10:06.898200 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:06.966141 master-2 kubenswrapper[4762]: I1014 13:10:06.966071 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rf2n2\" (UniqueName: \"kubernetes.io/projected/970e8ee9-e505-4a07-9662-362652cf6b3b-kube-api-access-rf2n2\") on node \"master-2\" DevicePath \"\"" Oct 14 13:10:07.198835 master-2 kubenswrapper[4762]: I1014 13:10:07.198731 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-sg92v"] Oct 14 13:10:07.201963 master-2 kubenswrapper[4762]: I1014 13:10:07.201866 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-sg92v"] Oct 14 13:10:07.558274 master-2 kubenswrapper[4762]: I1014 13:10:07.558195 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="970e8ee9-e505-4a07-9662-362652cf6b3b" path="/var/lib/kubelet/pods/970e8ee9-e505-4a07-9662-362652cf6b3b/volumes" Oct 14 13:10:07.898132 master-2 kubenswrapper[4762]: I1014 13:10:07.897909 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:10:07.898132 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:10:07.898132 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:10:07.898132 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:10:07.898132 master-2 kubenswrapper[4762]: I1014 13:10:07.898012 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:08.897266 master-2 kubenswrapper[4762]: I1014 13:10:08.897180 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:10:08.897266 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:10:08.897266 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:10:08.897266 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:10:08.897899 master-2 kubenswrapper[4762]: I1014 13:10:08.897287 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" 
podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:09.897424 master-2 kubenswrapper[4762]: I1014 13:10:09.897353 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:10:09.897424 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:10:09.897424 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:10:09.897424 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:10:09.898065 master-2 kubenswrapper[4762]: I1014 13:10:09.897450 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:10.105309 master-2 kubenswrapper[4762]: I1014 13:10:10.105259 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_machine-config-server-zxkbj_063758c3-98fe-4ac4-b1c2-beef7ef6dcdc/machine-config-server/0.log" Oct 14 13:10:10.898141 master-2 kubenswrapper[4762]: I1014 13:10:10.898066 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:10:10.898141 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:10:10.898141 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:10:10.898141 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:10:10.898979 master-2 kubenswrapper[4762]: I1014 13:10:10.898149 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:11.466489 master-2 kubenswrapper[4762]: I1014 13:10:11.466360 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-oauth-apiserver/apiserver-c57444595-mj7cx"] Oct 14 13:10:11.466728 master-2 kubenswrapper[4762]: I1014 13:10:11.466598 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-oauth-apiserver/apiserver-c57444595-mj7cx" podUID="2b69dba3-5ac1-4eb9-bba6-0d0662ab8544" containerName="oauth-apiserver" containerID="cri-o://03b91a878dfff317795b8e4902ee7208113c7c3d5dd1ae9fb18824d3a8021ed9" gracePeriod=120 Oct 14 13:10:11.897104 master-2 kubenswrapper[4762]: I1014 13:10:11.896970 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:10:11.897104 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:10:11.897104 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:10:11.897104 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:10:11.897104 master-2 kubenswrapper[4762]: I1014 13:10:11.897046 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:12.897260 master-2 kubenswrapper[4762]: I1014 13:10:12.897175 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:10:12.897260 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:10:12.897260 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:10:12.897260 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:10:12.897822 master-2 kubenswrapper[4762]: I1014 13:10:12.897263 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:13.897106 master-2 kubenswrapper[4762]: I1014 13:10:13.896968 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:10:13.897106 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:10:13.897106 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:10:13.897106 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:10:13.897106 master-2 kubenswrapper[4762]: I1014 13:10:13.897032 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:13.968770 master-2 kubenswrapper[4762]: I1014 13:10:13.968718 4762 patch_prober.go:28] interesting pod/apiserver-c57444595-mj7cx container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:10:13.968770 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:10:13.968770 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:10:13.968770 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:10:13.968770 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:10:13.968770 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:10:13.968770 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:10:13.968770 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:10:13.968770 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:10:13.968770 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartUserInformer ok Oct 14 13:10:13.968770 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartOAuthInformer ok Oct 14 13:10:13.968770 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Oct 14 13:10:13.968770 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:10:13.968770 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:10:13.969657 master-2 kubenswrapper[4762]: I1014 13:10:13.968781 4762 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-oauth-apiserver/apiserver-c57444595-mj7cx" podUID="2b69dba3-5ac1-4eb9-bba6-0d0662ab8544" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:14.897674 master-2 kubenswrapper[4762]: I1014 13:10:14.897563 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:10:14.897674 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:10:14.897674 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:10:14.897674 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:10:14.898529 master-2 kubenswrapper[4762]: I1014 13:10:14.897692 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:15.897567 master-2 kubenswrapper[4762]: I1014 13:10:15.897494 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:10:15.897567 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:10:15.897567 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:10:15.897567 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:10:15.898330 master-2 kubenswrapper[4762]: I1014 13:10:15.897580 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:16.897961 master-2 kubenswrapper[4762]: I1014 13:10:16.897893 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:10:16.897961 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:10:16.897961 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:10:16.897961 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:10:16.897961 master-2 kubenswrapper[4762]: I1014 13:10:16.897962 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:17.897821 master-2 kubenswrapper[4762]: I1014 13:10:17.897718 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:10:17.897821 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:10:17.897821 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:10:17.897821 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:10:17.897821 master-2 kubenswrapper[4762]: I1014 13:10:17.897784 4762 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:18.897744 master-2 kubenswrapper[4762]: I1014 13:10:18.897625 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:10:18.897744 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:10:18.897744 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:10:18.897744 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:10:18.897744 master-2 kubenswrapper[4762]: I1014 13:10:18.897705 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:18.969888 master-2 kubenswrapper[4762]: I1014 13:10:18.969780 4762 patch_prober.go:28] interesting pod/apiserver-c57444595-mj7cx container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:10:18.969888 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:10:18.969888 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:10:18.969888 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:10:18.969888 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:10:18.969888 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:10:18.969888 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:10:18.969888 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:10:18.969888 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:10:18.969888 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartUserInformer ok Oct 14 13:10:18.969888 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartOAuthInformer ok Oct 14 13:10:18.969888 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Oct 14 13:10:18.969888 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:10:18.969888 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:10:18.971548 master-2 kubenswrapper[4762]: I1014 13:10:18.969891 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-c57444595-mj7cx" podUID="2b69dba3-5ac1-4eb9-bba6-0d0662ab8544" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:19.897191 master-2 kubenswrapper[4762]: I1014 13:10:19.897110 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:10:19.897191 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:10:19.897191 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:10:19.897191 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:10:19.897668 master-2 
kubenswrapper[4762]: I1014 13:10:19.897202 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:20.897185 master-2 kubenswrapper[4762]: I1014 13:10:20.897074 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:10:20.897185 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:10:20.897185 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:10:20.897185 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:10:20.898056 master-2 kubenswrapper[4762]: I1014 13:10:20.897200 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:21.896880 master-2 kubenswrapper[4762]: I1014 13:10:21.896762 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:10:21.896880 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:10:21.896880 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:10:21.896880 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:10:21.896880 master-2 kubenswrapper[4762]: I1014 13:10:21.896862 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:22.897180 master-2 kubenswrapper[4762]: I1014 13:10:22.897045 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:10:22.897180 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:10:22.897180 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:10:22.897180 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:10:22.897931 master-2 kubenswrapper[4762]: I1014 13:10:22.897146 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:23.897520 master-2 kubenswrapper[4762]: I1014 13:10:23.897426 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:10:23.897520 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:10:23.897520 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:10:23.897520 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:10:23.898215 master-2 
kubenswrapper[4762]: I1014 13:10:23.897542 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:23.967108 master-2 kubenswrapper[4762]: I1014 13:10:23.967023 4762 patch_prober.go:28] interesting pod/apiserver-c57444595-mj7cx container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:10:23.967108 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:10:23.967108 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:10:23.967108 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:10:23.967108 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:10:23.967108 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:10:23.967108 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:10:23.967108 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:10:23.967108 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:10:23.967108 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartUserInformer ok Oct 14 13:10:23.967108 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartOAuthInformer ok Oct 14 13:10:23.967108 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Oct 14 13:10:23.967108 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:10:23.967108 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:10:23.967108 master-2 kubenswrapper[4762]: I1014 13:10:23.967097 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-c57444595-mj7cx" podUID="2b69dba3-5ac1-4eb9-bba6-0d0662ab8544" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:23.967805 master-2 kubenswrapper[4762]: I1014 13:10:23.967219 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-c57444595-mj7cx" Oct 14 13:10:24.900444 master-2 kubenswrapper[4762]: I1014 13:10:24.900343 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:10:24.900444 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:10:24.900444 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:10:24.900444 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:10:24.900444 master-2 kubenswrapper[4762]: I1014 13:10:24.900481 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:25.634926 master-2 kubenswrapper[4762]: I1014 13:10:25.634820 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-8475fbcb68-8dq9n" Oct 14 13:10:25.900063 master-2 kubenswrapper[4762]: I1014 13:10:25.899888 4762 patch_prober.go:28] interesting 
pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:10:25.900063 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:10:25.900063 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:10:25.900063 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:10:25.900063 master-2 kubenswrapper[4762]: I1014 13:10:25.899988 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:26.902882 master-2 kubenswrapper[4762]: I1014 13:10:26.901394 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:10:26.902882 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:10:26.902882 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:10:26.902882 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:10:26.902882 master-2 kubenswrapper[4762]: I1014 13:10:26.902192 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:27.703587 master-2 kubenswrapper[4762]: I1014 13:10:27.703497 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/installer-2-master-2"] Oct 14 13:10:27.703889 master-2 kubenswrapper[4762]: E1014 13:10:27.703733 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="970e8ee9-e505-4a07-9662-362652cf6b3b" containerName="kube-multus-additional-cni-plugins" Oct 14 13:10:27.703889 master-2 kubenswrapper[4762]: I1014 13:10:27.703756 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="970e8ee9-e505-4a07-9662-362652cf6b3b" containerName="kube-multus-additional-cni-plugins" Oct 14 13:10:27.704025 master-2 kubenswrapper[4762]: I1014 13:10:27.703913 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="970e8ee9-e505-4a07-9662-362652cf6b3b" containerName="kube-multus-additional-cni-plugins" Oct 14 13:10:27.704542 master-2 kubenswrapper[4762]: I1014 13:10:27.704501 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-2-master-2" Oct 14 13:10:27.707366 master-2 kubenswrapper[4762]: I1014 13:10:27.707308 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd"/"kube-root-ca.crt" Oct 14 13:10:27.710879 master-2 kubenswrapper[4762]: I1014 13:10:27.710793 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7fbbe25e-c819-4f38-a358-ee552afdaa22-kube-api-access\") pod \"installer-2-master-2\" (UID: \"7fbbe25e-c819-4f38-a358-ee552afdaa22\") " pod="openshift-etcd/installer-2-master-2" Oct 14 13:10:27.711057 master-2 kubenswrapper[4762]: I1014 13:10:27.710914 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7fbbe25e-c819-4f38-a358-ee552afdaa22-var-lock\") pod \"installer-2-master-2\" (UID: \"7fbbe25e-c819-4f38-a358-ee552afdaa22\") " pod="openshift-etcd/installer-2-master-2" Oct 14 13:10:27.711337 master-2 kubenswrapper[4762]: I1014 13:10:27.711290 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7fbbe25e-c819-4f38-a358-ee552afdaa22-kubelet-dir\") pod \"installer-2-master-2\" (UID: \"7fbbe25e-c819-4f38-a358-ee552afdaa22\") " pod="openshift-etcd/installer-2-master-2" Oct 14 13:10:27.712607 master-2 kubenswrapper[4762]: I1014 13:10:27.712554 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-2-master-2"] Oct 14 13:10:27.812837 master-2 kubenswrapper[4762]: I1014 13:10:27.812739 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7fbbe25e-c819-4f38-a358-ee552afdaa22-kube-api-access\") pod \"installer-2-master-2\" (UID: \"7fbbe25e-c819-4f38-a358-ee552afdaa22\") " pod="openshift-etcd/installer-2-master-2" Oct 14 13:10:27.812837 master-2 kubenswrapper[4762]: I1014 13:10:27.812837 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7fbbe25e-c819-4f38-a358-ee552afdaa22-var-lock\") pod \"installer-2-master-2\" (UID: \"7fbbe25e-c819-4f38-a358-ee552afdaa22\") " pod="openshift-etcd/installer-2-master-2" Oct 14 13:10:27.813240 master-2 kubenswrapper[4762]: I1014 13:10:27.812898 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7fbbe25e-c819-4f38-a358-ee552afdaa22-var-lock\") pod \"installer-2-master-2\" (UID: \"7fbbe25e-c819-4f38-a358-ee552afdaa22\") " pod="openshift-etcd/installer-2-master-2" Oct 14 13:10:27.813240 master-2 kubenswrapper[4762]: I1014 13:10:27.812983 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7fbbe25e-c819-4f38-a358-ee552afdaa22-kubelet-dir\") pod \"installer-2-master-2\" (UID: \"7fbbe25e-c819-4f38-a358-ee552afdaa22\") " pod="openshift-etcd/installer-2-master-2" Oct 14 13:10:27.813240 master-2 kubenswrapper[4762]: I1014 13:10:27.813096 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7fbbe25e-c819-4f38-a358-ee552afdaa22-kubelet-dir\") pod \"installer-2-master-2\" (UID: \"7fbbe25e-c819-4f38-a358-ee552afdaa22\") " pod="openshift-etcd/installer-2-master-2" Oct 14 13:10:27.834254 
master-2 kubenswrapper[4762]: I1014 13:10:27.834176 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7fbbe25e-c819-4f38-a358-ee552afdaa22-kube-api-access\") pod \"installer-2-master-2\" (UID: \"7fbbe25e-c819-4f38-a358-ee552afdaa22\") " pod="openshift-etcd/installer-2-master-2" Oct 14 13:10:27.897877 master-2 kubenswrapper[4762]: I1014 13:10:27.897793 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:10:27.897877 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:10:27.897877 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:10:27.897877 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:10:27.897877 master-2 kubenswrapper[4762]: I1014 13:10:27.897861 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:28.031479 master-2 kubenswrapper[4762]: I1014 13:10:28.031288 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-2-master-2" Oct 14 13:10:28.504228 master-2 kubenswrapper[4762]: I1014 13:10:28.504136 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-2-master-2"] Oct 14 13:10:28.516449 master-2 kubenswrapper[4762]: W1014 13:10:28.512468 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod7fbbe25e_c819_4f38_a358_ee552afdaa22.slice/crio-b4e0833932ee22e0c27ed1cf773f45a44ea75b33c8c5b04b544cdb27fcf011ef WatchSource:0}: Error finding container b4e0833932ee22e0c27ed1cf773f45a44ea75b33c8c5b04b544cdb27fcf011ef: Status 404 returned error can't find the container with id b4e0833932ee22e0c27ed1cf773f45a44ea75b33c8c5b04b544cdb27fcf011ef Oct 14 13:10:28.897773 master-2 kubenswrapper[4762]: I1014 13:10:28.897573 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:10:28.897773 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:10:28.897773 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:10:28.897773 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:10:28.897773 master-2 kubenswrapper[4762]: I1014 13:10:28.897645 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:28.934373 master-2 kubenswrapper[4762]: I1014 13:10:28.934290 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-2" event={"ID":"7fbbe25e-c819-4f38-a358-ee552afdaa22","Type":"ContainerStarted","Data":"b4e0833932ee22e0c27ed1cf773f45a44ea75b33c8c5b04b544cdb27fcf011ef"} Oct 14 13:10:28.966904 master-2 kubenswrapper[4762]: I1014 13:10:28.966785 4762 patch_prober.go:28] interesting pod/apiserver-c57444595-mj7cx container/oauth-apiserver namespace/openshift-oauth-apiserver: 
Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:10:28.966904 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:10:28.966904 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:10:28.966904 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:10:28.966904 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:10:28.966904 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:10:28.966904 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:10:28.966904 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:10:28.966904 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:10:28.966904 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartUserInformer ok Oct 14 13:10:28.966904 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartOAuthInformer ok Oct 14 13:10:28.966904 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Oct 14 13:10:28.966904 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:10:28.966904 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:10:28.966904 master-2 kubenswrapper[4762]: I1014 13:10:28.966863 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-c57444595-mj7cx" podUID="2b69dba3-5ac1-4eb9-bba6-0d0662ab8544" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:29.898599 master-2 kubenswrapper[4762]: I1014 13:10:29.898513 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:10:29.898599 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:10:29.898599 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:10:29.898599 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:10:29.899325 master-2 kubenswrapper[4762]: I1014 13:10:29.898632 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:30.897212 master-2 kubenswrapper[4762]: I1014 13:10:30.897111 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:10:30.897212 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:10:30.897212 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:10:30.897212 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:10:30.897635 master-2 kubenswrapper[4762]: I1014 13:10:30.897218 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:31.897539 master-2 kubenswrapper[4762]: I1014 13:10:31.897481 4762 patch_prober.go:28] interesting 
pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:10:31.897539 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:10:31.897539 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:10:31.897539 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:10:31.898573 master-2 kubenswrapper[4762]: I1014 13:10:31.898390 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:31.955259 master-2 kubenswrapper[4762]: I1014 13:10:31.955129 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-2" event={"ID":"7fbbe25e-c819-4f38-a358-ee552afdaa22","Type":"ContainerStarted","Data":"8ee4adc42b688bdabab19fe8b7be0260afa38cac95104e3f8f377739b6e2f52d"} Oct 14 13:10:31.981270 master-2 kubenswrapper[4762]: I1014 13:10:31.981097 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/installer-2-master-2" podStartSLOduration=2.5418814259999998 podStartE2EDuration="4.981071801s" podCreationTimestamp="2025-10-14 13:10:27 +0000 UTC" firstStartedPulling="2025-10-14 13:10:28.51506315 +0000 UTC m=+257.759222309" lastFinishedPulling="2025-10-14 13:10:30.954253495 +0000 UTC m=+260.198412684" observedRunningTime="2025-10-14 13:10:31.977798567 +0000 UTC m=+261.221957776" watchObservedRunningTime="2025-10-14 13:10:31.981071801 +0000 UTC m=+261.225230990" Oct 14 13:10:32.898275 master-2 kubenswrapper[4762]: I1014 13:10:32.898151 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:10:32.898275 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:10:32.898275 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:10:32.898275 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:10:32.898275 master-2 kubenswrapper[4762]: I1014 13:10:32.898261 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:33.898025 master-2 kubenswrapper[4762]: I1014 13:10:33.897932 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:10:33.898025 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:10:33.898025 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:10:33.898025 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:10:33.898025 master-2 kubenswrapper[4762]: I1014 13:10:33.898016 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:33.969611 master-2 
kubenswrapper[4762]: I1014 13:10:33.969521 4762 patch_prober.go:28] interesting pod/apiserver-c57444595-mj7cx container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:10:33.969611 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:10:33.969611 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:10:33.969611 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:10:33.969611 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:10:33.969611 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:10:33.969611 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:10:33.969611 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:10:33.969611 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:10:33.969611 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartUserInformer ok Oct 14 13:10:33.969611 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartOAuthInformer ok Oct 14 13:10:33.969611 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Oct 14 13:10:33.969611 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:10:33.969611 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:10:33.970662 master-2 kubenswrapper[4762]: I1014 13:10:33.969645 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-c57444595-mj7cx" podUID="2b69dba3-5ac1-4eb9-bba6-0d0662ab8544" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:34.897799 master-2 kubenswrapper[4762]: I1014 13:10:34.897668 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:10:34.897799 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:10:34.897799 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:10:34.897799 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:10:34.898980 master-2 kubenswrapper[4762]: I1014 13:10:34.897816 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:35.898251 master-2 kubenswrapper[4762]: I1014 13:10:35.898100 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:10:35.898251 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:10:35.898251 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:10:35.898251 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:10:35.898251 master-2 kubenswrapper[4762]: I1014 13:10:35.898200 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP 
probe failed with statuscode: 500" Oct 14 13:10:36.102167 master-2 kubenswrapper[4762]: I1014 13:10:36.102084 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-etcd/installer-2-master-2"] Oct 14 13:10:36.102915 master-2 kubenswrapper[4762]: I1014 13:10:36.102881 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/installer-2-master-2" podUID="7fbbe25e-c819-4f38-a358-ee552afdaa22" containerName="installer" containerID="cri-o://8ee4adc42b688bdabab19fe8b7be0260afa38cac95104e3f8f377739b6e2f52d" gracePeriod=30 Oct 14 13:10:36.229847 master-2 kubenswrapper[4762]: E1014 13:10:36.229698 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-controller-manager/controller-manager-55bcd8787f-4krnt" podUID="e3b0f97c-92e0-43c7-a72a-c003f0451347" Oct 14 13:10:36.897999 master-2 kubenswrapper[4762]: I1014 13:10:36.897929 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:10:36.897999 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:10:36.897999 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:10:36.897999 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:10:36.897999 master-2 kubenswrapper[4762]: I1014 13:10:36.897994 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:36.985960 master-2 kubenswrapper[4762]: I1014 13:10:36.985868 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-55bcd8787f-4krnt" Oct 14 13:10:37.898436 master-2 kubenswrapper[4762]: I1014 13:10:37.898310 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:10:37.898436 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:10:37.898436 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:10:37.898436 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:10:37.898898 master-2 kubenswrapper[4762]: I1014 13:10:37.898484 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:38.897864 master-2 kubenswrapper[4762]: I1014 13:10:38.897789 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:10:38.897864 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:10:38.897864 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:10:38.897864 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:10:38.898587 master-2 kubenswrapper[4762]: I1014 13:10:38.897879 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:38.968047 master-2 kubenswrapper[4762]: I1014 13:10:38.967918 4762 patch_prober.go:28] interesting pod/apiserver-c57444595-mj7cx container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:10:38.968047 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:10:38.968047 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:10:38.968047 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:10:38.968047 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:10:38.968047 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:10:38.968047 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:10:38.968047 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:10:38.968047 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:10:38.968047 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartUserInformer ok Oct 14 13:10:38.968047 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartOAuthInformer ok Oct 14 13:10:38.968047 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Oct 14 13:10:38.968047 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:10:38.968047 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:10:38.969036 master-2 kubenswrapper[4762]: I1014 13:10:38.968089 4762 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-oauth-apiserver/apiserver-c57444595-mj7cx" podUID="2b69dba3-5ac1-4eb9-bba6-0d0662ab8544" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:39.398133 master-2 kubenswrapper[4762]: I1014 13:10:39.398067 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/installer-3-master-2"] Oct 14 13:10:39.398641 master-2 kubenswrapper[4762]: I1014 13:10:39.398597 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-3-master-2" Oct 14 13:10:39.451375 master-2 kubenswrapper[4762]: I1014 13:10:39.451307 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-3-master-2"] Oct 14 13:10:39.570102 master-2 kubenswrapper[4762]: I1014 13:10:39.569977 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ff4898af-987c-42c5-8728-033c5ede3e0f-kubelet-dir\") pod \"installer-3-master-2\" (UID: \"ff4898af-987c-42c5-8728-033c5ede3e0f\") " pod="openshift-etcd/installer-3-master-2" Oct 14 13:10:39.570446 master-2 kubenswrapper[4762]: I1014 13:10:39.570125 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ff4898af-987c-42c5-8728-033c5ede3e0f-kube-api-access\") pod \"installer-3-master-2\" (UID: \"ff4898af-987c-42c5-8728-033c5ede3e0f\") " pod="openshift-etcd/installer-3-master-2" Oct 14 13:10:39.570446 master-2 kubenswrapper[4762]: I1014 13:10:39.570270 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ff4898af-987c-42c5-8728-033c5ede3e0f-var-lock\") pod \"installer-3-master-2\" (UID: \"ff4898af-987c-42c5-8728-033c5ede3e0f\") " pod="openshift-etcd/installer-3-master-2" Oct 14 13:10:39.671987 master-2 kubenswrapper[4762]: I1014 13:10:39.671913 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ff4898af-987c-42c5-8728-033c5ede3e0f-kubelet-dir\") pod \"installer-3-master-2\" (UID: \"ff4898af-987c-42c5-8728-033c5ede3e0f\") " pod="openshift-etcd/installer-3-master-2" Oct 14 13:10:39.671987 master-2 kubenswrapper[4762]: I1014 13:10:39.671965 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ff4898af-987c-42c5-8728-033c5ede3e0f-kube-api-access\") pod \"installer-3-master-2\" (UID: \"ff4898af-987c-42c5-8728-033c5ede3e0f\") " pod="openshift-etcd/installer-3-master-2" Oct 14 13:10:39.672345 master-2 kubenswrapper[4762]: I1014 13:10:39.672005 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ff4898af-987c-42c5-8728-033c5ede3e0f-var-lock\") pod \"installer-3-master-2\" (UID: \"ff4898af-987c-42c5-8728-033c5ede3e0f\") " pod="openshift-etcd/installer-3-master-2" Oct 14 13:10:39.672345 master-2 kubenswrapper[4762]: I1014 13:10:39.672075 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ff4898af-987c-42c5-8728-033c5ede3e0f-var-lock\") pod \"installer-3-master-2\" (UID: \"ff4898af-987c-42c5-8728-033c5ede3e0f\") " pod="openshift-etcd/installer-3-master-2" Oct 14 13:10:39.672345 master-2 kubenswrapper[4762]: I1014 13:10:39.672067 
4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ff4898af-987c-42c5-8728-033c5ede3e0f-kubelet-dir\") pod \"installer-3-master-2\" (UID: \"ff4898af-987c-42c5-8728-033c5ede3e0f\") " pod="openshift-etcd/installer-3-master-2" Oct 14 13:10:39.692769 master-2 kubenswrapper[4762]: I1014 13:10:39.692668 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ff4898af-987c-42c5-8728-033c5ede3e0f-kube-api-access\") pod \"installer-3-master-2\" (UID: \"ff4898af-987c-42c5-8728-033c5ede3e0f\") " pod="openshift-etcd/installer-3-master-2" Oct 14 13:10:39.721509 master-2 kubenswrapper[4762]: I1014 13:10:39.721416 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-3-master-2" Oct 14 13:10:39.898479 master-2 kubenswrapper[4762]: I1014 13:10:39.898394 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:10:39.898479 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:10:39.898479 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:10:39.898479 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:10:39.899385 master-2 kubenswrapper[4762]: I1014 13:10:39.898496 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:39.962649 master-2 kubenswrapper[4762]: I1014 13:10:39.962476 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-3-master-2"] Oct 14 13:10:40.005997 master-2 kubenswrapper[4762]: I1014 13:10:40.005894 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-3-master-2" event={"ID":"ff4898af-987c-42c5-8728-033c5ede3e0f","Type":"ContainerStarted","Data":"2f6b446fab6e1dd8dac5d81ac83c005e16bc5d51cbc00adf1b9666244b0520dc"} Oct 14 13:10:40.113194 master-2 kubenswrapper[4762]: I1014 13:10:40.113092 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_machine-config-server-zxkbj_063758c3-98fe-4ac4-b1c2-beef7ef6dcdc/machine-config-server/0.log" Oct 14 13:10:40.897913 master-2 kubenswrapper[4762]: I1014 13:10:40.897795 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:10:40.897913 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:10:40.897913 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:10:40.897913 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:10:40.898509 master-2 kubenswrapper[4762]: I1014 13:10:40.897925 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:41.015906 master-2 kubenswrapper[4762]: I1014 13:10:41.015802 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd/installer-3-master-2" event={"ID":"ff4898af-987c-42c5-8728-033c5ede3e0f","Type":"ContainerStarted","Data":"1a5565e2e0f073ac96e5a71342aa7bc9343588f6835cc15141df600320b35e4b"} Oct 14 13:10:41.048900 master-2 kubenswrapper[4762]: I1014 13:10:41.048777 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/installer-3-master-2" podStartSLOduration=2.048745165 podStartE2EDuration="2.048745165s" podCreationTimestamp="2025-10-14 13:10:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:10:41.043681291 +0000 UTC m=+270.287840460" watchObservedRunningTime="2025-10-14 13:10:41.048745165 +0000 UTC m=+270.292904364" Oct 14 13:10:41.293637 master-2 kubenswrapper[4762]: I1014 13:10:41.293526 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e3b0f97c-92e0-43c7-a72a-c003f0451347-client-ca\") pod \"controller-manager-55bcd8787f-4krnt\" (UID: \"e3b0f97c-92e0-43c7-a72a-c003f0451347\") " pod="openshift-controller-manager/controller-manager-55bcd8787f-4krnt" Oct 14 13:10:41.293953 master-2 kubenswrapper[4762]: E1014 13:10:41.293750 4762 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Oct 14 13:10:41.293953 master-2 kubenswrapper[4762]: E1014 13:10:41.293943 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e3b0f97c-92e0-43c7-a72a-c003f0451347-client-ca podName:e3b0f97c-92e0-43c7-a72a-c003f0451347 nodeName:}" failed. No retries permitted until 2025-10-14 13:12:43.293900041 +0000 UTC m=+392.538059240 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/e3b0f97c-92e0-43c7-a72a-c003f0451347-client-ca") pod "controller-manager-55bcd8787f-4krnt" (UID: "e3b0f97c-92e0-43c7-a72a-c003f0451347") : configmap "client-ca" not found Oct 14 13:10:41.898508 master-2 kubenswrapper[4762]: I1014 13:10:41.898412 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:10:41.898508 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:10:41.898508 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:10:41.898508 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:10:41.899773 master-2 kubenswrapper[4762]: I1014 13:10:41.898523 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:42.899491 master-2 kubenswrapper[4762]: I1014 13:10:42.899385 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:10:42.899491 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:10:42.899491 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:10:42.899491 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:10:42.899491 master-2 kubenswrapper[4762]: I1014 
13:10:42.899484 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:43.899210 master-2 kubenswrapper[4762]: I1014 13:10:43.898597 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:10:43.899210 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:10:43.899210 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:10:43.899210 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:10:43.899210 master-2 kubenswrapper[4762]: I1014 13:10:43.898697 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:43.969035 master-2 kubenswrapper[4762]: I1014 13:10:43.968334 4762 patch_prober.go:28] interesting pod/apiserver-c57444595-mj7cx container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:10:43.969035 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:10:43.969035 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:10:43.969035 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:10:43.969035 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:10:43.969035 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:10:43.969035 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:10:43.969035 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:10:43.969035 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:10:43.969035 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartUserInformer ok Oct 14 13:10:43.969035 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartOAuthInformer ok Oct 14 13:10:43.969035 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Oct 14 13:10:43.969035 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:10:43.969035 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:10:43.969035 master-2 kubenswrapper[4762]: I1014 13:10:43.968422 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-c57444595-mj7cx" podUID="2b69dba3-5ac1-4eb9-bba6-0d0662ab8544" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:44.898051 master-2 kubenswrapper[4762]: I1014 13:10:44.897961 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:10:44.898051 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:10:44.898051 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:10:44.898051 master-2 kubenswrapper[4762]: healthz check 
failed Oct 14 13:10:44.898629 master-2 kubenswrapper[4762]: I1014 13:10:44.898065 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:45.425805 master-2 kubenswrapper[4762]: I1014 13:10:45.425725 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-apiserver/apiserver-6576f6bc9d-r2fhv"] Oct 14 13:10:45.426532 master-2 kubenswrapper[4762]: I1014 13:10:45.426189 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" podUID="3964407d-3235-4331-bee0-0188f908f6c8" containerName="openshift-apiserver" containerID="cri-o://21f61bbd0a679861d2b7a35cb7734379d280969386c988ae04b5b4ff4b64d191" gracePeriod=120 Oct 14 13:10:45.427337 master-2 kubenswrapper[4762]: I1014 13:10:45.426342 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" podUID="3964407d-3235-4331-bee0-0188f908f6c8" containerName="openshift-apiserver-check-endpoints" containerID="cri-o://a08f9650be2e2e77d06d19aff6edcc8568dc365457f3253809f222a206d4e2e8" gracePeriod=120 Oct 14 13:10:45.896893 master-2 kubenswrapper[4762]: I1014 13:10:45.896744 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:10:45.896893 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:10:45.896893 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:10:45.896893 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:10:45.897594 master-2 kubenswrapper[4762]: I1014 13:10:45.897479 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:46.047452 master-2 kubenswrapper[4762]: I1014 13:10:46.047328 4762 generic.go:334] "Generic (PLEG): container finished" podID="3964407d-3235-4331-bee0-0188f908f6c8" containerID="a08f9650be2e2e77d06d19aff6edcc8568dc365457f3253809f222a206d4e2e8" exitCode=0 Oct 14 13:10:46.047452 master-2 kubenswrapper[4762]: I1014 13:10:46.047405 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" event={"ID":"3964407d-3235-4331-bee0-0188f908f6c8","Type":"ContainerDied","Data":"a08f9650be2e2e77d06d19aff6edcc8568dc365457f3253809f222a206d4e2e8"} Oct 14 13:10:46.898610 master-2 kubenswrapper[4762]: I1014 13:10:46.898510 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:10:46.898610 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:10:46.898610 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:10:46.898610 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:10:46.899785 master-2 kubenswrapper[4762]: I1014 13:10:46.898626 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" 
podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:47.516299 master-2 kubenswrapper[4762]: I1014 13:10:47.516184 4762 patch_prober.go:28] interesting pod/apiserver-6576f6bc9d-r2fhv container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:10:47.516299 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:10:47.516299 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:10:47.516299 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:10:47.516299 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:10:47.516299 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:10:47.516299 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:10:47.516299 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:10:47.516299 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:10:47.516299 master-2 kubenswrapper[4762]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 14 13:10:47.516299 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok Oct 14 13:10:47.516299 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Oct 14 13:10:47.516299 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectcache ok Oct 14 13:10:47.516299 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 14 13:10:47.516299 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startinformers ok Oct 14 13:10:47.516299 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 14 13:10:47.516299 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:10:47.516299 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:10:47.516299 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:10:47.516299 master-2 kubenswrapper[4762]: I1014 13:10:47.516282 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" podUID="3964407d-3235-4331-bee0-0188f908f6c8" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:47.898071 master-2 kubenswrapper[4762]: I1014 13:10:47.897911 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:10:47.898071 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:10:47.898071 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:10:47.898071 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:10:47.898071 master-2 kubenswrapper[4762]: I1014 13:10:47.898010 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:48.244737 master-2 kubenswrapper[4762]: I1014 13:10:48.244641 4762 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-2c8tn"] Oct 14 13:10:48.245594 master-2 kubenswrapper[4762]: I1014 13:10:48.245521 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2c8tn" Oct 14 13:10:48.268278 master-2 kubenswrapper[4762]: I1014 13:10:48.268230 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Oct 14 13:10:48.269178 master-2 kubenswrapper[4762]: I1014 13:10:48.268696 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Oct 14 13:10:48.269178 master-2 kubenswrapper[4762]: I1014 13:10:48.268788 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Oct 14 13:10:48.272639 master-2 kubenswrapper[4762]: I1014 13:10:48.272583 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2c8tn"] Oct 14 13:10:48.283101 master-2 kubenswrapper[4762]: I1014 13:10:48.282758 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cv88n\" (UniqueName: \"kubernetes.io/projected/bacf3bb6-fd05-4a71-943c-522e7e8ce76e-kube-api-access-cv88n\") pod \"ingress-canary-2c8tn\" (UID: \"bacf3bb6-fd05-4a71-943c-522e7e8ce76e\") " pod="openshift-ingress-canary/ingress-canary-2c8tn" Oct 14 13:10:48.283101 master-2 kubenswrapper[4762]: I1014 13:10:48.282795 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bacf3bb6-fd05-4a71-943c-522e7e8ce76e-cert\") pod \"ingress-canary-2c8tn\" (UID: \"bacf3bb6-fd05-4a71-943c-522e7e8ce76e\") " pod="openshift-ingress-canary/ingress-canary-2c8tn" Oct 14 13:10:48.384556 master-2 kubenswrapper[4762]: I1014 13:10:48.384455 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cv88n\" (UniqueName: \"kubernetes.io/projected/bacf3bb6-fd05-4a71-943c-522e7e8ce76e-kube-api-access-cv88n\") pod \"ingress-canary-2c8tn\" (UID: \"bacf3bb6-fd05-4a71-943c-522e7e8ce76e\") " pod="openshift-ingress-canary/ingress-canary-2c8tn" Oct 14 13:10:48.384556 master-2 kubenswrapper[4762]: I1014 13:10:48.384550 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bacf3bb6-fd05-4a71-943c-522e7e8ce76e-cert\") pod \"ingress-canary-2c8tn\" (UID: \"bacf3bb6-fd05-4a71-943c-522e7e8ce76e\") " pod="openshift-ingress-canary/ingress-canary-2c8tn" Oct 14 13:10:48.384937 master-2 kubenswrapper[4762]: E1014 13:10:48.384801 4762 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Oct 14 13:10:48.384937 master-2 kubenswrapper[4762]: E1014 13:10:48.384900 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bacf3bb6-fd05-4a71-943c-522e7e8ce76e-cert podName:bacf3bb6-fd05-4a71-943c-522e7e8ce76e nodeName:}" failed. No retries permitted until 2025-10-14 13:10:48.884864639 +0000 UTC m=+278.129023838 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bacf3bb6-fd05-4a71-943c-522e7e8ce76e-cert") pod "ingress-canary-2c8tn" (UID: "bacf3bb6-fd05-4a71-943c-522e7e8ce76e") : secret "canary-serving-cert" not found Oct 14 13:10:48.406602 master-2 kubenswrapper[4762]: I1014 13:10:48.406535 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cv88n\" (UniqueName: \"kubernetes.io/projected/bacf3bb6-fd05-4a71-943c-522e7e8ce76e-kube-api-access-cv88n\") pod \"ingress-canary-2c8tn\" (UID: \"bacf3bb6-fd05-4a71-943c-522e7e8ce76e\") " pod="openshift-ingress-canary/ingress-canary-2c8tn" Oct 14 13:10:48.891692 master-2 kubenswrapper[4762]: I1014 13:10:48.891598 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bacf3bb6-fd05-4a71-943c-522e7e8ce76e-cert\") pod \"ingress-canary-2c8tn\" (UID: \"bacf3bb6-fd05-4a71-943c-522e7e8ce76e\") " pod="openshift-ingress-canary/ingress-canary-2c8tn" Oct 14 13:10:48.896769 master-2 kubenswrapper[4762]: I1014 13:10:48.896656 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bacf3bb6-fd05-4a71-943c-522e7e8ce76e-cert\") pod \"ingress-canary-2c8tn\" (UID: \"bacf3bb6-fd05-4a71-943c-522e7e8ce76e\") " pod="openshift-ingress-canary/ingress-canary-2c8tn" Oct 14 13:10:48.897649 master-2 kubenswrapper[4762]: I1014 13:10:48.897587 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:10:48.897649 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:10:48.897649 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:10:48.897649 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:10:48.897913 master-2 kubenswrapper[4762]: I1014 13:10:48.897656 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:48.970997 master-2 kubenswrapper[4762]: I1014 13:10:48.970872 4762 patch_prober.go:28] interesting pod/apiserver-c57444595-mj7cx container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:10:48.970997 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:10:48.970997 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:10:48.970997 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:10:48.970997 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:10:48.970997 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:10:48.970997 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:10:48.970997 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:10:48.970997 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:10:48.970997 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartUserInformer ok Oct 14 13:10:48.970997 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartOAuthInformer ok Oct 14 13:10:48.970997 master-2 
kubenswrapper[4762]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Oct 14 13:10:48.970997 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:10:48.970997 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:10:48.970997 master-2 kubenswrapper[4762]: I1014 13:10:48.970983 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-c57444595-mj7cx" podUID="2b69dba3-5ac1-4eb9-bba6-0d0662ab8544" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:49.189729 master-2 kubenswrapper[4762]: I1014 13:10:49.189649 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2c8tn" Oct 14 13:10:49.667224 master-2 kubenswrapper[4762]: I1014 13:10:49.667174 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2c8tn"] Oct 14 13:10:49.676241 master-2 kubenswrapper[4762]: W1014 13:10:49.676176 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbacf3bb6_fd05_4a71_943c_522e7e8ce76e.slice/crio-c02af2ae256e74620250875ce876276740999824db5fc229a0fea2477f653b0f WatchSource:0}: Error finding container c02af2ae256e74620250875ce876276740999824db5fc229a0fea2477f653b0f: Status 404 returned error can't find the container with id c02af2ae256e74620250875ce876276740999824db5fc229a0fea2477f653b0f Oct 14 13:10:49.898228 master-2 kubenswrapper[4762]: I1014 13:10:49.898069 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:10:49.898228 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:10:49.898228 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:10:49.898228 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:10:49.898228 master-2 kubenswrapper[4762]: I1014 13:10:49.898177 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:50.068867 master-2 kubenswrapper[4762]: I1014 13:10:50.068653 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2c8tn" event={"ID":"bacf3bb6-fd05-4a71-943c-522e7e8ce76e","Type":"ContainerStarted","Data":"c02af2ae256e74620250875ce876276740999824db5fc229a0fea2477f653b0f"} Oct 14 13:10:50.898310 master-2 kubenswrapper[4762]: I1014 13:10:50.898138 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:10:50.898310 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:10:50.898310 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:10:50.898310 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:10:50.898933 master-2 kubenswrapper[4762]: I1014 13:10:50.898344 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:51.898642 master-2 kubenswrapper[4762]: I1014 13:10:51.898492 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:10:51.898642 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:10:51.898642 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:10:51.898642 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:10:51.898642 master-2 kubenswrapper[4762]: I1014 13:10:51.898638 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:52.084078 master-2 kubenswrapper[4762]: I1014 13:10:52.083864 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2c8tn" event={"ID":"bacf3bb6-fd05-4a71-943c-522e7e8ce76e","Type":"ContainerStarted","Data":"43424ee3f6cd010bfccee69a287aee1d514bfe8337adf449999aa222d0e59eb3"} Oct 14 13:10:52.108978 master-2 kubenswrapper[4762]: I1014 13:10:52.108850 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-2c8tn" podStartSLOduration=2.084030622 podStartE2EDuration="4.108821331s" podCreationTimestamp="2025-10-14 13:10:48 +0000 UTC" firstStartedPulling="2025-10-14 13:10:49.679885533 +0000 UTC m=+278.924044692" lastFinishedPulling="2025-10-14 13:10:51.704676232 +0000 UTC m=+280.948835401" observedRunningTime="2025-10-14 13:10:52.108680067 +0000 UTC m=+281.352839236" watchObservedRunningTime="2025-10-14 13:10:52.108821331 +0000 UTC m=+281.352980540" Oct 14 13:10:52.517501 master-2 kubenswrapper[4762]: I1014 13:10:52.517408 4762 patch_prober.go:28] interesting pod/apiserver-6576f6bc9d-r2fhv container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:10:52.517501 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:10:52.517501 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:10:52.517501 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:10:52.517501 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:10:52.517501 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:10:52.517501 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:10:52.517501 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:10:52.517501 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:10:52.517501 master-2 kubenswrapper[4762]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 14 13:10:52.517501 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok Oct 14 13:10:52.517501 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Oct 14 13:10:52.517501 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectcache ok Oct 14 13:10:52.517501 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectauthorizationcache 
ok Oct 14 13:10:52.517501 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startinformers ok Oct 14 13:10:52.517501 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 14 13:10:52.517501 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:10:52.517501 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:10:52.517501 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:10:52.518434 master-2 kubenswrapper[4762]: I1014 13:10:52.517521 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" podUID="3964407d-3235-4331-bee0-0188f908f6c8" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:52.897779 master-2 kubenswrapper[4762]: I1014 13:10:52.897562 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:10:52.897779 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:10:52.897779 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:10:52.897779 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:10:52.897779 master-2 kubenswrapper[4762]: I1014 13:10:52.897668 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:53.897888 master-2 kubenswrapper[4762]: I1014 13:10:53.897765 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:10:53.897888 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:10:53.897888 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:10:53.897888 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:10:53.897888 master-2 kubenswrapper[4762]: I1014 13:10:53.897864 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:53.970040 master-2 kubenswrapper[4762]: I1014 13:10:53.969938 4762 patch_prober.go:28] interesting pod/apiserver-c57444595-mj7cx container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:10:53.970040 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:10:53.970040 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:10:53.970040 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:10:53.970040 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:10:53.970040 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:10:53.970040 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:10:53.970040 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:10:53.970040 master-2 
kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:10:53.970040 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartUserInformer ok Oct 14 13:10:53.970040 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartOAuthInformer ok Oct 14 13:10:53.970040 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Oct 14 13:10:53.970040 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:10:53.970040 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:10:53.970040 master-2 kubenswrapper[4762]: I1014 13:10:53.970013 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-c57444595-mj7cx" podUID="2b69dba3-5ac1-4eb9-bba6-0d0662ab8544" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:54.897043 master-2 kubenswrapper[4762]: I1014 13:10:54.896935 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:10:54.897043 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:10:54.897043 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:10:54.897043 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:10:54.897043 master-2 kubenswrapper[4762]: I1014 13:10:54.897035 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:55.351307 master-2 kubenswrapper[4762]: E1014 13:10:55.351133 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-route-controller-manager/route-controller-manager-6f6c689d49-xd4xv" podUID="db8f34cd-ecf2-4682-b1fd-b8e335369cb9" Oct 14 13:10:55.897147 master-2 kubenswrapper[4762]: I1014 13:10:55.897093 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:10:55.897147 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:10:55.897147 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:10:55.897147 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:10:55.897499 master-2 kubenswrapper[4762]: I1014 13:10:55.897191 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:56.106827 master-2 kubenswrapper[4762]: I1014 13:10:56.106751 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f6c689d49-xd4xv" Oct 14 13:10:56.897489 master-2 kubenswrapper[4762]: I1014 13:10:56.897361 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:10:56.897489 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:10:56.897489 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:10:56.897489 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:10:56.897489 master-2 kubenswrapper[4762]: I1014 13:10:56.897475 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:57.513825 master-2 kubenswrapper[4762]: I1014 13:10:57.513747 4762 patch_prober.go:28] interesting pod/apiserver-6576f6bc9d-r2fhv container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:10:57.513825 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:10:57.513825 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:10:57.513825 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:10:57.513825 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:10:57.513825 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:10:57.513825 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:10:57.513825 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:10:57.513825 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:10:57.513825 master-2 kubenswrapper[4762]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 14 13:10:57.513825 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok Oct 14 13:10:57.513825 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Oct 14 13:10:57.513825 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectcache ok Oct 14 13:10:57.513825 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 14 13:10:57.513825 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startinformers ok Oct 14 13:10:57.513825 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 14 13:10:57.513825 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:10:57.513825 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:10:57.513825 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:10:57.514679 master-2 kubenswrapper[4762]: I1014 13:10:57.513830 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" podUID="3964407d-3235-4331-bee0-0188f908f6c8" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:57.514679 master-2 kubenswrapper[4762]: I1014 13:10:57.513949 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" Oct 14 13:10:57.897013 master-2 kubenswrapper[4762]: I1014 13:10:57.896889 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:10:57.897013 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:10:57.897013 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:10:57.897013 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:10:57.897558 master-2 kubenswrapper[4762]: I1014 13:10:57.897516 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:58.897802 master-2 kubenswrapper[4762]: I1014 13:10:58.897697 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:10:58.897802 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:10:58.897802 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:10:58.897802 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:10:58.897802 master-2 kubenswrapper[4762]: I1014 13:10:58.897800 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:58.969535 master-2 kubenswrapper[4762]: I1014 13:10:58.969444 4762 patch_prober.go:28] interesting pod/apiserver-c57444595-mj7cx container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:10:58.969535 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:10:58.969535 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:10:58.969535 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:10:58.969535 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:10:58.969535 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:10:58.969535 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:10:58.969535 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:10:58.969535 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:10:58.969535 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartUserInformer ok Oct 14 13:10:58.969535 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartOAuthInformer ok Oct 14 13:10:58.969535 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Oct 14 13:10:58.969535 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:10:58.969535 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:10:58.970292 master-2 kubenswrapper[4762]: I1014 13:10:58.969530 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-c57444595-mj7cx" 
podUID="2b69dba3-5ac1-4eb9-bba6-0d0662ab8544" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:10:59.897504 master-2 kubenswrapper[4762]: I1014 13:10:59.897463 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:10:59.897504 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:10:59.897504 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:10:59.897504 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:10:59.897924 master-2 kubenswrapper[4762]: I1014 13:10:59.897892 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:11:00.242858 master-2 kubenswrapper[4762]: I1014 13:11:00.242781 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/db8f34cd-ecf2-4682-b1fd-b8e335369cb9-client-ca\") pod \"route-controller-manager-6f6c689d49-xd4xv\" (UID: \"db8f34cd-ecf2-4682-b1fd-b8e335369cb9\") " pod="openshift-route-controller-manager/route-controller-manager-6f6c689d49-xd4xv" Oct 14 13:11:00.243124 master-2 kubenswrapper[4762]: E1014 13:11:00.242943 4762 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Oct 14 13:11:00.243124 master-2 kubenswrapper[4762]: E1014 13:11:00.243015 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/db8f34cd-ecf2-4682-b1fd-b8e335369cb9-client-ca podName:db8f34cd-ecf2-4682-b1fd-b8e335369cb9 nodeName:}" failed. No retries permitted until 2025-10-14 13:13:02.242991554 +0000 UTC m=+411.487150743 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/db8f34cd-ecf2-4682-b1fd-b8e335369cb9-client-ca") pod "route-controller-manager-6f6c689d49-xd4xv" (UID: "db8f34cd-ecf2-4682-b1fd-b8e335369cb9") : configmap "client-ca" not found Oct 14 13:11:00.898660 master-2 kubenswrapper[4762]: I1014 13:11:00.898596 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:11:00.898660 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:11:00.898660 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:11:00.898660 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:11:00.899778 master-2 kubenswrapper[4762]: I1014 13:11:00.899687 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:11:01.897859 master-2 kubenswrapper[4762]: I1014 13:11:01.897744 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:11:01.897859 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:11:01.897859 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:11:01.897859 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:11:01.897859 master-2 kubenswrapper[4762]: I1014 13:11:01.897845 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:11:02.515508 master-2 kubenswrapper[4762]: I1014 13:11:02.515453 4762 patch_prober.go:28] interesting pod/apiserver-6576f6bc9d-r2fhv container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:11:02.515508 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:11:02.515508 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:11:02.515508 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:11:02.515508 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:11:02.515508 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:11:02.515508 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:11:02.515508 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:11:02.515508 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:11:02.515508 master-2 kubenswrapper[4762]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 14 13:11:02.515508 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok Oct 14 13:11:02.515508 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Oct 14 13:11:02.515508 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectcache ok Oct 14 13:11:02.515508 
master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 14 13:11:02.515508 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startinformers ok Oct 14 13:11:02.515508 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 14 13:11:02.515508 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:11:02.515508 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:11:02.515508 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:11:02.516590 master-2 kubenswrapper[4762]: I1014 13:11:02.515576 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" podUID="3964407d-3235-4331-bee0-0188f908f6c8" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:11:02.861707 master-2 kubenswrapper[4762]: I1014 13:11:02.861649 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-2-master-2_7fbbe25e-c819-4f38-a358-ee552afdaa22/installer/0.log" Oct 14 13:11:02.861707 master-2 kubenswrapper[4762]: I1014 13:11:02.861712 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-2-master-2" Oct 14 13:11:02.874001 master-2 kubenswrapper[4762]: I1014 13:11:02.872115 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7fbbe25e-c819-4f38-a358-ee552afdaa22-kubelet-dir\") pod \"7fbbe25e-c819-4f38-a358-ee552afdaa22\" (UID: \"7fbbe25e-c819-4f38-a358-ee552afdaa22\") " Oct 14 13:11:02.874001 master-2 kubenswrapper[4762]: I1014 13:11:02.872355 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7fbbe25e-c819-4f38-a358-ee552afdaa22-kube-api-access\") pod \"7fbbe25e-c819-4f38-a358-ee552afdaa22\" (UID: \"7fbbe25e-c819-4f38-a358-ee552afdaa22\") " Oct 14 13:11:02.874001 master-2 kubenswrapper[4762]: I1014 13:11:02.872433 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7fbbe25e-c819-4f38-a358-ee552afdaa22-var-lock\") pod \"7fbbe25e-c819-4f38-a358-ee552afdaa22\" (UID: \"7fbbe25e-c819-4f38-a358-ee552afdaa22\") " Oct 14 13:11:02.874001 master-2 kubenswrapper[4762]: I1014 13:11:02.873296 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7fbbe25e-c819-4f38-a358-ee552afdaa22-var-lock" (OuterVolumeSpecName: "var-lock") pod "7fbbe25e-c819-4f38-a358-ee552afdaa22" (UID: "7fbbe25e-c819-4f38-a358-ee552afdaa22"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:11:02.874001 master-2 kubenswrapper[4762]: I1014 13:11:02.873381 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7fbbe25e-c819-4f38-a358-ee552afdaa22-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "7fbbe25e-c819-4f38-a358-ee552afdaa22" (UID: "7fbbe25e-c819-4f38-a358-ee552afdaa22"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:11:02.877460 master-2 kubenswrapper[4762]: I1014 13:11:02.877397 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fbbe25e-c819-4f38-a358-ee552afdaa22-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "7fbbe25e-c819-4f38-a358-ee552afdaa22" (UID: "7fbbe25e-c819-4f38-a358-ee552afdaa22"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:11:02.901605 master-2 kubenswrapper[4762]: I1014 13:11:02.901538 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:11:02.901605 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:11:02.901605 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:11:02.901605 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:11:02.901605 master-2 kubenswrapper[4762]: I1014 13:11:02.901606 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:11:02.965292 master-2 kubenswrapper[4762]: I1014 13:11:02.965253 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-c57444595-mj7cx" Oct 14 13:11:02.973386 master-2 kubenswrapper[4762]: I1014 13:11:02.973343 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b69dba3-5ac1-4eb9-bba6-0d0662ab8544-trusted-ca-bundle\") pod \"2b69dba3-5ac1-4eb9-bba6-0d0662ab8544\" (UID: \"2b69dba3-5ac1-4eb9-bba6-0d0662ab8544\") " Oct 14 13:11:02.973497 master-2 kubenswrapper[4762]: I1014 13:11:02.973391 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b69dba3-5ac1-4eb9-bba6-0d0662ab8544-serving-cert\") pod \"2b69dba3-5ac1-4eb9-bba6-0d0662ab8544\" (UID: \"2b69dba3-5ac1-4eb9-bba6-0d0662ab8544\") " Oct 14 13:11:02.973497 master-2 kubenswrapper[4762]: I1014 13:11:02.973416 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2ntc\" (UniqueName: \"kubernetes.io/projected/2b69dba3-5ac1-4eb9-bba6-0d0662ab8544-kube-api-access-j2ntc\") pod \"2b69dba3-5ac1-4eb9-bba6-0d0662ab8544\" (UID: \"2b69dba3-5ac1-4eb9-bba6-0d0662ab8544\") " Oct 14 13:11:02.973497 master-2 kubenswrapper[4762]: I1014 13:11:02.973435 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2b69dba3-5ac1-4eb9-bba6-0d0662ab8544-audit-policies\") pod \"2b69dba3-5ac1-4eb9-bba6-0d0662ab8544\" (UID: \"2b69dba3-5ac1-4eb9-bba6-0d0662ab8544\") " Oct 14 13:11:02.973497 master-2 kubenswrapper[4762]: I1014 13:11:02.973471 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2b69dba3-5ac1-4eb9-bba6-0d0662ab8544-etcd-client\") pod \"2b69dba3-5ac1-4eb9-bba6-0d0662ab8544\" (UID: \"2b69dba3-5ac1-4eb9-bba6-0d0662ab8544\") " Oct 14 13:11:02.973497 master-2 kubenswrapper[4762]: I1014 13:11:02.973496 4762 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2b69dba3-5ac1-4eb9-bba6-0d0662ab8544-audit-dir\") pod \"2b69dba3-5ac1-4eb9-bba6-0d0662ab8544\" (UID: \"2b69dba3-5ac1-4eb9-bba6-0d0662ab8544\") " Oct 14 13:11:02.973790 master-2 kubenswrapper[4762]: I1014 13:11:02.973517 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/2b69dba3-5ac1-4eb9-bba6-0d0662ab8544-encryption-config\") pod \"2b69dba3-5ac1-4eb9-bba6-0d0662ab8544\" (UID: \"2b69dba3-5ac1-4eb9-bba6-0d0662ab8544\") " Oct 14 13:11:02.973790 master-2 kubenswrapper[4762]: I1014 13:11:02.973536 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/2b69dba3-5ac1-4eb9-bba6-0d0662ab8544-etcd-serving-ca\") pod \"2b69dba3-5ac1-4eb9-bba6-0d0662ab8544\" (UID: \"2b69dba3-5ac1-4eb9-bba6-0d0662ab8544\") " Oct 14 13:11:02.973790 master-2 kubenswrapper[4762]: I1014 13:11:02.973655 4762 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7fbbe25e-c819-4f38-a358-ee552afdaa22-kubelet-dir\") on node \"master-2\" DevicePath \"\"" Oct 14 13:11:02.973790 master-2 kubenswrapper[4762]: I1014 13:11:02.973666 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7fbbe25e-c819-4f38-a358-ee552afdaa22-kube-api-access\") on node \"master-2\" DevicePath \"\"" Oct 14 13:11:02.973790 master-2 kubenswrapper[4762]: I1014 13:11:02.973676 4762 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7fbbe25e-c819-4f38-a358-ee552afdaa22-var-lock\") on node \"master-2\" DevicePath \"\"" Oct 14 13:11:02.973790 master-2 kubenswrapper[4762]: I1014 13:11:02.973681 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2b69dba3-5ac1-4eb9-bba6-0d0662ab8544-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "2b69dba3-5ac1-4eb9-bba6-0d0662ab8544" (UID: "2b69dba3-5ac1-4eb9-bba6-0d0662ab8544"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:11:02.973790 master-2 kubenswrapper[4762]: I1014 13:11:02.973781 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b69dba3-5ac1-4eb9-bba6-0d0662ab8544-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "2b69dba3-5ac1-4eb9-bba6-0d0662ab8544" (UID: "2b69dba3-5ac1-4eb9-bba6-0d0662ab8544"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:11:02.974249 master-2 kubenswrapper[4762]: I1014 13:11:02.974191 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b69dba3-5ac1-4eb9-bba6-0d0662ab8544-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "2b69dba3-5ac1-4eb9-bba6-0d0662ab8544" (UID: "2b69dba3-5ac1-4eb9-bba6-0d0662ab8544"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:11:02.974317 master-2 kubenswrapper[4762]: I1014 13:11:02.974200 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b69dba3-5ac1-4eb9-bba6-0d0662ab8544-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "2b69dba3-5ac1-4eb9-bba6-0d0662ab8544" (UID: "2b69dba3-5ac1-4eb9-bba6-0d0662ab8544"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:11:02.976268 master-2 kubenswrapper[4762]: I1014 13:11:02.976225 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b69dba3-5ac1-4eb9-bba6-0d0662ab8544-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "2b69dba3-5ac1-4eb9-bba6-0d0662ab8544" (UID: "2b69dba3-5ac1-4eb9-bba6-0d0662ab8544"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:11:02.977123 master-2 kubenswrapper[4762]: I1014 13:11:02.977087 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b69dba3-5ac1-4eb9-bba6-0d0662ab8544-kube-api-access-j2ntc" (OuterVolumeSpecName: "kube-api-access-j2ntc") pod "2b69dba3-5ac1-4eb9-bba6-0d0662ab8544" (UID: "2b69dba3-5ac1-4eb9-bba6-0d0662ab8544"). InnerVolumeSpecName "kube-api-access-j2ntc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:11:02.978096 master-2 kubenswrapper[4762]: I1014 13:11:02.978045 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b69dba3-5ac1-4eb9-bba6-0d0662ab8544-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "2b69dba3-5ac1-4eb9-bba6-0d0662ab8544" (UID: "2b69dba3-5ac1-4eb9-bba6-0d0662ab8544"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:11:02.978873 master-2 kubenswrapper[4762]: I1014 13:11:02.978825 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b69dba3-5ac1-4eb9-bba6-0d0662ab8544-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2b69dba3-5ac1-4eb9-bba6-0d0662ab8544" (UID: "2b69dba3-5ac1-4eb9-bba6-0d0662ab8544"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:11:03.074174 master-2 kubenswrapper[4762]: I1014 13:11:03.073999 4762 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b69dba3-5ac1-4eb9-bba6-0d0662ab8544-trusted-ca-bundle\") on node \"master-2\" DevicePath \"\"" Oct 14 13:11:03.074174 master-2 kubenswrapper[4762]: I1014 13:11:03.074049 4762 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b69dba3-5ac1-4eb9-bba6-0d0662ab8544-serving-cert\") on node \"master-2\" DevicePath \"\"" Oct 14 13:11:03.074174 master-2 kubenswrapper[4762]: I1014 13:11:03.074065 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2ntc\" (UniqueName: \"kubernetes.io/projected/2b69dba3-5ac1-4eb9-bba6-0d0662ab8544-kube-api-access-j2ntc\") on node \"master-2\" DevicePath \"\"" Oct 14 13:11:03.074174 master-2 kubenswrapper[4762]: I1014 13:11:03.074077 4762 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2b69dba3-5ac1-4eb9-bba6-0d0662ab8544-audit-policies\") on node \"master-2\" DevicePath \"\"" Oct 14 13:11:03.074174 master-2 kubenswrapper[4762]: I1014 13:11:03.074089 4762 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2b69dba3-5ac1-4eb9-bba6-0d0662ab8544-etcd-client\") on node \"master-2\" DevicePath \"\"" Oct 14 13:11:03.074174 master-2 kubenswrapper[4762]: I1014 13:11:03.074101 4762 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2b69dba3-5ac1-4eb9-bba6-0d0662ab8544-audit-dir\") on node \"master-2\" DevicePath \"\"" Oct 14 13:11:03.074174 master-2 kubenswrapper[4762]: I1014 13:11:03.074112 4762 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/2b69dba3-5ac1-4eb9-bba6-0d0662ab8544-encryption-config\") on node \"master-2\" DevicePath \"\"" Oct 14 13:11:03.074174 master-2 kubenswrapper[4762]: I1014 13:11:03.074123 4762 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/2b69dba3-5ac1-4eb9-bba6-0d0662ab8544-etcd-serving-ca\") on node \"master-2\" DevicePath \"\"" Oct 14 13:11:03.150371 master-2 kubenswrapper[4762]: I1014 13:11:03.150260 4762 generic.go:334] "Generic (PLEG): container finished" podID="2b69dba3-5ac1-4eb9-bba6-0d0662ab8544" containerID="03b91a878dfff317795b8e4902ee7208113c7c3d5dd1ae9fb18824d3a8021ed9" exitCode=0 Oct 14 13:11:03.150371 master-2 kubenswrapper[4762]: I1014 13:11:03.150338 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-c57444595-mj7cx" Oct 14 13:11:03.150751 master-2 kubenswrapper[4762]: I1014 13:11:03.150333 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-c57444595-mj7cx" event={"ID":"2b69dba3-5ac1-4eb9-bba6-0d0662ab8544","Type":"ContainerDied","Data":"03b91a878dfff317795b8e4902ee7208113c7c3d5dd1ae9fb18824d3a8021ed9"} Oct 14 13:11:03.150751 master-2 kubenswrapper[4762]: I1014 13:11:03.150569 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-c57444595-mj7cx" event={"ID":"2b69dba3-5ac1-4eb9-bba6-0d0662ab8544","Type":"ContainerDied","Data":"5182c1b2649ee6beacd2c41c0c7f4de643bea40e0a1caaf204ed40774f6e60be"} Oct 14 13:11:03.150751 master-2 kubenswrapper[4762]: I1014 13:11:03.150611 4762 scope.go:117] "RemoveContainer" containerID="03b91a878dfff317795b8e4902ee7208113c7c3d5dd1ae9fb18824d3a8021ed9" Oct 14 13:11:03.152545 master-2 kubenswrapper[4762]: I1014 13:11:03.152487 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-2-master-2_7fbbe25e-c819-4f38-a358-ee552afdaa22/installer/0.log" Oct 14 13:11:03.152545 master-2 kubenswrapper[4762]: I1014 13:11:03.152542 4762 generic.go:334] "Generic (PLEG): container finished" podID="7fbbe25e-c819-4f38-a358-ee552afdaa22" containerID="8ee4adc42b688bdabab19fe8b7be0260afa38cac95104e3f8f377739b6e2f52d" exitCode=1 Oct 14 13:11:03.152799 master-2 kubenswrapper[4762]: I1014 13:11:03.152577 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-2" event={"ID":"7fbbe25e-c819-4f38-a358-ee552afdaa22","Type":"ContainerDied","Data":"8ee4adc42b688bdabab19fe8b7be0260afa38cac95104e3f8f377739b6e2f52d"} Oct 14 13:11:03.152799 master-2 kubenswrapper[4762]: I1014 13:11:03.152608 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-2" event={"ID":"7fbbe25e-c819-4f38-a358-ee552afdaa22","Type":"ContainerDied","Data":"b4e0833932ee22e0c27ed1cf773f45a44ea75b33c8c5b04b544cdb27fcf011ef"} Oct 14 13:11:03.152799 master-2 kubenswrapper[4762]: I1014 13:11:03.152633 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-2-master-2" Oct 14 13:11:03.174271 master-2 kubenswrapper[4762]: I1014 13:11:03.174211 4762 scope.go:117] "RemoveContainer" containerID="c5974296dc0c8f105a9ef3c050f4347d63548eea2095e88af4b07446619ee3d1" Oct 14 13:11:03.201401 master-2 kubenswrapper[4762]: I1014 13:11:03.201344 4762 scope.go:117] "RemoveContainer" containerID="03b91a878dfff317795b8e4902ee7208113c7c3d5dd1ae9fb18824d3a8021ed9" Oct 14 13:11:03.202407 master-2 kubenswrapper[4762]: E1014 13:11:03.202325 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03b91a878dfff317795b8e4902ee7208113c7c3d5dd1ae9fb18824d3a8021ed9\": container with ID starting with 03b91a878dfff317795b8e4902ee7208113c7c3d5dd1ae9fb18824d3a8021ed9 not found: ID does not exist" containerID="03b91a878dfff317795b8e4902ee7208113c7c3d5dd1ae9fb18824d3a8021ed9" Oct 14 13:11:03.202559 master-2 kubenswrapper[4762]: I1014 13:11:03.202411 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03b91a878dfff317795b8e4902ee7208113c7c3d5dd1ae9fb18824d3a8021ed9"} err="failed to get container status \"03b91a878dfff317795b8e4902ee7208113c7c3d5dd1ae9fb18824d3a8021ed9\": rpc error: code = NotFound desc = could not find container \"03b91a878dfff317795b8e4902ee7208113c7c3d5dd1ae9fb18824d3a8021ed9\": container with ID starting with 03b91a878dfff317795b8e4902ee7208113c7c3d5dd1ae9fb18824d3a8021ed9 not found: ID does not exist" Oct 14 13:11:03.202559 master-2 kubenswrapper[4762]: I1014 13:11:03.202487 4762 scope.go:117] "RemoveContainer" containerID="c5974296dc0c8f105a9ef3c050f4347d63548eea2095e88af4b07446619ee3d1" Oct 14 13:11:03.202715 master-2 kubenswrapper[4762]: I1014 13:11:03.202598 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-oauth-apiserver/apiserver-c57444595-mj7cx"] Oct 14 13:11:03.203259 master-2 kubenswrapper[4762]: E1014 13:11:03.203016 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5974296dc0c8f105a9ef3c050f4347d63548eea2095e88af4b07446619ee3d1\": container with ID starting with c5974296dc0c8f105a9ef3c050f4347d63548eea2095e88af4b07446619ee3d1 not found: ID does not exist" containerID="c5974296dc0c8f105a9ef3c050f4347d63548eea2095e88af4b07446619ee3d1" Oct 14 13:11:03.203259 master-2 kubenswrapper[4762]: I1014 13:11:03.203073 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5974296dc0c8f105a9ef3c050f4347d63548eea2095e88af4b07446619ee3d1"} err="failed to get container status \"c5974296dc0c8f105a9ef3c050f4347d63548eea2095e88af4b07446619ee3d1\": rpc error: code = NotFound desc = could not find container \"c5974296dc0c8f105a9ef3c050f4347d63548eea2095e88af4b07446619ee3d1\": container with ID starting with c5974296dc0c8f105a9ef3c050f4347d63548eea2095e88af4b07446619ee3d1 not found: ID does not exist" Oct 14 13:11:03.203259 master-2 kubenswrapper[4762]: I1014 13:11:03.203111 4762 scope.go:117] "RemoveContainer" containerID="8ee4adc42b688bdabab19fe8b7be0260afa38cac95104e3f8f377739b6e2f52d" Oct 14 13:11:03.209142 master-2 kubenswrapper[4762]: I1014 13:11:03.209107 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-oauth-apiserver/apiserver-c57444595-mj7cx"] Oct 14 13:11:03.223944 master-2 kubenswrapper[4762]: I1014 13:11:03.223853 4762 scope.go:117] "RemoveContainer" 
containerID="8ee4adc42b688bdabab19fe8b7be0260afa38cac95104e3f8f377739b6e2f52d" Oct 14 13:11:03.224785 master-2 kubenswrapper[4762]: E1014 13:11:03.224713 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ee4adc42b688bdabab19fe8b7be0260afa38cac95104e3f8f377739b6e2f52d\": container with ID starting with 8ee4adc42b688bdabab19fe8b7be0260afa38cac95104e3f8f377739b6e2f52d not found: ID does not exist" containerID="8ee4adc42b688bdabab19fe8b7be0260afa38cac95104e3f8f377739b6e2f52d" Oct 14 13:11:03.224943 master-2 kubenswrapper[4762]: I1014 13:11:03.224778 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ee4adc42b688bdabab19fe8b7be0260afa38cac95104e3f8f377739b6e2f52d"} err="failed to get container status \"8ee4adc42b688bdabab19fe8b7be0260afa38cac95104e3f8f377739b6e2f52d\": rpc error: code = NotFound desc = could not find container \"8ee4adc42b688bdabab19fe8b7be0260afa38cac95104e3f8f377739b6e2f52d\": container with ID starting with 8ee4adc42b688bdabab19fe8b7be0260afa38cac95104e3f8f377739b6e2f52d not found: ID does not exist" Oct 14 13:11:03.228066 master-2 kubenswrapper[4762]: I1014 13:11:03.228005 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-etcd/installer-2-master-2"] Oct 14 13:11:03.231053 master-2 kubenswrapper[4762]: I1014 13:11:03.231006 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-etcd/installer-2-master-2"] Oct 14 13:11:03.557729 master-2 kubenswrapper[4762]: I1014 13:11:03.557652 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b69dba3-5ac1-4eb9-bba6-0d0662ab8544" path="/var/lib/kubelet/pods/2b69dba3-5ac1-4eb9-bba6-0d0662ab8544/volumes" Oct 14 13:11:03.558557 master-2 kubenswrapper[4762]: I1014 13:11:03.558509 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fbbe25e-c819-4f38-a358-ee552afdaa22" path="/var/lib/kubelet/pods/7fbbe25e-c819-4f38-a358-ee552afdaa22/volumes" Oct 14 13:11:03.636993 master-2 kubenswrapper[4762]: I1014 13:11:03.636891 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-96c4c446c-728v2"] Oct 14 13:11:03.637328 master-2 kubenswrapper[4762]: E1014 13:11:03.637118 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b69dba3-5ac1-4eb9-bba6-0d0662ab8544" containerName="oauth-apiserver" Oct 14 13:11:03.637328 master-2 kubenswrapper[4762]: I1014 13:11:03.637141 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b69dba3-5ac1-4eb9-bba6-0d0662ab8544" containerName="oauth-apiserver" Oct 14 13:11:03.637328 master-2 kubenswrapper[4762]: E1014 13:11:03.637196 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fbbe25e-c819-4f38-a358-ee552afdaa22" containerName="installer" Oct 14 13:11:03.637328 master-2 kubenswrapper[4762]: I1014 13:11:03.637215 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fbbe25e-c819-4f38-a358-ee552afdaa22" containerName="installer" Oct 14 13:11:03.637328 master-2 kubenswrapper[4762]: E1014 13:11:03.637249 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b69dba3-5ac1-4eb9-bba6-0d0662ab8544" containerName="fix-audit-permissions" Oct 14 13:11:03.637328 master-2 kubenswrapper[4762]: I1014 13:11:03.637264 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b69dba3-5ac1-4eb9-bba6-0d0662ab8544" containerName="fix-audit-permissions" Oct 14 13:11:03.637837 master-2 kubenswrapper[4762]: I1014 13:11:03.637460 
4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fbbe25e-c819-4f38-a358-ee552afdaa22" containerName="installer" Oct 14 13:11:03.637837 master-2 kubenswrapper[4762]: I1014 13:11:03.637482 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b69dba3-5ac1-4eb9-bba6-0d0662ab8544" containerName="oauth-apiserver" Oct 14 13:11:03.638250 master-2 kubenswrapper[4762]: I1014 13:11:03.638207 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-96c4c446c-728v2" Oct 14 13:11:03.642107 master-2 kubenswrapper[4762]: I1014 13:11:03.641989 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 14 13:11:03.642293 master-2 kubenswrapper[4762]: I1014 13:11:03.642225 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 14 13:11:03.642385 master-2 kubenswrapper[4762]: I1014 13:11:03.642311 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 14 13:11:03.642452 master-2 kubenswrapper[4762]: I1014 13:11:03.642246 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 14 13:11:03.642843 master-2 kubenswrapper[4762]: I1014 13:11:03.642775 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 14 13:11:03.643494 master-2 kubenswrapper[4762]: I1014 13:11:03.643444 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 14 13:11:03.643494 master-2 kubenswrapper[4762]: I1014 13:11:03.643484 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 14 13:11:03.643865 master-2 kubenswrapper[4762]: I1014 13:11:03.643817 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 14 13:11:03.650143 master-2 kubenswrapper[4762]: I1014 13:11:03.650085 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-96c4c446c-728v2"] Oct 14 13:11:03.681300 master-2 kubenswrapper[4762]: I1014 13:11:03.681194 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/fdac5df3-de02-49f0-8b90-53464ca0b6dd-etcd-serving-ca\") pod \"apiserver-96c4c446c-728v2\" (UID: \"fdac5df3-de02-49f0-8b90-53464ca0b6dd\") " pod="openshift-oauth-apiserver/apiserver-96c4c446c-728v2" Oct 14 13:11:03.681582 master-2 kubenswrapper[4762]: I1014 13:11:03.681305 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/fdac5df3-de02-49f0-8b90-53464ca0b6dd-encryption-config\") pod \"apiserver-96c4c446c-728v2\" (UID: \"fdac5df3-de02-49f0-8b90-53464ca0b6dd\") " pod="openshift-oauth-apiserver/apiserver-96c4c446c-728v2" Oct 14 13:11:03.681582 master-2 kubenswrapper[4762]: I1014 13:11:03.681362 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fdac5df3-de02-49f0-8b90-53464ca0b6dd-audit-dir\") pod \"apiserver-96c4c446c-728v2\" (UID: \"fdac5df3-de02-49f0-8b90-53464ca0b6dd\") " pod="openshift-oauth-apiserver/apiserver-96c4c446c-728v2" 
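Editor's note: the VerifyControllerAttachedVolume and MountVolume.SetUp entries surrounding this point correspond one-to-one with the volumes declared in the replacement oauth-apiserver pod's spec (configmaps such as etcd-serving-ca and audit-policies, secrets such as encryption-config and serving-cert, a projected service-account token, and a hostPath audit directory). As a rough, hedged illustration only — this is not part of the log, and it assumes ordinary kubeconfig-based access via client-go — the Go sketch below lists a pod's declared volumes by source type; the namespace and pod name mirror the log entries but are purely illustrative.

```go
// Hedged sketch: enumerate the volumes a pod declares, grouped by source type.
// Each printed entry corresponds to one MountVolume.SetUp line in the kubelet log.
// Assumes a reachable cluster and a kubeconfig at the default location.
package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Build a client from ~/.kube/config (assumption: out-of-cluster access).
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	client, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}

	// Namespace and pod name taken from the log for illustration only.
	pod, err := client.CoreV1().Pods("openshift-oauth-apiserver").
		Get(context.TODO(), "apiserver-96c4c446c-728v2", metav1.GetOptions{})
	if err != nil {
		panic(err)
	}

	for _, v := range pod.Spec.Volumes {
		switch {
		case v.ConfigMap != nil:
			fmt.Printf("%s: configmap %s\n", v.Name, v.ConfigMap.Name)
		case v.Secret != nil:
			fmt.Printf("%s: secret %s\n", v.Name, v.Secret.SecretName)
		case v.Projected != nil:
			fmt.Printf("%s: projected (%d sources)\n", v.Name, len(v.Projected.Sources))
		case v.HostPath != nil:
			fmt.Printf("%s: hostPath %s\n", v.Name, v.HostPath.Path)
		default:
			fmt.Printf("%s: other volume source\n", v.Name)
		}
	}
}
```

Under that assumption, the output for this pod would name the same volumes the kubelet mounts below (etcd-serving-ca, encryption-config, audit-dir, audit-policies, trusted-ca-bundle, etcd-client, kube-api-access-ws22v, serving-cert) before the sandbox is started.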
Oct 14 13:11:03.681582 master-2 kubenswrapper[4762]: I1014 13:11:03.681425 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fdac5df3-de02-49f0-8b90-53464ca0b6dd-audit-policies\") pod \"apiserver-96c4c446c-728v2\" (UID: \"fdac5df3-de02-49f0-8b90-53464ca0b6dd\") " pod="openshift-oauth-apiserver/apiserver-96c4c446c-728v2" Oct 14 13:11:03.681582 master-2 kubenswrapper[4762]: I1014 13:11:03.681484 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fdac5df3-de02-49f0-8b90-53464ca0b6dd-trusted-ca-bundle\") pod \"apiserver-96c4c446c-728v2\" (UID: \"fdac5df3-de02-49f0-8b90-53464ca0b6dd\") " pod="openshift-oauth-apiserver/apiserver-96c4c446c-728v2" Oct 14 13:11:03.681943 master-2 kubenswrapper[4762]: I1014 13:11:03.681585 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/fdac5df3-de02-49f0-8b90-53464ca0b6dd-etcd-client\") pod \"apiserver-96c4c446c-728v2\" (UID: \"fdac5df3-de02-49f0-8b90-53464ca0b6dd\") " pod="openshift-oauth-apiserver/apiserver-96c4c446c-728v2" Oct 14 13:11:03.681943 master-2 kubenswrapper[4762]: I1014 13:11:03.681643 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws22v\" (UniqueName: \"kubernetes.io/projected/fdac5df3-de02-49f0-8b90-53464ca0b6dd-kube-api-access-ws22v\") pod \"apiserver-96c4c446c-728v2\" (UID: \"fdac5df3-de02-49f0-8b90-53464ca0b6dd\") " pod="openshift-oauth-apiserver/apiserver-96c4c446c-728v2" Oct 14 13:11:03.681943 master-2 kubenswrapper[4762]: I1014 13:11:03.681782 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fdac5df3-de02-49f0-8b90-53464ca0b6dd-serving-cert\") pod \"apiserver-96c4c446c-728v2\" (UID: \"fdac5df3-de02-49f0-8b90-53464ca0b6dd\") " pod="openshift-oauth-apiserver/apiserver-96c4c446c-728v2" Oct 14 13:11:03.782809 master-2 kubenswrapper[4762]: I1014 13:11:03.782698 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/fdac5df3-de02-49f0-8b90-53464ca0b6dd-encryption-config\") pod \"apiserver-96c4c446c-728v2\" (UID: \"fdac5df3-de02-49f0-8b90-53464ca0b6dd\") " pod="openshift-oauth-apiserver/apiserver-96c4c446c-728v2" Oct 14 13:11:03.782809 master-2 kubenswrapper[4762]: I1014 13:11:03.782777 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fdac5df3-de02-49f0-8b90-53464ca0b6dd-audit-dir\") pod \"apiserver-96c4c446c-728v2\" (UID: \"fdac5df3-de02-49f0-8b90-53464ca0b6dd\") " pod="openshift-oauth-apiserver/apiserver-96c4c446c-728v2" Oct 14 13:11:03.782809 master-2 kubenswrapper[4762]: I1014 13:11:03.782825 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fdac5df3-de02-49f0-8b90-53464ca0b6dd-audit-policies\") pod \"apiserver-96c4c446c-728v2\" (UID: \"fdac5df3-de02-49f0-8b90-53464ca0b6dd\") " pod="openshift-oauth-apiserver/apiserver-96c4c446c-728v2" Oct 14 13:11:03.783366 master-2 kubenswrapper[4762]: I1014 13:11:03.782867 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fdac5df3-de02-49f0-8b90-53464ca0b6dd-trusted-ca-bundle\") pod \"apiserver-96c4c446c-728v2\" (UID: \"fdac5df3-de02-49f0-8b90-53464ca0b6dd\") " pod="openshift-oauth-apiserver/apiserver-96c4c446c-728v2" Oct 14 13:11:03.783366 master-2 kubenswrapper[4762]: I1014 13:11:03.782923 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/fdac5df3-de02-49f0-8b90-53464ca0b6dd-etcd-client\") pod \"apiserver-96c4c446c-728v2\" (UID: \"fdac5df3-de02-49f0-8b90-53464ca0b6dd\") " pod="openshift-oauth-apiserver/apiserver-96c4c446c-728v2" Oct 14 13:11:03.783366 master-2 kubenswrapper[4762]: I1014 13:11:03.782962 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ws22v\" (UniqueName: \"kubernetes.io/projected/fdac5df3-de02-49f0-8b90-53464ca0b6dd-kube-api-access-ws22v\") pod \"apiserver-96c4c446c-728v2\" (UID: \"fdac5df3-de02-49f0-8b90-53464ca0b6dd\") " pod="openshift-oauth-apiserver/apiserver-96c4c446c-728v2" Oct 14 13:11:03.783366 master-2 kubenswrapper[4762]: I1014 13:11:03.782972 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fdac5df3-de02-49f0-8b90-53464ca0b6dd-audit-dir\") pod \"apiserver-96c4c446c-728v2\" (UID: \"fdac5df3-de02-49f0-8b90-53464ca0b6dd\") " pod="openshift-oauth-apiserver/apiserver-96c4c446c-728v2" Oct 14 13:11:03.783366 master-2 kubenswrapper[4762]: I1014 13:11:03.783036 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fdac5df3-de02-49f0-8b90-53464ca0b6dd-serving-cert\") pod \"apiserver-96c4c446c-728v2\" (UID: \"fdac5df3-de02-49f0-8b90-53464ca0b6dd\") " pod="openshift-oauth-apiserver/apiserver-96c4c446c-728v2" Oct 14 13:11:03.783366 master-2 kubenswrapper[4762]: I1014 13:11:03.783091 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/fdac5df3-de02-49f0-8b90-53464ca0b6dd-etcd-serving-ca\") pod \"apiserver-96c4c446c-728v2\" (UID: \"fdac5df3-de02-49f0-8b90-53464ca0b6dd\") " pod="openshift-oauth-apiserver/apiserver-96c4c446c-728v2" Oct 14 13:11:03.784427 master-2 kubenswrapper[4762]: I1014 13:11:03.784348 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fdac5df3-de02-49f0-8b90-53464ca0b6dd-audit-policies\") pod \"apiserver-96c4c446c-728v2\" (UID: \"fdac5df3-de02-49f0-8b90-53464ca0b6dd\") " pod="openshift-oauth-apiserver/apiserver-96c4c446c-728v2" Oct 14 13:11:03.784599 master-2 kubenswrapper[4762]: I1014 13:11:03.784546 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/fdac5df3-de02-49f0-8b90-53464ca0b6dd-etcd-serving-ca\") pod \"apiserver-96c4c446c-728v2\" (UID: \"fdac5df3-de02-49f0-8b90-53464ca0b6dd\") " pod="openshift-oauth-apiserver/apiserver-96c4c446c-728v2" Oct 14 13:11:03.785280 master-2 kubenswrapper[4762]: I1014 13:11:03.785226 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fdac5df3-de02-49f0-8b90-53464ca0b6dd-trusted-ca-bundle\") pod \"apiserver-96c4c446c-728v2\" (UID: \"fdac5df3-de02-49f0-8b90-53464ca0b6dd\") " pod="openshift-oauth-apiserver/apiserver-96c4c446c-728v2" Oct 14 13:11:03.787251 master-2 
kubenswrapper[4762]: I1014 13:11:03.787144 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/fdac5df3-de02-49f0-8b90-53464ca0b6dd-encryption-config\") pod \"apiserver-96c4c446c-728v2\" (UID: \"fdac5df3-de02-49f0-8b90-53464ca0b6dd\") " pod="openshift-oauth-apiserver/apiserver-96c4c446c-728v2" Oct 14 13:11:03.787550 master-2 kubenswrapper[4762]: I1014 13:11:03.787486 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fdac5df3-de02-49f0-8b90-53464ca0b6dd-serving-cert\") pod \"apiserver-96c4c446c-728v2\" (UID: \"fdac5df3-de02-49f0-8b90-53464ca0b6dd\") " pod="openshift-oauth-apiserver/apiserver-96c4c446c-728v2" Oct 14 13:11:03.788305 master-2 kubenswrapper[4762]: I1014 13:11:03.788253 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/fdac5df3-de02-49f0-8b90-53464ca0b6dd-etcd-client\") pod \"apiserver-96c4c446c-728v2\" (UID: \"fdac5df3-de02-49f0-8b90-53464ca0b6dd\") " pod="openshift-oauth-apiserver/apiserver-96c4c446c-728v2" Oct 14 13:11:03.814615 master-2 kubenswrapper[4762]: I1014 13:11:03.814525 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws22v\" (UniqueName: \"kubernetes.io/projected/fdac5df3-de02-49f0-8b90-53464ca0b6dd-kube-api-access-ws22v\") pod \"apiserver-96c4c446c-728v2\" (UID: \"fdac5df3-de02-49f0-8b90-53464ca0b6dd\") " pod="openshift-oauth-apiserver/apiserver-96c4c446c-728v2" Oct 14 13:11:03.899228 master-2 kubenswrapper[4762]: I1014 13:11:03.898987 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:11:03.899228 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:11:03.899228 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:11:03.899228 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:11:03.899765 master-2 kubenswrapper[4762]: I1014 13:11:03.899128 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:11:03.961412 master-2 kubenswrapper[4762]: I1014 13:11:03.961260 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-96c4c446c-728v2" Oct 14 13:11:04.382679 master-2 kubenswrapper[4762]: I1014 13:11:04.382592 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-96c4c446c-728v2"] Oct 14 13:11:04.390469 master-2 kubenswrapper[4762]: W1014 13:11:04.390392 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfdac5df3_de02_49f0_8b90_53464ca0b6dd.slice/crio-0b85ff8b7ef111fd93a34eb70a4acc4cd875cffaccda2f7bde3cde5efda1c05e WatchSource:0}: Error finding container 0b85ff8b7ef111fd93a34eb70a4acc4cd875cffaccda2f7bde3cde5efda1c05e: Status 404 returned error can't find the container with id 0b85ff8b7ef111fd93a34eb70a4acc4cd875cffaccda2f7bde3cde5efda1c05e Oct 14 13:11:04.897657 master-2 kubenswrapper[4762]: I1014 13:11:04.897550 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:11:04.897657 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:11:04.897657 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:11:04.897657 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:11:04.897657 master-2 kubenswrapper[4762]: I1014 13:11:04.897640 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:11:05.168244 master-2 kubenswrapper[4762]: I1014 13:11:05.168133 4762 generic.go:334] "Generic (PLEG): container finished" podID="fdac5df3-de02-49f0-8b90-53464ca0b6dd" containerID="cf7f4e3c23f7b821d08fbac74a5681f0fc43971c789058587f9ad8f4d64abc5b" exitCode=0 Oct 14 13:11:05.168244 master-2 kubenswrapper[4762]: I1014 13:11:05.168206 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-96c4c446c-728v2" event={"ID":"fdac5df3-de02-49f0-8b90-53464ca0b6dd","Type":"ContainerDied","Data":"cf7f4e3c23f7b821d08fbac74a5681f0fc43971c789058587f9ad8f4d64abc5b"} Oct 14 13:11:05.168244 master-2 kubenswrapper[4762]: I1014 13:11:05.168239 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-96c4c446c-728v2" event={"ID":"fdac5df3-de02-49f0-8b90-53464ca0b6dd","Type":"ContainerStarted","Data":"0b85ff8b7ef111fd93a34eb70a4acc4cd875cffaccda2f7bde3cde5efda1c05e"} Oct 14 13:11:05.897490 master-2 kubenswrapper[4762]: I1014 13:11:05.897221 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:11:05.897490 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:11:05.897490 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:11:05.897490 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:11:05.897490 master-2 kubenswrapper[4762]: I1014 13:11:05.897302 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 
13:11:06.180863 master-2 kubenswrapper[4762]: I1014 13:11:06.180764 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-96c4c446c-728v2" event={"ID":"fdac5df3-de02-49f0-8b90-53464ca0b6dd","Type":"ContainerStarted","Data":"2a29595ee8fa59a1d5c479b5f3ce38436d4d22c41f1bcc1270e6b5ac70185385"} Oct 14 13:11:06.205715 master-2 kubenswrapper[4762]: I1014 13:11:06.205599 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-96c4c446c-728v2" podStartSLOduration=55.205567869 podStartE2EDuration="55.205567869s" podCreationTimestamp="2025-10-14 13:10:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:11:06.203920336 +0000 UTC m=+295.448079535" watchObservedRunningTime="2025-10-14 13:11:06.205567869 +0000 UTC m=+295.449727068" Oct 14 13:11:06.898052 master-2 kubenswrapper[4762]: I1014 13:11:06.897949 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:11:06.898052 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:11:06.898052 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:11:06.898052 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:11:06.898052 master-2 kubenswrapper[4762]: I1014 13:11:06.898029 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:11:07.516327 master-2 kubenswrapper[4762]: I1014 13:11:07.516257 4762 patch_prober.go:28] interesting pod/apiserver-6576f6bc9d-r2fhv container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:11:07.516327 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:11:07.516327 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:11:07.516327 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:11:07.516327 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:11:07.516327 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:11:07.516327 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:11:07.516327 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:11:07.516327 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:11:07.516327 master-2 kubenswrapper[4762]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 14 13:11:07.516327 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok Oct 14 13:11:07.516327 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Oct 14 13:11:07.516327 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectcache ok Oct 14 13:11:07.516327 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 14 13:11:07.516327 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startinformers ok Oct 14 13:11:07.516327 master-2 
kubenswrapper[4762]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 14 13:11:07.516327 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:11:07.516327 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:11:07.516327 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:11:07.517991 master-2 kubenswrapper[4762]: I1014 13:11:07.516343 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" podUID="3964407d-3235-4331-bee0-0188f908f6c8" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:11:07.898691 master-2 kubenswrapper[4762]: I1014 13:11:07.897896 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:11:07.898691 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:11:07.898691 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:11:07.898691 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:11:07.898691 master-2 kubenswrapper[4762]: I1014 13:11:07.898615 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:11:08.901518 master-2 kubenswrapper[4762]: I1014 13:11:08.901427 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:11:08.901518 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:11:08.901518 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:11:08.901518 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:11:08.902095 master-2 kubenswrapper[4762]: I1014 13:11:08.901530 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:11:08.962607 master-2 kubenswrapper[4762]: I1014 13:11:08.962494 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-96c4c446c-728v2" Oct 14 13:11:08.962607 master-2 kubenswrapper[4762]: I1014 13:11:08.962607 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-96c4c446c-728v2" Oct 14 13:11:08.975459 master-2 kubenswrapper[4762]: I1014 13:11:08.975353 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-96c4c446c-728v2" Oct 14 13:11:09.207088 master-2 kubenswrapper[4762]: I1014 13:11:09.207047 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-96c4c446c-728v2" Oct 14 13:11:09.898855 master-2 kubenswrapper[4762]: I1014 13:11:09.898714 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Oct 14 13:11:09.898855 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:11:09.898855 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:11:09.898855 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:11:09.898855 master-2 kubenswrapper[4762]: I1014 13:11:09.898826 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:11:10.117198 master-2 kubenswrapper[4762]: I1014 13:11:10.117103 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_machine-config-server-zxkbj_063758c3-98fe-4ac4-b1c2-beef7ef6dcdc/machine-config-server/0.log" Oct 14 13:11:10.898298 master-2 kubenswrapper[4762]: I1014 13:11:10.898110 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:11:10.898298 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:11:10.898298 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:11:10.898298 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:11:10.898298 master-2 kubenswrapper[4762]: I1014 13:11:10.898277 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:11:11.414676 master-2 kubenswrapper[4762]: I1014 13:11:11.414620 4762 kubelet.go:1505] "Image garbage collection succeeded" Oct 14 13:11:11.898307 master-2 kubenswrapper[4762]: I1014 13:11:11.897990 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:11:11.898307 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:11:11.898307 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:11:11.898307 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:11:11.898307 master-2 kubenswrapper[4762]: I1014 13:11:11.898110 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:11:12.515660 master-2 kubenswrapper[4762]: I1014 13:11:12.515542 4762 patch_prober.go:28] interesting pod/apiserver-6576f6bc9d-r2fhv container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:11:12.515660 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:11:12.515660 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:11:12.515660 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:11:12.515660 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:11:12.515660 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:11:12.515660 master-2 
kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:11:12.515660 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:11:12.515660 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:11:12.515660 master-2 kubenswrapper[4762]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 14 13:11:12.515660 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok Oct 14 13:11:12.515660 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Oct 14 13:11:12.515660 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectcache ok Oct 14 13:11:12.515660 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 14 13:11:12.515660 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startinformers ok Oct 14 13:11:12.515660 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 14 13:11:12.515660 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:11:12.515660 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:11:12.515660 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:11:12.516783 master-2 kubenswrapper[4762]: I1014 13:11:12.515673 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" podUID="3964407d-3235-4331-bee0-0188f908f6c8" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:11:12.897179 master-2 kubenswrapper[4762]: I1014 13:11:12.896996 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:11:12.897179 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:11:12.897179 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:11:12.897179 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:11:12.897179 master-2 kubenswrapper[4762]: I1014 13:11:12.897079 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:11:13.898257 master-2 kubenswrapper[4762]: I1014 13:11:13.897235 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:11:13.898257 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:11:13.898257 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:11:13.898257 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:11:13.898257 master-2 kubenswrapper[4762]: I1014 13:11:13.897331 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:11:14.897685 master-2 kubenswrapper[4762]: I1014 13:11:14.897520 4762 patch_prober.go:28] 
interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:11:14.897685 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:11:14.897685 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:11:14.897685 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:11:14.897685 master-2 kubenswrapper[4762]: I1014 13:11:14.897619 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:11:15.898724 master-2 kubenswrapper[4762]: I1014 13:11:15.898397 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:11:15.898724 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:11:15.898724 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:11:15.898724 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:11:15.898724 master-2 kubenswrapper[4762]: I1014 13:11:15.898463 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:11:16.896653 master-2 kubenswrapper[4762]: I1014 13:11:16.896570 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:11:16.896653 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:11:16.896653 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:11:16.896653 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:11:16.897059 master-2 kubenswrapper[4762]: I1014 13:11:16.896662 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:11:17.515855 master-2 kubenswrapper[4762]: I1014 13:11:17.515802 4762 patch_prober.go:28] interesting pod/apiserver-6576f6bc9d-r2fhv container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:11:17.515855 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:11:17.515855 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:11:17.515855 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:11:17.515855 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:11:17.515855 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:11:17.515855 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:11:17.515855 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:11:17.515855 master-2 kubenswrapper[4762]: 
[+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:11:17.515855 master-2 kubenswrapper[4762]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 14 13:11:17.515855 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok Oct 14 13:11:17.515855 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Oct 14 13:11:17.515855 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectcache ok Oct 14 13:11:17.515855 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 14 13:11:17.515855 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startinformers ok Oct 14 13:11:17.515855 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 14 13:11:17.515855 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:11:17.515855 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:11:17.515855 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:11:17.517709 master-2 kubenswrapper[4762]: I1014 13:11:17.516508 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" podUID="3964407d-3235-4331-bee0-0188f908f6c8" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:11:17.897742 master-2 kubenswrapper[4762]: I1014 13:11:17.897539 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:11:17.897742 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:11:17.897742 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:11:17.897742 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:11:17.897742 master-2 kubenswrapper[4762]: I1014 13:11:17.897614 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:11:17.897742 master-2 kubenswrapper[4762]: I1014 13:11:17.897681 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5ddb89f76-887cs" Oct 14 13:11:17.898510 master-2 kubenswrapper[4762]: I1014 13:11:17.898468 4762 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="router" containerStatusID={"Type":"cri-o","ID":"55da1c19b96c1c89292ad340fab59b6898e10e1d95dce9f948d8c32e32bcd047"} pod="openshift-ingress/router-default-5ddb89f76-887cs" containerMessage="Container router failed startup probe, will be restarted" Oct 14 13:11:17.898624 master-2 kubenswrapper[4762]: I1014 13:11:17.898532 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" containerID="cri-o://55da1c19b96c1c89292ad340fab59b6898e10e1d95dce9f948d8c32e32bcd047" gracePeriod=3600 Oct 14 13:11:22.516280 master-2 kubenswrapper[4762]: I1014 13:11:22.516107 4762 patch_prober.go:28] interesting pod/apiserver-6576f6bc9d-r2fhv container/openshift-apiserver namespace/openshift-apiserver: Readiness probe 
status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:11:22.516280 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:11:22.516280 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:11:22.516280 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:11:22.516280 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:11:22.516280 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:11:22.516280 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:11:22.516280 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:11:22.516280 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:11:22.516280 master-2 kubenswrapper[4762]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 14 13:11:22.516280 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok Oct 14 13:11:22.516280 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Oct 14 13:11:22.516280 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectcache ok Oct 14 13:11:22.516280 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 14 13:11:22.516280 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startinformers ok Oct 14 13:11:22.516280 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 14 13:11:22.516280 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:11:22.516280 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:11:22.516280 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:11:22.518336 master-2 kubenswrapper[4762]: I1014 13:11:22.516293 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" podUID="3964407d-3235-4331-bee0-0188f908f6c8" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:11:27.514695 master-2 kubenswrapper[4762]: I1014 13:11:27.514627 4762 patch_prober.go:28] interesting pod/apiserver-6576f6bc9d-r2fhv container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:11:27.514695 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:11:27.514695 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:11:27.514695 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:11:27.514695 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:11:27.514695 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:11:27.514695 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:11:27.514695 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:11:27.514695 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:11:27.514695 master-2 kubenswrapper[4762]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 14 13:11:27.514695 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok Oct 14 13:11:27.514695 master-2 kubenswrapper[4762]: 
[+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Oct 14 13:11:27.514695 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectcache ok Oct 14 13:11:27.514695 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 14 13:11:27.514695 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startinformers ok Oct 14 13:11:27.514695 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 14 13:11:27.514695 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:11:27.514695 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:11:27.514695 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:11:27.515636 master-2 kubenswrapper[4762]: I1014 13:11:27.514714 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" podUID="3964407d-3235-4331-bee0-0188f908f6c8" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:11:32.517509 master-2 kubenswrapper[4762]: I1014 13:11:32.517442 4762 patch_prober.go:28] interesting pod/apiserver-6576f6bc9d-r2fhv container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:11:32.517509 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:11:32.517509 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:11:32.517509 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:11:32.517509 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:11:32.517509 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:11:32.517509 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:11:32.517509 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:11:32.517509 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:11:32.517509 master-2 kubenswrapper[4762]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 14 13:11:32.517509 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok Oct 14 13:11:32.517509 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Oct 14 13:11:32.517509 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectcache ok Oct 14 13:11:32.517509 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 14 13:11:32.517509 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startinformers ok Oct 14 13:11:32.517509 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 14 13:11:32.517509 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:11:32.517509 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:11:32.517509 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:11:32.518876 master-2 kubenswrapper[4762]: I1014 13:11:32.517515 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" podUID="3964407d-3235-4331-bee0-0188f908f6c8" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:11:37.357820 
master-2 kubenswrapper[4762]: I1014 13:11:37.357533 4762 generic.go:334] "Generic (PLEG): container finished" podID="3964407d-3235-4331-bee0-0188f908f6c8" containerID="21f61bbd0a679861d2b7a35cb7734379d280969386c988ae04b5b4ff4b64d191" exitCode=0 Oct 14 13:11:37.357820 master-2 kubenswrapper[4762]: I1014 13:11:37.357592 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" event={"ID":"3964407d-3235-4331-bee0-0188f908f6c8","Type":"ContainerDied","Data":"21f61bbd0a679861d2b7a35cb7734379d280969386c988ae04b5b4ff4b64d191"} Oct 14 13:11:37.426608 master-2 kubenswrapper[4762]: I1014 13:11:37.426566 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" Oct 14 13:11:37.607979 master-2 kubenswrapper[4762]: I1014 13:11:37.607781 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3964407d-3235-4331-bee0-0188f908f6c8-audit-dir\") pod \"3964407d-3235-4331-bee0-0188f908f6c8\" (UID: \"3964407d-3235-4331-bee0-0188f908f6c8\") " Oct 14 13:11:37.607979 master-2 kubenswrapper[4762]: I1014 13:11:37.607844 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3964407d-3235-4331-bee0-0188f908f6c8-trusted-ca-bundle\") pod \"3964407d-3235-4331-bee0-0188f908f6c8\" (UID: \"3964407d-3235-4331-bee0-0188f908f6c8\") " Oct 14 13:11:37.607979 master-2 kubenswrapper[4762]: I1014 13:11:37.607888 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3964407d-3235-4331-bee0-0188f908f6c8-etcd-client\") pod \"3964407d-3235-4331-bee0-0188f908f6c8\" (UID: \"3964407d-3235-4331-bee0-0188f908f6c8\") " Oct 14 13:11:37.607979 master-2 kubenswrapper[4762]: I1014 13:11:37.607915 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3964407d-3235-4331-bee0-0188f908f6c8-encryption-config\") pod \"3964407d-3235-4331-bee0-0188f908f6c8\" (UID: \"3964407d-3235-4331-bee0-0188f908f6c8\") " Oct 14 13:11:37.607979 master-2 kubenswrapper[4762]: I1014 13:11:37.607933 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3964407d-3235-4331-bee0-0188f908f6c8-config\") pod \"3964407d-3235-4331-bee0-0188f908f6c8\" (UID: \"3964407d-3235-4331-bee0-0188f908f6c8\") " Oct 14 13:11:37.607979 master-2 kubenswrapper[4762]: I1014 13:11:37.607911 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3964407d-3235-4331-bee0-0188f908f6c8-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "3964407d-3235-4331-bee0-0188f908f6c8" (UID: "3964407d-3235-4331-bee0-0188f908f6c8"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:11:37.607979 master-2 kubenswrapper[4762]: I1014 13:11:37.607958 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2jcg\" (UniqueName: \"kubernetes.io/projected/3964407d-3235-4331-bee0-0188f908f6c8-kube-api-access-f2jcg\") pod \"3964407d-3235-4331-bee0-0188f908f6c8\" (UID: \"3964407d-3235-4331-bee0-0188f908f6c8\") " Oct 14 13:11:37.607979 master-2 kubenswrapper[4762]: I1014 13:11:37.607987 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3964407d-3235-4331-bee0-0188f908f6c8-etcd-serving-ca\") pod \"3964407d-3235-4331-bee0-0188f908f6c8\" (UID: \"3964407d-3235-4331-bee0-0188f908f6c8\") " Oct 14 13:11:37.608571 master-2 kubenswrapper[4762]: I1014 13:11:37.608041 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3964407d-3235-4331-bee0-0188f908f6c8-serving-cert\") pod \"3964407d-3235-4331-bee0-0188f908f6c8\" (UID: \"3964407d-3235-4331-bee0-0188f908f6c8\") " Oct 14 13:11:37.608571 master-2 kubenswrapper[4762]: I1014 13:11:37.608080 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3964407d-3235-4331-bee0-0188f908f6c8-audit\") pod \"3964407d-3235-4331-bee0-0188f908f6c8\" (UID: \"3964407d-3235-4331-bee0-0188f908f6c8\") " Oct 14 13:11:37.608571 master-2 kubenswrapper[4762]: I1014 13:11:37.608106 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/3964407d-3235-4331-bee0-0188f908f6c8-image-import-ca\") pod \"3964407d-3235-4331-bee0-0188f908f6c8\" (UID: \"3964407d-3235-4331-bee0-0188f908f6c8\") " Oct 14 13:11:37.608571 master-2 kubenswrapper[4762]: I1014 13:11:37.608123 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3964407d-3235-4331-bee0-0188f908f6c8-node-pullsecrets\") pod \"3964407d-3235-4331-bee0-0188f908f6c8\" (UID: \"3964407d-3235-4331-bee0-0188f908f6c8\") " Oct 14 13:11:37.608571 master-2 kubenswrapper[4762]: I1014 13:11:37.608281 4762 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3964407d-3235-4331-bee0-0188f908f6c8-audit-dir\") on node \"master-2\" DevicePath \"\"" Oct 14 13:11:37.608571 master-2 kubenswrapper[4762]: I1014 13:11:37.608317 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3964407d-3235-4331-bee0-0188f908f6c8-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "3964407d-3235-4331-bee0-0188f908f6c8" (UID: "3964407d-3235-4331-bee0-0188f908f6c8"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:11:37.608571 master-2 kubenswrapper[4762]: I1014 13:11:37.608533 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3964407d-3235-4331-bee0-0188f908f6c8-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "3964407d-3235-4331-bee0-0188f908f6c8" (UID: "3964407d-3235-4331-bee0-0188f908f6c8"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:11:37.609039 master-2 kubenswrapper[4762]: I1014 13:11:37.608999 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3964407d-3235-4331-bee0-0188f908f6c8-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "3964407d-3235-4331-bee0-0188f908f6c8" (UID: "3964407d-3235-4331-bee0-0188f908f6c8"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:11:37.609147 master-2 kubenswrapper[4762]: I1014 13:11:37.609087 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3964407d-3235-4331-bee0-0188f908f6c8-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "3964407d-3235-4331-bee0-0188f908f6c8" (UID: "3964407d-3235-4331-bee0-0188f908f6c8"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:11:37.609524 master-2 kubenswrapper[4762]: I1014 13:11:37.609400 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3964407d-3235-4331-bee0-0188f908f6c8-audit" (OuterVolumeSpecName: "audit") pod "3964407d-3235-4331-bee0-0188f908f6c8" (UID: "3964407d-3235-4331-bee0-0188f908f6c8"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:11:37.609524 master-2 kubenswrapper[4762]: I1014 13:11:37.609479 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3964407d-3235-4331-bee0-0188f908f6c8-config" (OuterVolumeSpecName: "config") pod "3964407d-3235-4331-bee0-0188f908f6c8" (UID: "3964407d-3235-4331-bee0-0188f908f6c8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:11:37.612042 master-2 kubenswrapper[4762]: I1014 13:11:37.611980 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3964407d-3235-4331-bee0-0188f908f6c8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3964407d-3235-4331-bee0-0188f908f6c8" (UID: "3964407d-3235-4331-bee0-0188f908f6c8"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:11:37.612148 master-2 kubenswrapper[4762]: I1014 13:11:37.611992 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3964407d-3235-4331-bee0-0188f908f6c8-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "3964407d-3235-4331-bee0-0188f908f6c8" (UID: "3964407d-3235-4331-bee0-0188f908f6c8"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:11:37.612148 master-2 kubenswrapper[4762]: I1014 13:11:37.612101 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3964407d-3235-4331-bee0-0188f908f6c8-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "3964407d-3235-4331-bee0-0188f908f6c8" (UID: "3964407d-3235-4331-bee0-0188f908f6c8"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:11:37.612265 master-2 kubenswrapper[4762]: I1014 13:11:37.612230 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3964407d-3235-4331-bee0-0188f908f6c8-kube-api-access-f2jcg" (OuterVolumeSpecName: "kube-api-access-f2jcg") pod "3964407d-3235-4331-bee0-0188f908f6c8" (UID: "3964407d-3235-4331-bee0-0188f908f6c8"). 
InnerVolumeSpecName "kube-api-access-f2jcg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:11:37.708871 master-2 kubenswrapper[4762]: I1014 13:11:37.708795 4762 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3964407d-3235-4331-bee0-0188f908f6c8-etcd-client\") on node \"master-2\" DevicePath \"\"" Oct 14 13:11:37.708871 master-2 kubenswrapper[4762]: I1014 13:11:37.708839 4762 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3964407d-3235-4331-bee0-0188f908f6c8-encryption-config\") on node \"master-2\" DevicePath \"\"" Oct 14 13:11:37.708871 master-2 kubenswrapper[4762]: I1014 13:11:37.708849 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3964407d-3235-4331-bee0-0188f908f6c8-config\") on node \"master-2\" DevicePath \"\"" Oct 14 13:11:37.708871 master-2 kubenswrapper[4762]: I1014 13:11:37.708858 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2jcg\" (UniqueName: \"kubernetes.io/projected/3964407d-3235-4331-bee0-0188f908f6c8-kube-api-access-f2jcg\") on node \"master-2\" DevicePath \"\"" Oct 14 13:11:37.708871 master-2 kubenswrapper[4762]: I1014 13:11:37.708868 4762 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3964407d-3235-4331-bee0-0188f908f6c8-etcd-serving-ca\") on node \"master-2\" DevicePath \"\"" Oct 14 13:11:37.708871 master-2 kubenswrapper[4762]: I1014 13:11:37.708876 4762 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3964407d-3235-4331-bee0-0188f908f6c8-serving-cert\") on node \"master-2\" DevicePath \"\"" Oct 14 13:11:37.708871 master-2 kubenswrapper[4762]: I1014 13:11:37.708884 4762 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3964407d-3235-4331-bee0-0188f908f6c8-audit\") on node \"master-2\" DevicePath \"\"" Oct 14 13:11:37.708871 master-2 kubenswrapper[4762]: I1014 13:11:37.708891 4762 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/3964407d-3235-4331-bee0-0188f908f6c8-image-import-ca\") on node \"master-2\" DevicePath \"\"" Oct 14 13:11:37.708871 master-2 kubenswrapper[4762]: I1014 13:11:37.708898 4762 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3964407d-3235-4331-bee0-0188f908f6c8-node-pullsecrets\") on node \"master-2\" DevicePath \"\"" Oct 14 13:11:37.708871 master-2 kubenswrapper[4762]: I1014 13:11:37.708906 4762 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3964407d-3235-4331-bee0-0188f908f6c8-trusted-ca-bundle\") on node \"master-2\" DevicePath \"\"" Oct 14 13:11:38.363922 master-2 kubenswrapper[4762]: I1014 13:11:38.363850 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" event={"ID":"3964407d-3235-4331-bee0-0188f908f6c8","Type":"ContainerDied","Data":"716c3ce129ce2932aea7c7fb0982af4766d73a0a044009d44c9d5ac1ac0033e5"} Oct 14 13:11:38.363922 master-2 kubenswrapper[4762]: I1014 13:11:38.363908 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-6576f6bc9d-r2fhv" Oct 14 13:11:38.363922 master-2 kubenswrapper[4762]: I1014 13:11:38.363923 4762 scope.go:117] "RemoveContainer" containerID="a08f9650be2e2e77d06d19aff6edcc8568dc365457f3253809f222a206d4e2e8" Oct 14 13:11:38.377209 master-2 kubenswrapper[4762]: I1014 13:11:38.377169 4762 scope.go:117] "RemoveContainer" containerID="21f61bbd0a679861d2b7a35cb7734379d280969386c988ae04b5b4ff4b64d191" Oct 14 13:11:38.391409 master-2 kubenswrapper[4762]: I1014 13:11:38.391374 4762 scope.go:117] "RemoveContainer" containerID="a3124751acf39ab26e6f4f85e9d803d6e71bc252c303566e819e99a1f9bf1afc" Oct 14 13:11:38.404394 master-2 kubenswrapper[4762]: I1014 13:11:38.404338 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-apiserver/apiserver-6576f6bc9d-r2fhv"] Oct 14 13:11:38.412127 master-2 kubenswrapper[4762]: I1014 13:11:38.412082 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-apiserver/apiserver-6576f6bc9d-r2fhv"] Oct 14 13:11:39.556230 master-2 kubenswrapper[4762]: I1014 13:11:39.556137 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3964407d-3235-4331-bee0-0188f908f6c8" path="/var/lib/kubelet/pods/3964407d-3235-4331-bee0-0188f908f6c8/volumes" Oct 14 13:11:40.110249 master-2 kubenswrapper[4762]: I1014 13:11:40.110175 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_machine-config-server-zxkbj_063758c3-98fe-4ac4-b1c2-beef7ef6dcdc/machine-config-server/0.log" Oct 14 13:11:42.077656 master-2 kubenswrapper[4762]: I1014 13:11:42.077395 4762 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-master-2"] Oct 14 13:11:42.077656 master-2 kubenswrapper[4762]: E1014 13:11:42.077615 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3964407d-3235-4331-bee0-0188f908f6c8" containerName="openshift-apiserver" Oct 14 13:11:42.077656 master-2 kubenswrapper[4762]: I1014 13:11:42.077633 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="3964407d-3235-4331-bee0-0188f908f6c8" containerName="openshift-apiserver" Oct 14 13:11:42.077656 master-2 kubenswrapper[4762]: E1014 13:11:42.077656 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3964407d-3235-4331-bee0-0188f908f6c8" containerName="fix-audit-permissions" Oct 14 13:11:42.077656 master-2 kubenswrapper[4762]: I1014 13:11:42.077667 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="3964407d-3235-4331-bee0-0188f908f6c8" containerName="fix-audit-permissions" Oct 14 13:11:42.078536 master-2 kubenswrapper[4762]: E1014 13:11:42.077681 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3964407d-3235-4331-bee0-0188f908f6c8" containerName="openshift-apiserver-check-endpoints" Oct 14 13:11:42.078536 master-2 kubenswrapper[4762]: I1014 13:11:42.077695 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="3964407d-3235-4331-bee0-0188f908f6c8" containerName="openshift-apiserver-check-endpoints" Oct 14 13:11:42.078536 master-2 kubenswrapper[4762]: I1014 13:11:42.077800 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="3964407d-3235-4331-bee0-0188f908f6c8" containerName="openshift-apiserver" Oct 14 13:11:42.078536 master-2 kubenswrapper[4762]: I1014 13:11:42.077819 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="3964407d-3235-4331-bee0-0188f908f6c8" containerName="openshift-apiserver-check-endpoints" Oct 14 13:11:42.079582 master-2 kubenswrapper[4762]: I1014 
13:11:42.079536 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-2" Oct 14 13:11:42.158904 master-2 kubenswrapper[4762]: I1014 13:11:42.158805 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/c492168afa20f49cb6e3534e1871011b-static-pod-dir\") pod \"etcd-master-2\" (UID: \"c492168afa20f49cb6e3534e1871011b\") " pod="openshift-etcd/etcd-master-2" Oct 14 13:11:42.158904 master-2 kubenswrapper[4762]: I1014 13:11:42.158882 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/c492168afa20f49cb6e3534e1871011b-data-dir\") pod \"etcd-master-2\" (UID: \"c492168afa20f49cb6e3534e1871011b\") " pod="openshift-etcd/etcd-master-2" Oct 14 13:11:42.158904 master-2 kubenswrapper[4762]: I1014 13:11:42.158912 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/c492168afa20f49cb6e3534e1871011b-log-dir\") pod \"etcd-master-2\" (UID: \"c492168afa20f49cb6e3534e1871011b\") " pod="openshift-etcd/etcd-master-2" Oct 14 13:11:42.159420 master-2 kubenswrapper[4762]: I1014 13:11:42.158972 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/c492168afa20f49cb6e3534e1871011b-cert-dir\") pod \"etcd-master-2\" (UID: \"c492168afa20f49cb6e3534e1871011b\") " pod="openshift-etcd/etcd-master-2" Oct 14 13:11:42.159420 master-2 kubenswrapper[4762]: I1014 13:11:42.159024 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/c492168afa20f49cb6e3534e1871011b-usr-local-bin\") pod \"etcd-master-2\" (UID: \"c492168afa20f49cb6e3534e1871011b\") " pod="openshift-etcd/etcd-master-2" Oct 14 13:11:42.159420 master-2 kubenswrapper[4762]: I1014 13:11:42.159062 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/c492168afa20f49cb6e3534e1871011b-resource-dir\") pod \"etcd-master-2\" (UID: \"c492168afa20f49cb6e3534e1871011b\") " pod="openshift-etcd/etcd-master-2" Oct 14 13:11:42.189090 master-2 kubenswrapper[4762]: I1014 13:11:42.189032 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-master-2"] Oct 14 13:11:42.259663 master-2 kubenswrapper[4762]: I1014 13:11:42.259592 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/c492168afa20f49cb6e3534e1871011b-data-dir\") pod \"etcd-master-2\" (UID: \"c492168afa20f49cb6e3534e1871011b\") " pod="openshift-etcd/etcd-master-2" Oct 14 13:11:42.259663 master-2 kubenswrapper[4762]: I1014 13:11:42.259649 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/c492168afa20f49cb6e3534e1871011b-log-dir\") pod \"etcd-master-2\" (UID: \"c492168afa20f49cb6e3534e1871011b\") " pod="openshift-etcd/etcd-master-2" Oct 14 13:11:42.259934 master-2 kubenswrapper[4762]: I1014 13:11:42.259697 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/c492168afa20f49cb6e3534e1871011b-cert-dir\") pod \"etcd-master-2\" 
(UID: \"c492168afa20f49cb6e3534e1871011b\") " pod="openshift-etcd/etcd-master-2" Oct 14 13:11:42.259934 master-2 kubenswrapper[4762]: I1014 13:11:42.259730 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/c492168afa20f49cb6e3534e1871011b-data-dir\") pod \"etcd-master-2\" (UID: \"c492168afa20f49cb6e3534e1871011b\") " pod="openshift-etcd/etcd-master-2" Oct 14 13:11:42.259934 master-2 kubenswrapper[4762]: I1014 13:11:42.259790 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/c492168afa20f49cb6e3534e1871011b-usr-local-bin\") pod \"etcd-master-2\" (UID: \"c492168afa20f49cb6e3534e1871011b\") " pod="openshift-etcd/etcd-master-2" Oct 14 13:11:42.259934 master-2 kubenswrapper[4762]: I1014 13:11:42.259735 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/c492168afa20f49cb6e3534e1871011b-usr-local-bin\") pod \"etcd-master-2\" (UID: \"c492168afa20f49cb6e3534e1871011b\") " pod="openshift-etcd/etcd-master-2" Oct 14 13:11:42.259934 master-2 kubenswrapper[4762]: I1014 13:11:42.259837 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/c492168afa20f49cb6e3534e1871011b-cert-dir\") pod \"etcd-master-2\" (UID: \"c492168afa20f49cb6e3534e1871011b\") " pod="openshift-etcd/etcd-master-2" Oct 14 13:11:42.259934 master-2 kubenswrapper[4762]: I1014 13:11:42.259857 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/c492168afa20f49cb6e3534e1871011b-log-dir\") pod \"etcd-master-2\" (UID: \"c492168afa20f49cb6e3534e1871011b\") " pod="openshift-etcd/etcd-master-2" Oct 14 13:11:42.259934 master-2 kubenswrapper[4762]: I1014 13:11:42.259906 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/c492168afa20f49cb6e3534e1871011b-resource-dir\") pod \"etcd-master-2\" (UID: \"c492168afa20f49cb6e3534e1871011b\") " pod="openshift-etcd/etcd-master-2" Oct 14 13:11:42.260240 master-2 kubenswrapper[4762]: I1014 13:11:42.259968 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/c492168afa20f49cb6e3534e1871011b-static-pod-dir\") pod \"etcd-master-2\" (UID: \"c492168afa20f49cb6e3534e1871011b\") " pod="openshift-etcd/etcd-master-2" Oct 14 13:11:42.260240 master-2 kubenswrapper[4762]: I1014 13:11:42.260020 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/c492168afa20f49cb6e3534e1871011b-resource-dir\") pod \"etcd-master-2\" (UID: \"c492168afa20f49cb6e3534e1871011b\") " pod="openshift-etcd/etcd-master-2" Oct 14 13:11:42.260240 master-2 kubenswrapper[4762]: I1014 13:11:42.260081 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/c492168afa20f49cb6e3534e1871011b-static-pod-dir\") pod \"etcd-master-2\" (UID: \"c492168afa20f49cb6e3534e1871011b\") " pod="openshift-etcd/etcd-master-2" Oct 14 13:11:42.386600 master-2 kubenswrapper[4762]: I1014 13:11:42.386427 4762 generic.go:334] "Generic (PLEG): container finished" podID="ff4898af-987c-42c5-8728-033c5ede3e0f" containerID="1a5565e2e0f073ac96e5a71342aa7bc9343588f6835cc15141df600320b35e4b" 
exitCode=0 Oct 14 13:11:42.386904 master-2 kubenswrapper[4762]: I1014 13:11:42.386539 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-3-master-2" event={"ID":"ff4898af-987c-42c5-8728-033c5ede3e0f","Type":"ContainerDied","Data":"1a5565e2e0f073ac96e5a71342aa7bc9343588f6835cc15141df600320b35e4b"} Oct 14 13:11:42.487375 master-2 kubenswrapper[4762]: I1014 13:11:42.487310 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-2" Oct 14 13:11:42.506244 master-2 kubenswrapper[4762]: I1014 13:11:42.506204 4762 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 14 13:11:43.392484 master-2 kubenswrapper[4762]: I1014 13:11:43.392430 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-2" event={"ID":"c492168afa20f49cb6e3534e1871011b","Type":"ContainerStarted","Data":"0829f1a6ff8f1ddef0afe80e9814edf14a97b3145942491b3595b03672bb9649"} Oct 14 13:11:43.656839 master-2 kubenswrapper[4762]: I1014 13:11:43.656720 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-3-master-2" Oct 14 13:11:43.777887 master-2 kubenswrapper[4762]: I1014 13:11:43.777804 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ff4898af-987c-42c5-8728-033c5ede3e0f-kube-api-access\") pod \"ff4898af-987c-42c5-8728-033c5ede3e0f\" (UID: \"ff4898af-987c-42c5-8728-033c5ede3e0f\") " Oct 14 13:11:43.777887 master-2 kubenswrapper[4762]: I1014 13:11:43.777884 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ff4898af-987c-42c5-8728-033c5ede3e0f-var-lock\") pod \"ff4898af-987c-42c5-8728-033c5ede3e0f\" (UID: \"ff4898af-987c-42c5-8728-033c5ede3e0f\") " Oct 14 13:11:43.778306 master-2 kubenswrapper[4762]: I1014 13:11:43.777982 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ff4898af-987c-42c5-8728-033c5ede3e0f-kubelet-dir\") pod \"ff4898af-987c-42c5-8728-033c5ede3e0f\" (UID: \"ff4898af-987c-42c5-8728-033c5ede3e0f\") " Oct 14 13:11:43.778306 master-2 kubenswrapper[4762]: I1014 13:11:43.778264 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ff4898af-987c-42c5-8728-033c5ede3e0f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "ff4898af-987c-42c5-8728-033c5ede3e0f" (UID: "ff4898af-987c-42c5-8728-033c5ede3e0f"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:11:43.778306 master-2 kubenswrapper[4762]: I1014 13:11:43.778294 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ff4898af-987c-42c5-8728-033c5ede3e0f-var-lock" (OuterVolumeSpecName: "var-lock") pod "ff4898af-987c-42c5-8728-033c5ede3e0f" (UID: "ff4898af-987c-42c5-8728-033c5ede3e0f"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:11:43.781838 master-2 kubenswrapper[4762]: I1014 13:11:43.781768 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff4898af-987c-42c5-8728-033c5ede3e0f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ff4898af-987c-42c5-8728-033c5ede3e0f" (UID: "ff4898af-987c-42c5-8728-033c5ede3e0f"). 
InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:11:43.879645 master-2 kubenswrapper[4762]: I1014 13:11:43.879585 4762 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ff4898af-987c-42c5-8728-033c5ede3e0f-var-lock\") on node \"master-2\" DevicePath \"\"" Oct 14 13:11:43.879645 master-2 kubenswrapper[4762]: I1014 13:11:43.879642 4762 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ff4898af-987c-42c5-8728-033c5ede3e0f-kubelet-dir\") on node \"master-2\" DevicePath \"\"" Oct 14 13:11:43.880077 master-2 kubenswrapper[4762]: I1014 13:11:43.879661 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ff4898af-987c-42c5-8728-033c5ede3e0f-kube-api-access\") on node \"master-2\" DevicePath \"\"" Oct 14 13:11:44.399522 master-2 kubenswrapper[4762]: I1014 13:11:44.399438 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-3-master-2" event={"ID":"ff4898af-987c-42c5-8728-033c5ede3e0f","Type":"ContainerDied","Data":"2f6b446fab6e1dd8dac5d81ac83c005e16bc5d51cbc00adf1b9666244b0520dc"} Oct 14 13:11:44.399522 master-2 kubenswrapper[4762]: I1014 13:11:44.399513 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f6b446fab6e1dd8dac5d81ac83c005e16bc5d51cbc00adf1b9666244b0520dc" Oct 14 13:11:44.399522 master-2 kubenswrapper[4762]: I1014 13:11:44.399521 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-3-master-2" Oct 14 13:11:44.661388 master-2 kubenswrapper[4762]: I1014 13:11:44.661324 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-8644c46667-cg62m"] Oct 14 13:11:44.661718 master-2 kubenswrapper[4762]: E1014 13:11:44.661489 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff4898af-987c-42c5-8728-033c5ede3e0f" containerName="installer" Oct 14 13:11:44.661718 master-2 kubenswrapper[4762]: I1014 13:11:44.661502 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff4898af-987c-42c5-8728-033c5ede3e0f" containerName="installer" Oct 14 13:11:44.661718 master-2 kubenswrapper[4762]: I1014 13:11:44.661567 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff4898af-987c-42c5-8728-033c5ede3e0f" containerName="installer" Oct 14 13:11:44.662178 master-2 kubenswrapper[4762]: I1014 13:11:44.662114 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-8644c46667-cg62m" Oct 14 13:11:44.665287 master-2 kubenswrapper[4762]: I1014 13:11:44.665254 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 14 13:11:44.665922 master-2 kubenswrapper[4762]: I1014 13:11:44.665871 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 14 13:11:44.666202 master-2 kubenswrapper[4762]: I1014 13:11:44.666106 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 14 13:11:44.666552 master-2 kubenswrapper[4762]: I1014 13:11:44.666519 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 14 13:11:44.667391 master-2 kubenswrapper[4762]: I1014 13:11:44.667361 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 14 13:11:44.667609 master-2 kubenswrapper[4762]: I1014 13:11:44.667583 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 14 13:11:44.667660 master-2 kubenswrapper[4762]: I1014 13:11:44.667620 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 14 13:11:44.667699 master-2 kubenswrapper[4762]: I1014 13:11:44.667586 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 14 13:11:44.668052 master-2 kubenswrapper[4762]: I1014 13:11:44.667795 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 14 13:11:44.677850 master-2 kubenswrapper[4762]: I1014 13:11:44.677798 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Oct 14 13:11:44.688604 master-2 kubenswrapper[4762]: I1014 13:11:44.688525 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c6c635b4-3d81-46f5-8f71-18a213b49c55-encryption-config\") pod \"apiserver-8644c46667-cg62m\" (UID: \"c6c635b4-3d81-46f5-8f71-18a213b49c55\") " pod="openshift-apiserver/apiserver-8644c46667-cg62m" Oct 14 13:11:44.688604 master-2 kubenswrapper[4762]: I1014 13:11:44.688606 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c6c635b4-3d81-46f5-8f71-18a213b49c55-node-pullsecrets\") pod \"apiserver-8644c46667-cg62m\" (UID: \"c6c635b4-3d81-46f5-8f71-18a213b49c55\") " pod="openshift-apiserver/apiserver-8644c46667-cg62m" Oct 14 13:11:44.688959 master-2 kubenswrapper[4762]: I1014 13:11:44.688653 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c6c635b4-3d81-46f5-8f71-18a213b49c55-audit-dir\") pod \"apiserver-8644c46667-cg62m\" (UID: \"c6c635b4-3d81-46f5-8f71-18a213b49c55\") " pod="openshift-apiserver/apiserver-8644c46667-cg62m" Oct 14 13:11:44.688959 master-2 kubenswrapper[4762]: I1014 13:11:44.688692 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6c635b4-3d81-46f5-8f71-18a213b49c55-config\") pod \"apiserver-8644c46667-cg62m\" (UID: \"c6c635b4-3d81-46f5-8f71-18a213b49c55\") " 
pod="openshift-apiserver/apiserver-8644c46667-cg62m" Oct 14 13:11:44.688959 master-2 kubenswrapper[4762]: I1014 13:11:44.688747 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c6c635b4-3d81-46f5-8f71-18a213b49c55-image-import-ca\") pod \"apiserver-8644c46667-cg62m\" (UID: \"c6c635b4-3d81-46f5-8f71-18a213b49c55\") " pod="openshift-apiserver/apiserver-8644c46667-cg62m" Oct 14 13:11:44.688959 master-2 kubenswrapper[4762]: I1014 13:11:44.688785 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c6c635b4-3d81-46f5-8f71-18a213b49c55-etcd-client\") pod \"apiserver-8644c46667-cg62m\" (UID: \"c6c635b4-3d81-46f5-8f71-18a213b49c55\") " pod="openshift-apiserver/apiserver-8644c46667-cg62m" Oct 14 13:11:44.688959 master-2 kubenswrapper[4762]: I1014 13:11:44.688903 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhtgk\" (UniqueName: \"kubernetes.io/projected/c6c635b4-3d81-46f5-8f71-18a213b49c55-kube-api-access-mhtgk\") pod \"apiserver-8644c46667-cg62m\" (UID: \"c6c635b4-3d81-46f5-8f71-18a213b49c55\") " pod="openshift-apiserver/apiserver-8644c46667-cg62m" Oct 14 13:11:44.688959 master-2 kubenswrapper[4762]: I1014 13:11:44.688943 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c6c635b4-3d81-46f5-8f71-18a213b49c55-serving-cert\") pod \"apiserver-8644c46667-cg62m\" (UID: \"c6c635b4-3d81-46f5-8f71-18a213b49c55\") " pod="openshift-apiserver/apiserver-8644c46667-cg62m" Oct 14 13:11:44.688959 master-2 kubenswrapper[4762]: I1014 13:11:44.688952 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-8644c46667-cg62m"] Oct 14 13:11:44.689620 master-2 kubenswrapper[4762]: I1014 13:11:44.688984 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6c635b4-3d81-46f5-8f71-18a213b49c55-trusted-ca-bundle\") pod \"apiserver-8644c46667-cg62m\" (UID: \"c6c635b4-3d81-46f5-8f71-18a213b49c55\") " pod="openshift-apiserver/apiserver-8644c46667-cg62m" Oct 14 13:11:44.689620 master-2 kubenswrapper[4762]: I1014 13:11:44.689024 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c6c635b4-3d81-46f5-8f71-18a213b49c55-etcd-serving-ca\") pod \"apiserver-8644c46667-cg62m\" (UID: \"c6c635b4-3d81-46f5-8f71-18a213b49c55\") " pod="openshift-apiserver/apiserver-8644c46667-cg62m" Oct 14 13:11:44.689620 master-2 kubenswrapper[4762]: I1014 13:11:44.689128 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c6c635b4-3d81-46f5-8f71-18a213b49c55-audit\") pod \"apiserver-8644c46667-cg62m\" (UID: \"c6c635b4-3d81-46f5-8f71-18a213b49c55\") " pod="openshift-apiserver/apiserver-8644c46667-cg62m" Oct 14 13:11:44.789818 master-2 kubenswrapper[4762]: I1014 13:11:44.789754 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c6c635b4-3d81-46f5-8f71-18a213b49c55-encryption-config\") pod \"apiserver-8644c46667-cg62m\" (UID: 
\"c6c635b4-3d81-46f5-8f71-18a213b49c55\") " pod="openshift-apiserver/apiserver-8644c46667-cg62m" Oct 14 13:11:44.789818 master-2 kubenswrapper[4762]: I1014 13:11:44.789818 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c6c635b4-3d81-46f5-8f71-18a213b49c55-node-pullsecrets\") pod \"apiserver-8644c46667-cg62m\" (UID: \"c6c635b4-3d81-46f5-8f71-18a213b49c55\") " pod="openshift-apiserver/apiserver-8644c46667-cg62m" Oct 14 13:11:44.790106 master-2 kubenswrapper[4762]: I1014 13:11:44.789855 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c6c635b4-3d81-46f5-8f71-18a213b49c55-audit-dir\") pod \"apiserver-8644c46667-cg62m\" (UID: \"c6c635b4-3d81-46f5-8f71-18a213b49c55\") " pod="openshift-apiserver/apiserver-8644c46667-cg62m" Oct 14 13:11:44.790106 master-2 kubenswrapper[4762]: I1014 13:11:44.789885 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6c635b4-3d81-46f5-8f71-18a213b49c55-config\") pod \"apiserver-8644c46667-cg62m\" (UID: \"c6c635b4-3d81-46f5-8f71-18a213b49c55\") " pod="openshift-apiserver/apiserver-8644c46667-cg62m" Oct 14 13:11:44.790106 master-2 kubenswrapper[4762]: I1014 13:11:44.789903 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c6c635b4-3d81-46f5-8f71-18a213b49c55-image-import-ca\") pod \"apiserver-8644c46667-cg62m\" (UID: \"c6c635b4-3d81-46f5-8f71-18a213b49c55\") " pod="openshift-apiserver/apiserver-8644c46667-cg62m" Oct 14 13:11:44.790106 master-2 kubenswrapper[4762]: I1014 13:11:44.789922 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c6c635b4-3d81-46f5-8f71-18a213b49c55-etcd-client\") pod \"apiserver-8644c46667-cg62m\" (UID: \"c6c635b4-3d81-46f5-8f71-18a213b49c55\") " pod="openshift-apiserver/apiserver-8644c46667-cg62m" Oct 14 13:11:44.790106 master-2 kubenswrapper[4762]: I1014 13:11:44.789948 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhtgk\" (UniqueName: \"kubernetes.io/projected/c6c635b4-3d81-46f5-8f71-18a213b49c55-kube-api-access-mhtgk\") pod \"apiserver-8644c46667-cg62m\" (UID: \"c6c635b4-3d81-46f5-8f71-18a213b49c55\") " pod="openshift-apiserver/apiserver-8644c46667-cg62m" Oct 14 13:11:44.790106 master-2 kubenswrapper[4762]: I1014 13:11:44.789966 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c6c635b4-3d81-46f5-8f71-18a213b49c55-serving-cert\") pod \"apiserver-8644c46667-cg62m\" (UID: \"c6c635b4-3d81-46f5-8f71-18a213b49c55\") " pod="openshift-apiserver/apiserver-8644c46667-cg62m" Oct 14 13:11:44.790106 master-2 kubenswrapper[4762]: I1014 13:11:44.789983 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6c635b4-3d81-46f5-8f71-18a213b49c55-trusted-ca-bundle\") pod \"apiserver-8644c46667-cg62m\" (UID: \"c6c635b4-3d81-46f5-8f71-18a213b49c55\") " pod="openshift-apiserver/apiserver-8644c46667-cg62m" Oct 14 13:11:44.790106 master-2 kubenswrapper[4762]: I1014 13:11:44.790000 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/c6c635b4-3d81-46f5-8f71-18a213b49c55-etcd-serving-ca\") pod \"apiserver-8644c46667-cg62m\" (UID: \"c6c635b4-3d81-46f5-8f71-18a213b49c55\") " pod="openshift-apiserver/apiserver-8644c46667-cg62m" Oct 14 13:11:44.790106 master-2 kubenswrapper[4762]: I1014 13:11:44.790020 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c6c635b4-3d81-46f5-8f71-18a213b49c55-audit\") pod \"apiserver-8644c46667-cg62m\" (UID: \"c6c635b4-3d81-46f5-8f71-18a213b49c55\") " pod="openshift-apiserver/apiserver-8644c46667-cg62m" Oct 14 13:11:44.790720 master-2 kubenswrapper[4762]: I1014 13:11:44.790682 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c6c635b4-3d81-46f5-8f71-18a213b49c55-image-import-ca\") pod \"apiserver-8644c46667-cg62m\" (UID: \"c6c635b4-3d81-46f5-8f71-18a213b49c55\") " pod="openshift-apiserver/apiserver-8644c46667-cg62m" Oct 14 13:11:44.791029 master-2 kubenswrapper[4762]: I1014 13:11:44.790995 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c6c635b4-3d81-46f5-8f71-18a213b49c55-audit\") pod \"apiserver-8644c46667-cg62m\" (UID: \"c6c635b4-3d81-46f5-8f71-18a213b49c55\") " pod="openshift-apiserver/apiserver-8644c46667-cg62m" Oct 14 13:11:44.791435 master-2 kubenswrapper[4762]: I1014 13:11:44.791408 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c6c635b4-3d81-46f5-8f71-18a213b49c55-audit-dir\") pod \"apiserver-8644c46667-cg62m\" (UID: \"c6c635b4-3d81-46f5-8f71-18a213b49c55\") " pod="openshift-apiserver/apiserver-8644c46667-cg62m" Oct 14 13:11:44.791478 master-2 kubenswrapper[4762]: I1014 13:11:44.791460 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c6c635b4-3d81-46f5-8f71-18a213b49c55-node-pullsecrets\") pod \"apiserver-8644c46667-cg62m\" (UID: \"c6c635b4-3d81-46f5-8f71-18a213b49c55\") " pod="openshift-apiserver/apiserver-8644c46667-cg62m" Oct 14 13:11:44.792469 master-2 kubenswrapper[4762]: I1014 13:11:44.792421 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6c635b4-3d81-46f5-8f71-18a213b49c55-config\") pod \"apiserver-8644c46667-cg62m\" (UID: \"c6c635b4-3d81-46f5-8f71-18a213b49c55\") " pod="openshift-apiserver/apiserver-8644c46667-cg62m" Oct 14 13:11:44.792633 master-2 kubenswrapper[4762]: I1014 13:11:44.792586 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c6c635b4-3d81-46f5-8f71-18a213b49c55-etcd-serving-ca\") pod \"apiserver-8644c46667-cg62m\" (UID: \"c6c635b4-3d81-46f5-8f71-18a213b49c55\") " pod="openshift-apiserver/apiserver-8644c46667-cg62m" Oct 14 13:11:44.792698 master-2 kubenswrapper[4762]: I1014 13:11:44.792636 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6c635b4-3d81-46f5-8f71-18a213b49c55-trusted-ca-bundle\") pod \"apiserver-8644c46667-cg62m\" (UID: \"c6c635b4-3d81-46f5-8f71-18a213b49c55\") " pod="openshift-apiserver/apiserver-8644c46667-cg62m" Oct 14 13:11:44.794366 master-2 kubenswrapper[4762]: I1014 13:11:44.794324 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/c6c635b4-3d81-46f5-8f71-18a213b49c55-encryption-config\") pod \"apiserver-8644c46667-cg62m\" (UID: \"c6c635b4-3d81-46f5-8f71-18a213b49c55\") " pod="openshift-apiserver/apiserver-8644c46667-cg62m" Oct 14 13:11:44.795086 master-2 kubenswrapper[4762]: I1014 13:11:44.795021 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c6c635b4-3d81-46f5-8f71-18a213b49c55-etcd-client\") pod \"apiserver-8644c46667-cg62m\" (UID: \"c6c635b4-3d81-46f5-8f71-18a213b49c55\") " pod="openshift-apiserver/apiserver-8644c46667-cg62m" Oct 14 13:11:44.799291 master-2 kubenswrapper[4762]: I1014 13:11:44.799253 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c6c635b4-3d81-46f5-8f71-18a213b49c55-serving-cert\") pod \"apiserver-8644c46667-cg62m\" (UID: \"c6c635b4-3d81-46f5-8f71-18a213b49c55\") " pod="openshift-apiserver/apiserver-8644c46667-cg62m" Oct 14 13:11:44.822848 master-2 kubenswrapper[4762]: I1014 13:11:44.822796 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhtgk\" (UniqueName: \"kubernetes.io/projected/c6c635b4-3d81-46f5-8f71-18a213b49c55-kube-api-access-mhtgk\") pod \"apiserver-8644c46667-cg62m\" (UID: \"c6c635b4-3d81-46f5-8f71-18a213b49c55\") " pod="openshift-apiserver/apiserver-8644c46667-cg62m" Oct 14 13:11:44.976450 master-2 kubenswrapper[4762]: I1014 13:11:44.976420 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-8644c46667-cg62m" Oct 14 13:11:45.376899 master-2 kubenswrapper[4762]: I1014 13:11:45.376698 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-8644c46667-cg62m"] Oct 14 13:11:45.381577 master-2 kubenswrapper[4762]: W1014 13:11:45.381519 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6c635b4_3d81_46f5_8f71_18a213b49c55.slice/crio-39ae4750077f04a8863b9c67437d8ec63dca0f595765405514e0bf82ac1eb173 WatchSource:0}: Error finding container 39ae4750077f04a8863b9c67437d8ec63dca0f595765405514e0bf82ac1eb173: Status 404 returned error can't find the container with id 39ae4750077f04a8863b9c67437d8ec63dca0f595765405514e0bf82ac1eb173 Oct 14 13:11:45.411013 master-2 kubenswrapper[4762]: I1014 13:11:45.410912 4762 generic.go:334] "Generic (PLEG): container finished" podID="c492168afa20f49cb6e3534e1871011b" containerID="6a5de749a5209ba1ebd95f61636355d60f043a2e0f9a0db808ed245f031ddde8" exitCode=0 Oct 14 13:11:45.411013 master-2 kubenswrapper[4762]: I1014 13:11:45.411016 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-2" event={"ID":"c492168afa20f49cb6e3534e1871011b","Type":"ContainerDied","Data":"6a5de749a5209ba1ebd95f61636355d60f043a2e0f9a0db808ed245f031ddde8"} Oct 14 13:11:45.412779 master-2 kubenswrapper[4762]: I1014 13:11:45.412718 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-8644c46667-cg62m" event={"ID":"c6c635b4-3d81-46f5-8f71-18a213b49c55","Type":"ContainerStarted","Data":"39ae4750077f04a8863b9c67437d8ec63dca0f595765405514e0bf82ac1eb173"} Oct 14 13:11:46.422126 master-2 kubenswrapper[4762]: I1014 13:11:46.422002 4762 generic.go:334] "Generic (PLEG): container finished" podID="c492168afa20f49cb6e3534e1871011b" containerID="ed2a09b05055bd0247fa03bd690d3c3a00d03179e7fd8aaba739e3ec5e7a8c0d" exitCode=0 Oct 14 13:11:46.422126 
master-2 kubenswrapper[4762]: I1014 13:11:46.422101 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-2" event={"ID":"c492168afa20f49cb6e3534e1871011b","Type":"ContainerDied","Data":"ed2a09b05055bd0247fa03bd690d3c3a00d03179e7fd8aaba739e3ec5e7a8c0d"} Oct 14 13:11:46.424688 master-2 kubenswrapper[4762]: I1014 13:11:46.424634 4762 generic.go:334] "Generic (PLEG): container finished" podID="c6c635b4-3d81-46f5-8f71-18a213b49c55" containerID="23203ddb0a417acd84efc10e7b9f7baea6a844897d4ddd13d93194cd6740acaa" exitCode=0 Oct 14 13:11:46.424688 master-2 kubenswrapper[4762]: I1014 13:11:46.424682 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-8644c46667-cg62m" event={"ID":"c6c635b4-3d81-46f5-8f71-18a213b49c55","Type":"ContainerDied","Data":"23203ddb0a417acd84efc10e7b9f7baea6a844897d4ddd13d93194cd6740acaa"} Oct 14 13:11:47.442187 master-2 kubenswrapper[4762]: I1014 13:11:47.442094 4762 generic.go:334] "Generic (PLEG): container finished" podID="c492168afa20f49cb6e3534e1871011b" containerID="1d4ec18cbc894594194c960ffa7eb17f08b8db4618a40edcbdb36c81a2f3ce33" exitCode=0 Oct 14 13:11:47.442688 master-2 kubenswrapper[4762]: I1014 13:11:47.442213 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-2" event={"ID":"c492168afa20f49cb6e3534e1871011b","Type":"ContainerDied","Data":"1d4ec18cbc894594194c960ffa7eb17f08b8db4618a40edcbdb36c81a2f3ce33"} Oct 14 13:11:47.445276 master-2 kubenswrapper[4762]: I1014 13:11:47.445244 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-8644c46667-cg62m" event={"ID":"c6c635b4-3d81-46f5-8f71-18a213b49c55","Type":"ContainerStarted","Data":"3f4f509d7db8c2114d17e53c43465caae0d96295f9b964e656edd9cd443a5499"} Oct 14 13:11:47.445377 master-2 kubenswrapper[4762]: I1014 13:11:47.445275 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-8644c46667-cg62m" event={"ID":"c6c635b4-3d81-46f5-8f71-18a213b49c55","Type":"ContainerStarted","Data":"5aba9dc8bc563f731a16be5c5ffc8a7c6a6c894b81a9fdd43f997659876c56af"} Oct 14 13:11:47.514358 master-2 kubenswrapper[4762]: I1014 13:11:47.514224 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-8644c46667-cg62m" podStartSLOduration=62.514194755 podStartE2EDuration="1m2.514194755s" podCreationTimestamp="2025-10-14 13:10:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:11:47.513591765 +0000 UTC m=+336.757750934" watchObservedRunningTime="2025-10-14 13:11:47.514194755 +0000 UTC m=+336.758353944" Oct 14 13:11:48.453147 master-2 kubenswrapper[4762]: I1014 13:11:48.453088 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-2_c492168afa20f49cb6e3534e1871011b/etcd/0.log" Oct 14 13:11:48.455490 master-2 kubenswrapper[4762]: I1014 13:11:48.455428 4762 generic.go:334] "Generic (PLEG): container finished" podID="c492168afa20f49cb6e3534e1871011b" containerID="6b3603daa7632a9ea313dd152c48fef9c192e86466ff5748c6d43fbcd40d9a7e" exitCode=1 Oct 14 13:11:48.455644 master-2 kubenswrapper[4762]: I1014 13:11:48.455604 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-2" event={"ID":"c492168afa20f49cb6e3534e1871011b","Type":"ContainerStarted","Data":"930613d268dd89eef6d12b249532ff59c399cd2934bfb107eedf634a2b26d951"} Oct 14 13:11:48.455747 master-2 
kubenswrapper[4762]: I1014 13:11:48.455731 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-2" event={"ID":"c492168afa20f49cb6e3534e1871011b","Type":"ContainerDied","Data":"6b3603daa7632a9ea313dd152c48fef9c192e86466ff5748c6d43fbcd40d9a7e"} Oct 14 13:11:48.455834 master-2 kubenswrapper[4762]: I1014 13:11:48.455820 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-2" event={"ID":"c492168afa20f49cb6e3534e1871011b","Type":"ContainerStarted","Data":"b2027546f6685cc4b4558ae050637f0978f42f16a38ca2c9891560a82362bb4e"} Oct 14 13:11:49.468149 master-2 kubenswrapper[4762]: I1014 13:11:49.468054 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-2_c492168afa20f49cb6e3534e1871011b/etcd/0.log" Oct 14 13:11:49.470504 master-2 kubenswrapper[4762]: I1014 13:11:49.470432 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-2" event={"ID":"c492168afa20f49cb6e3534e1871011b","Type":"ContainerStarted","Data":"0a151f0bd5c6b606b6c89b6fbad470522251712b8d760adb739567543b3f9546"} Oct 14 13:11:49.470504 master-2 kubenswrapper[4762]: I1014 13:11:49.470499 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-2" event={"ID":"c492168afa20f49cb6e3534e1871011b","Type":"ContainerStarted","Data":"044edff9cb383dac0f2c295e942c0b56543f1ba69662c248a25039887f1d82e6"} Oct 14 13:11:49.471332 master-2 kubenswrapper[4762]: I1014 13:11:49.471268 4762 scope.go:117] "RemoveContainer" containerID="6b3603daa7632a9ea313dd152c48fef9c192e86466ff5748c6d43fbcd40d9a7e" Oct 14 13:11:49.976795 master-2 kubenswrapper[4762]: I1014 13:11:49.976626 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-8644c46667-cg62m" Oct 14 13:11:49.976795 master-2 kubenswrapper[4762]: I1014 13:11:49.976719 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-8644c46667-cg62m" Oct 14 13:11:49.989178 master-2 kubenswrapper[4762]: I1014 13:11:49.989096 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-8644c46667-cg62m" Oct 14 13:11:50.010473 master-2 kubenswrapper[4762]: I1014 13:11:50.010379 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-guard-master-2"] Oct 14 13:11:50.011246 master-2 kubenswrapper[4762]: I1014 13:11:50.011209 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-guard-master-2" Oct 14 13:11:50.014801 master-2 kubenswrapper[4762]: I1014 13:11:50.014741 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd"/"openshift-service-ca.crt" Oct 14 13:11:50.015298 master-2 kubenswrapper[4762]: I1014 13:11:50.015229 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd"/"kube-root-ca.crt" Oct 14 13:11:50.026812 master-2 kubenswrapper[4762]: I1014 13:11:50.026746 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/etcd-guard-master-2"] Oct 14 13:11:50.158379 master-2 kubenswrapper[4762]: I1014 13:11:50.158249 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9rgz\" (UniqueName: \"kubernetes.io/projected/1b6a1dbe-f753-4c92-8b36-47517010f2f3-kube-api-access-t9rgz\") pod \"etcd-guard-master-2\" (UID: \"1b6a1dbe-f753-4c92-8b36-47517010f2f3\") " pod="openshift-etcd/etcd-guard-master-2" Oct 14 13:11:50.260273 master-2 kubenswrapper[4762]: I1014 13:11:50.260033 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9rgz\" (UniqueName: \"kubernetes.io/projected/1b6a1dbe-f753-4c92-8b36-47517010f2f3-kube-api-access-t9rgz\") pod \"etcd-guard-master-2\" (UID: \"1b6a1dbe-f753-4c92-8b36-47517010f2f3\") " pod="openshift-etcd/etcd-guard-master-2" Oct 14 13:11:50.287737 master-2 kubenswrapper[4762]: I1014 13:11:50.287636 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9rgz\" (UniqueName: \"kubernetes.io/projected/1b6a1dbe-f753-4c92-8b36-47517010f2f3-kube-api-access-t9rgz\") pod \"etcd-guard-master-2\" (UID: \"1b6a1dbe-f753-4c92-8b36-47517010f2f3\") " pod="openshift-etcd/etcd-guard-master-2" Oct 14 13:11:50.333991 master-2 kubenswrapper[4762]: I1014 13:11:50.333928 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-guard-master-2" Oct 14 13:11:50.488807 master-2 kubenswrapper[4762]: I1014 13:11:50.488749 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-2_c492168afa20f49cb6e3534e1871011b/etcd/1.log" Oct 14 13:11:50.491144 master-2 kubenswrapper[4762]: I1014 13:11:50.491083 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-2_c492168afa20f49cb6e3534e1871011b/etcd/0.log" Oct 14 13:11:50.492995 master-2 kubenswrapper[4762]: I1014 13:11:50.492910 4762 generic.go:334] "Generic (PLEG): container finished" podID="c492168afa20f49cb6e3534e1871011b" containerID="aa231197e999c93a027edb37f0103cfe346767f572cbabd1136b388c9b713410" exitCode=1 Oct 14 13:11:50.493102 master-2 kubenswrapper[4762]: I1014 13:11:50.493024 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-2" event={"ID":"c492168afa20f49cb6e3534e1871011b","Type":"ContainerDied","Data":"aa231197e999c93a027edb37f0103cfe346767f572cbabd1136b388c9b713410"} Oct 14 13:11:50.493102 master-2 kubenswrapper[4762]: I1014 13:11:50.493068 4762 scope.go:117] "RemoveContainer" containerID="6b3603daa7632a9ea313dd152c48fef9c192e86466ff5748c6d43fbcd40d9a7e" Oct 14 13:11:50.494130 master-2 kubenswrapper[4762]: I1014 13:11:50.494070 4762 scope.go:117] "RemoveContainer" containerID="aa231197e999c93a027edb37f0103cfe346767f572cbabd1136b388c9b713410" Oct 14 13:11:50.494698 master-2 kubenswrapper[4762]: E1014 13:11:50.494629 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"etcd\" with CrashLoopBackOff: \"back-off 10s restarting failed container=etcd pod=etcd-master-2_openshift-etcd(c492168afa20f49cb6e3534e1871011b)\"" pod="openshift-etcd/etcd-master-2" podUID="c492168afa20f49cb6e3534e1871011b" Oct 14 13:11:50.500030 master-2 kubenswrapper[4762]: I1014 13:11:50.499982 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-8644c46667-cg62m" Oct 14 13:11:50.760636 master-2 kubenswrapper[4762]: I1014 13:11:50.760552 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/etcd-guard-master-2"] Oct 14 13:11:50.770425 master-2 kubenswrapper[4762]: W1014 13:11:50.770287 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b6a1dbe_f753_4c92_8b36_47517010f2f3.slice/crio-a5683565c0341b90655248ba9ffa059ffd88bedd5b04359f5398862318291db6 WatchSource:0}: Error finding container a5683565c0341b90655248ba9ffa059ffd88bedd5b04359f5398862318291db6: Status 404 returned error can't find the container with id a5683565c0341b90655248ba9ffa059ffd88bedd5b04359f5398862318291db6 Oct 14 13:11:51.504400 master-2 kubenswrapper[4762]: I1014 13:11:51.504331 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-guard-master-2" event={"ID":"1b6a1dbe-f753-4c92-8b36-47517010f2f3","Type":"ContainerStarted","Data":"cea1450406caeed3fd3fc9e4b7aa382efb496556c38b8b318f79631285dd811d"} Oct 14 13:11:51.504400 master-2 kubenswrapper[4762]: I1014 13:11:51.504399 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-guard-master-2" event={"ID":"1b6a1dbe-f753-4c92-8b36-47517010f2f3","Type":"ContainerStarted","Data":"a5683565c0341b90655248ba9ffa059ffd88bedd5b04359f5398862318291db6"} Oct 14 13:11:51.505194 master-2 kubenswrapper[4762]: I1014 13:11:51.504431 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-etcd/etcd-guard-master-2" Oct 14 13:11:51.508072 master-2 kubenswrapper[4762]: I1014 13:11:51.508007 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-2_c492168afa20f49cb6e3534e1871011b/etcd/1.log" Oct 14 13:11:51.514269 master-2 kubenswrapper[4762]: I1014 13:11:51.514218 4762 scope.go:117] "RemoveContainer" containerID="aa231197e999c93a027edb37f0103cfe346767f572cbabd1136b388c9b713410" Oct 14 13:11:51.515714 master-2 kubenswrapper[4762]: E1014 13:11:51.515656 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"etcd\" with CrashLoopBackOff: \"back-off 10s restarting failed container=etcd pod=etcd-master-2_openshift-etcd(c492168afa20f49cb6e3534e1871011b)\"" pod="openshift-etcd/etcd-master-2" podUID="c492168afa20f49cb6e3534e1871011b" Oct 14 13:11:51.526642 master-2 kubenswrapper[4762]: I1014 13:11:51.526550 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-guard-master-2" podStartSLOduration=1.5265231190000002 podStartE2EDuration="1.526523119s" podCreationTimestamp="2025-10-14 13:11:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:11:51.524261607 +0000 UTC m=+340.768420856" watchObservedRunningTime="2025-10-14 13:11:51.526523119 +0000 UTC m=+340.770682278" Oct 14 13:11:52.488424 master-2 kubenswrapper[4762]: I1014 13:11:52.488329 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-master-2" Oct 14 13:11:52.488424 master-2 kubenswrapper[4762]: I1014 13:11:52.488424 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-master-2" Oct 14 13:11:52.488424 master-2 kubenswrapper[4762]: I1014 13:11:52.488440 4762 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-etcd/etcd-master-2" Oct 14 13:11:52.488843 master-2 kubenswrapper[4762]: I1014 13:11:52.488456 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-master-2" Oct 14 13:11:52.522682 master-2 kubenswrapper[4762]: I1014 13:11:52.522634 4762 scope.go:117] "RemoveContainer" containerID="aa231197e999c93a027edb37f0103cfe346767f572cbabd1136b388c9b713410" Oct 14 13:11:52.523968 master-2 kubenswrapper[4762]: E1014 13:11:52.523895 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"etcd\" with CrashLoopBackOff: \"back-off 10s restarting failed container=etcd pod=etcd-master-2_openshift-etcd(c492168afa20f49cb6e3534e1871011b)\"" pod="openshift-etcd/etcd-master-2" podUID="c492168afa20f49cb6e3534e1871011b" Oct 14 13:11:56.505026 master-2 kubenswrapper[4762]: I1014 13:11:56.504888 4762 patch_prober.go:28] interesting pod/etcd-guard-master-2 container/guard namespace/openshift-etcd: Readiness probe status=failure output="Get \"https://192.168.34.12:9980/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 14 13:11:56.505026 master-2 kubenswrapper[4762]: I1014 13:11:56.505004 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-etcd/etcd-guard-master-2" podUID="1b6a1dbe-f753-4c92-8b36-47517010f2f3" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.12:9980/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 14 13:11:58.390570 master-2 kubenswrapper[4762]: 
I1014 13:11:58.390500 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-4-master-2"] Oct 14 13:11:58.391433 master-2 kubenswrapper[4762]: I1014 13:11:58.391402 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-2" Oct 14 13:11:58.395041 master-2 kubenswrapper[4762]: I1014 13:11:58.394953 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 14 13:11:58.408100 master-2 kubenswrapper[4762]: I1014 13:11:58.408008 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-4-master-2"] Oct 14 13:11:58.565634 master-2 kubenswrapper[4762]: I1014 13:11:58.565491 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/df774fb4-463b-43ef-bdff-0525e6ca4c1a-kube-api-access\") pod \"installer-4-master-2\" (UID: \"df774fb4-463b-43ef-bdff-0525e6ca4c1a\") " pod="openshift-kube-controller-manager/installer-4-master-2" Oct 14 13:11:58.565634 master-2 kubenswrapper[4762]: I1014 13:11:58.565559 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/df774fb4-463b-43ef-bdff-0525e6ca4c1a-kubelet-dir\") pod \"installer-4-master-2\" (UID: \"df774fb4-463b-43ef-bdff-0525e6ca4c1a\") " pod="openshift-kube-controller-manager/installer-4-master-2" Oct 14 13:11:58.565634 master-2 kubenswrapper[4762]: I1014 13:11:58.565643 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/df774fb4-463b-43ef-bdff-0525e6ca4c1a-var-lock\") pod \"installer-4-master-2\" (UID: \"df774fb4-463b-43ef-bdff-0525e6ca4c1a\") " pod="openshift-kube-controller-manager/installer-4-master-2" Oct 14 13:11:58.666881 master-2 kubenswrapper[4762]: I1014 13:11:58.666733 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/df774fb4-463b-43ef-bdff-0525e6ca4c1a-kube-api-access\") pod \"installer-4-master-2\" (UID: \"df774fb4-463b-43ef-bdff-0525e6ca4c1a\") " pod="openshift-kube-controller-manager/installer-4-master-2" Oct 14 13:11:58.666881 master-2 kubenswrapper[4762]: I1014 13:11:58.666837 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/df774fb4-463b-43ef-bdff-0525e6ca4c1a-kubelet-dir\") pod \"installer-4-master-2\" (UID: \"df774fb4-463b-43ef-bdff-0525e6ca4c1a\") " pod="openshift-kube-controller-manager/installer-4-master-2" Oct 14 13:11:58.666881 master-2 kubenswrapper[4762]: I1014 13:11:58.666889 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/df774fb4-463b-43ef-bdff-0525e6ca4c1a-var-lock\") pod \"installer-4-master-2\" (UID: \"df774fb4-463b-43ef-bdff-0525e6ca4c1a\") " pod="openshift-kube-controller-manager/installer-4-master-2" Oct 14 13:11:58.667502 master-2 kubenswrapper[4762]: I1014 13:11:58.667017 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/df774fb4-463b-43ef-bdff-0525e6ca4c1a-kubelet-dir\") pod \"installer-4-master-2\" (UID: \"df774fb4-463b-43ef-bdff-0525e6ca4c1a\") " 
pod="openshift-kube-controller-manager/installer-4-master-2" Oct 14 13:11:58.667502 master-2 kubenswrapper[4762]: I1014 13:11:58.667216 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/df774fb4-463b-43ef-bdff-0525e6ca4c1a-var-lock\") pod \"installer-4-master-2\" (UID: \"df774fb4-463b-43ef-bdff-0525e6ca4c1a\") " pod="openshift-kube-controller-manager/installer-4-master-2" Oct 14 13:11:58.700488 master-2 kubenswrapper[4762]: I1014 13:11:58.700381 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/df774fb4-463b-43ef-bdff-0525e6ca4c1a-kube-api-access\") pod \"installer-4-master-2\" (UID: \"df774fb4-463b-43ef-bdff-0525e6ca4c1a\") " pod="openshift-kube-controller-manager/installer-4-master-2" Oct 14 13:11:58.745636 master-2 kubenswrapper[4762]: I1014 13:11:58.745490 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-2" Oct 14 13:11:59.185665 master-2 kubenswrapper[4762]: I1014 13:11:59.185426 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-4-master-2"] Oct 14 13:11:59.558949 master-2 kubenswrapper[4762]: I1014 13:11:59.558795 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-2" event={"ID":"df774fb4-463b-43ef-bdff-0525e6ca4c1a","Type":"ContainerStarted","Data":"98c42eb613c3fff56176f4ee2e5cf5c4251c13cd317be48027a636a3ed9b48e8"} Oct 14 13:12:00.010072 master-2 kubenswrapper[4762]: I1014 13:12:00.009969 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/etcd-guard-master-2"] Oct 14 13:12:00.010444 master-2 kubenswrapper[4762]: I1014 13:12:00.010123 4762 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 14 13:12:01.138065 master-2 kubenswrapper[4762]: I1014 13:12:01.138003 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-55bcd8787f-4krnt"] Oct 14 13:12:01.138738 master-2 kubenswrapper[4762]: E1014 13:12:01.138565 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-controller-manager/controller-manager-55bcd8787f-4krnt" podUID="e3b0f97c-92e0-43c7-a72a-c003f0451347" Oct 14 13:12:01.165251 master-2 kubenswrapper[4762]: I1014 13:12:01.164516 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f6c689d49-xd4xv"] Oct 14 13:12:01.165251 master-2 kubenswrapper[4762]: E1014 13:12:01.165187 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-route-controller-manager/route-controller-manager-6f6c689d49-xd4xv" podUID="db8f34cd-ecf2-4682-b1fd-b8e335369cb9" Oct 14 13:12:01.505544 master-2 kubenswrapper[4762]: I1014 13:12:01.505503 4762 patch_prober.go:28] interesting pod/etcd-guard-master-2 container/guard namespace/openshift-etcd: Readiness probe status=failure output="Get \"https://192.168.34.12:9980/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 14 13:12:01.505781 master-2 kubenswrapper[4762]: I1014 13:12:01.505579 4762 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-etcd/etcd-guard-master-2" podUID="1b6a1dbe-f753-4c92-8b36-47517010f2f3" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.12:9980/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 14 13:12:01.568782 master-2 kubenswrapper[4762]: I1014 13:12:01.568731 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f6c689d49-xd4xv" Oct 14 13:12:01.568782 master-2 kubenswrapper[4762]: I1014 13:12:01.568771 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-55bcd8787f-4krnt" Oct 14 13:12:01.577629 master-2 kubenswrapper[4762]: I1014 13:12:01.577511 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f6c689d49-xd4xv" Oct 14 13:12:01.580467 master-2 kubenswrapper[4762]: I1014 13:12:01.580440 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-55bcd8787f-4krnt" Oct 14 13:12:01.703190 master-2 kubenswrapper[4762]: I1014 13:12:01.703090 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3b0f97c-92e0-43c7-a72a-c003f0451347-serving-cert\") pod \"e3b0f97c-92e0-43c7-a72a-c003f0451347\" (UID: \"e3b0f97c-92e0-43c7-a72a-c003f0451347\") " Oct 14 13:12:01.703442 master-2 kubenswrapper[4762]: I1014 13:12:01.703268 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7rh6\" (UniqueName: \"kubernetes.io/projected/db8f34cd-ecf2-4682-b1fd-b8e335369cb9-kube-api-access-w7rh6\") pod \"db8f34cd-ecf2-4682-b1fd-b8e335369cb9\" (UID: \"db8f34cd-ecf2-4682-b1fd-b8e335369cb9\") " Oct 14 13:12:01.703442 master-2 kubenswrapper[4762]: I1014 13:12:01.703345 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db8f34cd-ecf2-4682-b1fd-b8e335369cb9-serving-cert\") pod \"db8f34cd-ecf2-4682-b1fd-b8e335369cb9\" (UID: \"db8f34cd-ecf2-4682-b1fd-b8e335369cb9\") " Oct 14 13:12:01.703550 master-2 kubenswrapper[4762]: I1014 13:12:01.703456 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3b0f97c-92e0-43c7-a72a-c003f0451347-config\") pod \"e3b0f97c-92e0-43c7-a72a-c003f0451347\" (UID: \"e3b0f97c-92e0-43c7-a72a-c003f0451347\") " Oct 14 13:12:01.703550 master-2 kubenswrapper[4762]: I1014 13:12:01.703513 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftcjc\" (UniqueName: \"kubernetes.io/projected/e3b0f97c-92e0-43c7-a72a-c003f0451347-kube-api-access-ftcjc\") pod \"e3b0f97c-92e0-43c7-a72a-c003f0451347\" (UID: \"e3b0f97c-92e0-43c7-a72a-c003f0451347\") " Oct 14 13:12:01.703641 master-2 kubenswrapper[4762]: I1014 13:12:01.703571 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e3b0f97c-92e0-43c7-a72a-c003f0451347-proxy-ca-bundles\") pod \"e3b0f97c-92e0-43c7-a72a-c003f0451347\" (UID: \"e3b0f97c-92e0-43c7-a72a-c003f0451347\") " Oct 14 13:12:01.703687 master-2 kubenswrapper[4762]: I1014 13:12:01.703657 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/db8f34cd-ecf2-4682-b1fd-b8e335369cb9-config\") pod \"db8f34cd-ecf2-4682-b1fd-b8e335369cb9\" (UID: \"db8f34cd-ecf2-4682-b1fd-b8e335369cb9\") " Oct 14 13:12:01.704495 master-2 kubenswrapper[4762]: I1014 13:12:01.704420 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3b0f97c-92e0-43c7-a72a-c003f0451347-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e3b0f97c-92e0-43c7-a72a-c003f0451347" (UID: "e3b0f97c-92e0-43c7-a72a-c003f0451347"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:12:01.704578 master-2 kubenswrapper[4762]: I1014 13:12:01.704531 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db8f34cd-ecf2-4682-b1fd-b8e335369cb9-config" (OuterVolumeSpecName: "config") pod "db8f34cd-ecf2-4682-b1fd-b8e335369cb9" (UID: "db8f34cd-ecf2-4682-b1fd-b8e335369cb9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:12:01.704730 master-2 kubenswrapper[4762]: I1014 13:12:01.704652 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3b0f97c-92e0-43c7-a72a-c003f0451347-config" (OuterVolumeSpecName: "config") pod "e3b0f97c-92e0-43c7-a72a-c003f0451347" (UID: "e3b0f97c-92e0-43c7-a72a-c003f0451347"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:12:01.706588 master-2 kubenswrapper[4762]: I1014 13:12:01.706516 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db8f34cd-ecf2-4682-b1fd-b8e335369cb9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "db8f34cd-ecf2-4682-b1fd-b8e335369cb9" (UID: "db8f34cd-ecf2-4682-b1fd-b8e335369cb9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:12:01.706911 master-2 kubenswrapper[4762]: I1014 13:12:01.706874 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3b0f97c-92e0-43c7-a72a-c003f0451347-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e3b0f97c-92e0-43c7-a72a-c003f0451347" (UID: "e3b0f97c-92e0-43c7-a72a-c003f0451347"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:12:01.707558 master-2 kubenswrapper[4762]: I1014 13:12:01.707500 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3b0f97c-92e0-43c7-a72a-c003f0451347-kube-api-access-ftcjc" (OuterVolumeSpecName: "kube-api-access-ftcjc") pod "e3b0f97c-92e0-43c7-a72a-c003f0451347" (UID: "e3b0f97c-92e0-43c7-a72a-c003f0451347"). InnerVolumeSpecName "kube-api-access-ftcjc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:12:01.710303 master-2 kubenswrapper[4762]: I1014 13:12:01.710272 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db8f34cd-ecf2-4682-b1fd-b8e335369cb9-kube-api-access-w7rh6" (OuterVolumeSpecName: "kube-api-access-w7rh6") pod "db8f34cd-ecf2-4682-b1fd-b8e335369cb9" (UID: "db8f34cd-ecf2-4682-b1fd-b8e335369cb9"). InnerVolumeSpecName "kube-api-access-w7rh6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:12:01.804764 master-2 kubenswrapper[4762]: I1014 13:12:01.804603 4762 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3b0f97c-92e0-43c7-a72a-c003f0451347-serving-cert\") on node \"master-2\" DevicePath \"\"" Oct 14 13:12:01.804764 master-2 kubenswrapper[4762]: I1014 13:12:01.804656 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7rh6\" (UniqueName: \"kubernetes.io/projected/db8f34cd-ecf2-4682-b1fd-b8e335369cb9-kube-api-access-w7rh6\") on node \"master-2\" DevicePath \"\"" Oct 14 13:12:01.804764 master-2 kubenswrapper[4762]: I1014 13:12:01.804673 4762 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db8f34cd-ecf2-4682-b1fd-b8e335369cb9-serving-cert\") on node \"master-2\" DevicePath \"\"" Oct 14 13:12:01.804764 master-2 kubenswrapper[4762]: I1014 13:12:01.804685 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3b0f97c-92e0-43c7-a72a-c003f0451347-config\") on node \"master-2\" DevicePath \"\"" Oct 14 13:12:01.804764 master-2 kubenswrapper[4762]: I1014 13:12:01.804698 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftcjc\" (UniqueName: \"kubernetes.io/projected/e3b0f97c-92e0-43c7-a72a-c003f0451347-kube-api-access-ftcjc\") on node \"master-2\" DevicePath \"\"" Oct 14 13:12:01.804764 master-2 kubenswrapper[4762]: I1014 13:12:01.804710 4762 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e3b0f97c-92e0-43c7-a72a-c003f0451347-proxy-ca-bundles\") on node \"master-2\" DevicePath \"\"" Oct 14 13:12:01.804764 master-2 kubenswrapper[4762]: I1014 13:12:01.804722 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db8f34cd-ecf2-4682-b1fd-b8e335369cb9-config\") on node \"master-2\" DevicePath \"\"" Oct 14 13:12:02.575396 master-2 kubenswrapper[4762]: I1014 13:12:02.575301 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-55bcd8787f-4krnt" Oct 14 13:12:02.576378 master-2 kubenswrapper[4762]: I1014 13:12:02.575281 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-2" event={"ID":"df774fb4-463b-43ef-bdff-0525e6ca4c1a","Type":"ContainerStarted","Data":"9c906c3a11adeb37d19047c2bdc2a82a58c61581ea5806d44d5e372fbe0dc4ef"} Oct 14 13:12:02.576378 master-2 kubenswrapper[4762]: I1014 13:12:02.575481 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f6c689d49-xd4xv" Oct 14 13:12:02.614783 master-2 kubenswrapper[4762]: I1014 13:12:02.614674 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-4-master-2" podStartSLOduration=1.9604929119999999 podStartE2EDuration="4.614651732s" podCreationTimestamp="2025-10-14 13:11:58 +0000 UTC" firstStartedPulling="2025-10-14 13:11:59.194334994 +0000 UTC m=+348.438494163" lastFinishedPulling="2025-10-14 13:12:01.848493804 +0000 UTC m=+351.092652983" observedRunningTime="2025-10-14 13:12:02.61149312 +0000 UTC m=+351.855652319" watchObservedRunningTime="2025-10-14 13:12:02.614651732 +0000 UTC m=+351.858810931" Oct 14 13:12:02.650569 master-2 kubenswrapper[4762]: I1014 13:12:02.650485 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67b9857c45-pxqsr"] Oct 14 13:12:02.651724 master-2 kubenswrapper[4762]: I1014 13:12:02.651676 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f6c689d49-xd4xv"] Oct 14 13:12:02.651846 master-2 kubenswrapper[4762]: I1014 13:12:02.651806 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67b9857c45-pxqsr" Oct 14 13:12:02.657150 master-2 kubenswrapper[4762]: I1014 13:12:02.655772 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 14 13:12:02.657150 master-2 kubenswrapper[4762]: I1014 13:12:02.655780 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 14 13:12:02.657150 master-2 kubenswrapper[4762]: I1014 13:12:02.656102 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 14 13:12:02.657150 master-2 kubenswrapper[4762]: I1014 13:12:02.656103 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 14 13:12:02.657150 master-2 kubenswrapper[4762]: I1014 13:12:02.656661 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 14 13:12:02.658732 master-2 kubenswrapper[4762]: I1014 13:12:02.658679 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f6c689d49-xd4xv"] Oct 14 13:12:02.672425 master-2 kubenswrapper[4762]: I1014 13:12:02.672374 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67b9857c45-pxqsr"] Oct 14 13:12:02.710696 master-2 kubenswrapper[4762]: I1014 13:12:02.710619 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-55bcd8787f-4krnt"] Oct 14 13:12:02.714429 master-2 kubenswrapper[4762]: I1014 13:12:02.714372 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d4118b6e-efca-4fe8-b8a5-9356e039c50b-client-ca\") pod \"route-controller-manager-67b9857c45-pxqsr\" (UID: \"d4118b6e-efca-4fe8-b8a5-9356e039c50b\") " pod="openshift-route-controller-manager/route-controller-manager-67b9857c45-pxqsr" Oct 14 13:12:02.714765 master-2 
kubenswrapper[4762]: I1014 13:12:02.714477 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zrk7\" (UniqueName: \"kubernetes.io/projected/d4118b6e-efca-4fe8-b8a5-9356e039c50b-kube-api-access-7zrk7\") pod \"route-controller-manager-67b9857c45-pxqsr\" (UID: \"d4118b6e-efca-4fe8-b8a5-9356e039c50b\") " pod="openshift-route-controller-manager/route-controller-manager-67b9857c45-pxqsr" Oct 14 13:12:02.714765 master-2 kubenswrapper[4762]: I1014 13:12:02.714550 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4118b6e-efca-4fe8-b8a5-9356e039c50b-config\") pod \"route-controller-manager-67b9857c45-pxqsr\" (UID: \"d4118b6e-efca-4fe8-b8a5-9356e039c50b\") " pod="openshift-route-controller-manager/route-controller-manager-67b9857c45-pxqsr" Oct 14 13:12:02.714765 master-2 kubenswrapper[4762]: I1014 13:12:02.714625 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4118b6e-efca-4fe8-b8a5-9356e039c50b-serving-cert\") pod \"route-controller-manager-67b9857c45-pxqsr\" (UID: \"d4118b6e-efca-4fe8-b8a5-9356e039c50b\") " pod="openshift-route-controller-manager/route-controller-manager-67b9857c45-pxqsr" Oct 14 13:12:02.714765 master-2 kubenswrapper[4762]: I1014 13:12:02.714711 4762 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/db8f34cd-ecf2-4682-b1fd-b8e335369cb9-client-ca\") on node \"master-2\" DevicePath \"\"" Oct 14 13:12:02.717327 master-2 kubenswrapper[4762]: I1014 13:12:02.717282 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-55bcd8787f-4krnt"] Oct 14 13:12:02.815781 master-2 kubenswrapper[4762]: I1014 13:12:02.815685 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d4118b6e-efca-4fe8-b8a5-9356e039c50b-client-ca\") pod \"route-controller-manager-67b9857c45-pxqsr\" (UID: \"d4118b6e-efca-4fe8-b8a5-9356e039c50b\") " pod="openshift-route-controller-manager/route-controller-manager-67b9857c45-pxqsr" Oct 14 13:12:02.816048 master-2 kubenswrapper[4762]: I1014 13:12:02.815788 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zrk7\" (UniqueName: \"kubernetes.io/projected/d4118b6e-efca-4fe8-b8a5-9356e039c50b-kube-api-access-7zrk7\") pod \"route-controller-manager-67b9857c45-pxqsr\" (UID: \"d4118b6e-efca-4fe8-b8a5-9356e039c50b\") " pod="openshift-route-controller-manager/route-controller-manager-67b9857c45-pxqsr" Oct 14 13:12:02.816048 master-2 kubenswrapper[4762]: I1014 13:12:02.815841 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4118b6e-efca-4fe8-b8a5-9356e039c50b-config\") pod \"route-controller-manager-67b9857c45-pxqsr\" (UID: \"d4118b6e-efca-4fe8-b8a5-9356e039c50b\") " pod="openshift-route-controller-manager/route-controller-manager-67b9857c45-pxqsr" Oct 14 13:12:02.816048 master-2 kubenswrapper[4762]: I1014 13:12:02.815896 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4118b6e-efca-4fe8-b8a5-9356e039c50b-serving-cert\") pod \"route-controller-manager-67b9857c45-pxqsr\" (UID: \"d4118b6e-efca-4fe8-b8a5-9356e039c50b\") " 
pod="openshift-route-controller-manager/route-controller-manager-67b9857c45-pxqsr" Oct 14 13:12:02.816048 master-2 kubenswrapper[4762]: I1014 13:12:02.815990 4762 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e3b0f97c-92e0-43c7-a72a-c003f0451347-client-ca\") on node \"master-2\" DevicePath \"\"" Oct 14 13:12:02.816781 master-2 kubenswrapper[4762]: I1014 13:12:02.816735 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d4118b6e-efca-4fe8-b8a5-9356e039c50b-client-ca\") pod \"route-controller-manager-67b9857c45-pxqsr\" (UID: \"d4118b6e-efca-4fe8-b8a5-9356e039c50b\") " pod="openshift-route-controller-manager/route-controller-manager-67b9857c45-pxqsr" Oct 14 13:12:02.818244 master-2 kubenswrapper[4762]: I1014 13:12:02.818149 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4118b6e-efca-4fe8-b8a5-9356e039c50b-config\") pod \"route-controller-manager-67b9857c45-pxqsr\" (UID: \"d4118b6e-efca-4fe8-b8a5-9356e039c50b\") " pod="openshift-route-controller-manager/route-controller-manager-67b9857c45-pxqsr" Oct 14 13:12:02.821030 master-2 kubenswrapper[4762]: I1014 13:12:02.820967 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4118b6e-efca-4fe8-b8a5-9356e039c50b-serving-cert\") pod \"route-controller-manager-67b9857c45-pxqsr\" (UID: \"d4118b6e-efca-4fe8-b8a5-9356e039c50b\") " pod="openshift-route-controller-manager/route-controller-manager-67b9857c45-pxqsr" Oct 14 13:12:02.847281 master-2 kubenswrapper[4762]: I1014 13:12:02.847125 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zrk7\" (UniqueName: \"kubernetes.io/projected/d4118b6e-efca-4fe8-b8a5-9356e039c50b-kube-api-access-7zrk7\") pod \"route-controller-manager-67b9857c45-pxqsr\" (UID: \"d4118b6e-efca-4fe8-b8a5-9356e039c50b\") " pod="openshift-route-controller-manager/route-controller-manager-67b9857c45-pxqsr" Oct 14 13:12:02.976458 master-2 kubenswrapper[4762]: I1014 13:12:02.976306 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67b9857c45-pxqsr" Oct 14 13:12:03.482533 master-2 kubenswrapper[4762]: I1014 13:12:03.482276 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67b9857c45-pxqsr"] Oct 14 13:12:03.488915 master-2 kubenswrapper[4762]: W1014 13:12:03.488828 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4118b6e_efca_4fe8_b8a5_9356e039c50b.slice/crio-63fc759a44ac6b15a34713c6563b51160527ff84c8d7a7e747ee648725a526b4 WatchSource:0}: Error finding container 63fc759a44ac6b15a34713c6563b51160527ff84c8d7a7e747ee648725a526b4: Status 404 returned error can't find the container with id 63fc759a44ac6b15a34713c6563b51160527ff84c8d7a7e747ee648725a526b4 Oct 14 13:12:03.557025 master-2 kubenswrapper[4762]: I1014 13:12:03.556946 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db8f34cd-ecf2-4682-b1fd-b8e335369cb9" path="/var/lib/kubelet/pods/db8f34cd-ecf2-4682-b1fd-b8e335369cb9/volumes" Oct 14 13:12:03.557868 master-2 kubenswrapper[4762]: I1014 13:12:03.557822 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3b0f97c-92e0-43c7-a72a-c003f0451347" path="/var/lib/kubelet/pods/e3b0f97c-92e0-43c7-a72a-c003f0451347/volumes" Oct 14 13:12:03.583484 master-2 kubenswrapper[4762]: I1014 13:12:03.583392 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67b9857c45-pxqsr" event={"ID":"d4118b6e-efca-4fe8-b8a5-9356e039c50b","Type":"ContainerStarted","Data":"63fc759a44ac6b15a34713c6563b51160527ff84c8d7a7e747ee648725a526b4"} Oct 14 13:12:04.528801 master-2 kubenswrapper[4762]: I1014 13:12:04.528719 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-56cfb99cfd-rq5ck"] Oct 14 13:12:04.529463 master-2 kubenswrapper[4762]: I1014 13:12:04.529423 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-56cfb99cfd-rq5ck" Oct 14 13:12:04.533128 master-2 kubenswrapper[4762]: I1014 13:12:04.533058 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 14 13:12:04.533416 master-2 kubenswrapper[4762]: I1014 13:12:04.533319 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 14 13:12:04.533416 master-2 kubenswrapper[4762]: I1014 13:12:04.533336 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 14 13:12:04.533636 master-2 kubenswrapper[4762]: I1014 13:12:04.533492 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 14 13:12:04.533636 master-2 kubenswrapper[4762]: I1014 13:12:04.533425 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 14 13:12:04.539494 master-2 kubenswrapper[4762]: I1014 13:12:04.539386 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/10f29de4-fd52-45da-a0d9-b9cb67146af1-client-ca\") pod \"controller-manager-56cfb99cfd-rq5ck\" (UID: \"10f29de4-fd52-45da-a0d9-b9cb67146af1\") " pod="openshift-controller-manager/controller-manager-56cfb99cfd-rq5ck" Oct 14 13:12:04.539494 master-2 kubenswrapper[4762]: I1014 13:12:04.539474 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10f29de4-fd52-45da-a0d9-b9cb67146af1-config\") pod \"controller-manager-56cfb99cfd-rq5ck\" (UID: \"10f29de4-fd52-45da-a0d9-b9cb67146af1\") " pod="openshift-controller-manager/controller-manager-56cfb99cfd-rq5ck" Oct 14 13:12:04.540134 master-2 kubenswrapper[4762]: I1014 13:12:04.539576 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/10f29de4-fd52-45da-a0d9-b9cb67146af1-proxy-ca-bundles\") pod \"controller-manager-56cfb99cfd-rq5ck\" (UID: \"10f29de4-fd52-45da-a0d9-b9cb67146af1\") " pod="openshift-controller-manager/controller-manager-56cfb99cfd-rq5ck" Oct 14 13:12:04.540134 master-2 kubenswrapper[4762]: I1014 13:12:04.539650 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrdhm\" (UniqueName: \"kubernetes.io/projected/10f29de4-fd52-45da-a0d9-b9cb67146af1-kube-api-access-wrdhm\") pod \"controller-manager-56cfb99cfd-rq5ck\" (UID: \"10f29de4-fd52-45da-a0d9-b9cb67146af1\") " pod="openshift-controller-manager/controller-manager-56cfb99cfd-rq5ck" Oct 14 13:12:04.540134 master-2 kubenswrapper[4762]: I1014 13:12:04.539695 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10f29de4-fd52-45da-a0d9-b9cb67146af1-serving-cert\") pod \"controller-manager-56cfb99cfd-rq5ck\" (UID: \"10f29de4-fd52-45da-a0d9-b9cb67146af1\") " pod="openshift-controller-manager/controller-manager-56cfb99cfd-rq5ck" Oct 14 13:12:04.541039 master-2 kubenswrapper[4762]: I1014 13:12:04.540821 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-56cfb99cfd-rq5ck"] Oct 14 13:12:04.542309 master-2 
kubenswrapper[4762]: I1014 13:12:04.542250 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 14 13:12:04.596249 master-2 kubenswrapper[4762]: I1014 13:12:04.593047 4762 generic.go:334] "Generic (PLEG): container finished" podID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerID="55da1c19b96c1c89292ad340fab59b6898e10e1d95dce9f948d8c32e32bcd047" exitCode=0 Oct 14 13:12:04.596249 master-2 kubenswrapper[4762]: I1014 13:12:04.593110 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5ddb89f76-887cs" event={"ID":"f82e0c58-e2a3-491a-bf03-ad47b38c5833","Type":"ContainerDied","Data":"55da1c19b96c1c89292ad340fab59b6898e10e1d95dce9f948d8c32e32bcd047"} Oct 14 13:12:04.596249 master-2 kubenswrapper[4762]: I1014 13:12:04.593144 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5ddb89f76-887cs" event={"ID":"f82e0c58-e2a3-491a-bf03-ad47b38c5833","Type":"ContainerStarted","Data":"0a0bc0bcb6877389fc825b824eda24c17af5655791500c8ef2590eb00f894909"} Oct 14 13:12:04.641472 master-2 kubenswrapper[4762]: I1014 13:12:04.641397 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10f29de4-fd52-45da-a0d9-b9cb67146af1-serving-cert\") pod \"controller-manager-56cfb99cfd-rq5ck\" (UID: \"10f29de4-fd52-45da-a0d9-b9cb67146af1\") " pod="openshift-controller-manager/controller-manager-56cfb99cfd-rq5ck" Oct 14 13:12:04.641679 master-2 kubenswrapper[4762]: I1014 13:12:04.641522 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/10f29de4-fd52-45da-a0d9-b9cb67146af1-client-ca\") pod \"controller-manager-56cfb99cfd-rq5ck\" (UID: \"10f29de4-fd52-45da-a0d9-b9cb67146af1\") " pod="openshift-controller-manager/controller-manager-56cfb99cfd-rq5ck" Oct 14 13:12:04.641679 master-2 kubenswrapper[4762]: I1014 13:12:04.641552 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10f29de4-fd52-45da-a0d9-b9cb67146af1-config\") pod \"controller-manager-56cfb99cfd-rq5ck\" (UID: \"10f29de4-fd52-45da-a0d9-b9cb67146af1\") " pod="openshift-controller-manager/controller-manager-56cfb99cfd-rq5ck" Oct 14 13:12:04.641679 master-2 kubenswrapper[4762]: I1014 13:12:04.641587 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/10f29de4-fd52-45da-a0d9-b9cb67146af1-proxy-ca-bundles\") pod \"controller-manager-56cfb99cfd-rq5ck\" (UID: \"10f29de4-fd52-45da-a0d9-b9cb67146af1\") " pod="openshift-controller-manager/controller-manager-56cfb99cfd-rq5ck" Oct 14 13:12:04.641679 master-2 kubenswrapper[4762]: I1014 13:12:04.641652 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrdhm\" (UniqueName: \"kubernetes.io/projected/10f29de4-fd52-45da-a0d9-b9cb67146af1-kube-api-access-wrdhm\") pod \"controller-manager-56cfb99cfd-rq5ck\" (UID: \"10f29de4-fd52-45da-a0d9-b9cb67146af1\") " pod="openshift-controller-manager/controller-manager-56cfb99cfd-rq5ck" Oct 14 13:12:04.643037 master-2 kubenswrapper[4762]: I1014 13:12:04.642991 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/10f29de4-fd52-45da-a0d9-b9cb67146af1-client-ca\") pod \"controller-manager-56cfb99cfd-rq5ck\" 
(UID: \"10f29de4-fd52-45da-a0d9-b9cb67146af1\") " pod="openshift-controller-manager/controller-manager-56cfb99cfd-rq5ck" Oct 14 13:12:04.643083 master-2 kubenswrapper[4762]: I1014 13:12:04.642993 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/10f29de4-fd52-45da-a0d9-b9cb67146af1-proxy-ca-bundles\") pod \"controller-manager-56cfb99cfd-rq5ck\" (UID: \"10f29de4-fd52-45da-a0d9-b9cb67146af1\") " pod="openshift-controller-manager/controller-manager-56cfb99cfd-rq5ck" Oct 14 13:12:04.644094 master-2 kubenswrapper[4762]: I1014 13:12:04.644014 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10f29de4-fd52-45da-a0d9-b9cb67146af1-config\") pod \"controller-manager-56cfb99cfd-rq5ck\" (UID: \"10f29de4-fd52-45da-a0d9-b9cb67146af1\") " pod="openshift-controller-manager/controller-manager-56cfb99cfd-rq5ck" Oct 14 13:12:04.646465 master-2 kubenswrapper[4762]: I1014 13:12:04.646428 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10f29de4-fd52-45da-a0d9-b9cb67146af1-serving-cert\") pod \"controller-manager-56cfb99cfd-rq5ck\" (UID: \"10f29de4-fd52-45da-a0d9-b9cb67146af1\") " pod="openshift-controller-manager/controller-manager-56cfb99cfd-rq5ck" Oct 14 13:12:04.663000 master-2 kubenswrapper[4762]: I1014 13:12:04.662952 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrdhm\" (UniqueName: \"kubernetes.io/projected/10f29de4-fd52-45da-a0d9-b9cb67146af1-kube-api-access-wrdhm\") pod \"controller-manager-56cfb99cfd-rq5ck\" (UID: \"10f29de4-fd52-45da-a0d9-b9cb67146af1\") " pod="openshift-controller-manager/controller-manager-56cfb99cfd-rq5ck" Oct 14 13:12:04.876394 master-2 kubenswrapper[4762]: I1014 13:12:04.876209 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-56cfb99cfd-rq5ck" Oct 14 13:12:04.895189 master-2 kubenswrapper[4762]: I1014 13:12:04.895110 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5ddb89f76-887cs" Oct 14 13:12:04.899703 master-2 kubenswrapper[4762]: I1014 13:12:04.899656 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:12:04.899703 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:12:04.899703 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:12:04.899703 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:12:04.900068 master-2 kubenswrapper[4762]: I1014 13:12:04.899738 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:12:05.548858 master-2 kubenswrapper[4762]: I1014 13:12:05.548799 4762 scope.go:117] "RemoveContainer" containerID="aa231197e999c93a027edb37f0103cfe346767f572cbabd1136b388c9b713410" Oct 14 13:12:05.600057 master-2 kubenswrapper[4762]: I1014 13:12:05.599940 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67b9857c45-pxqsr" event={"ID":"d4118b6e-efca-4fe8-b8a5-9356e039c50b","Type":"ContainerStarted","Data":"023d89f57e00998dafa3207f7ad9423999a8fc9588a40bc50b74d64112f21b89"} Oct 14 13:12:05.667062 master-2 kubenswrapper[4762]: I1014 13:12:05.666949 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-67b9857c45-pxqsr" podStartSLOduration=2.805550406 podStartE2EDuration="4.666921248s" podCreationTimestamp="2025-10-14 13:12:01 +0000 UTC" firstStartedPulling="2025-10-14 13:12:03.49164399 +0000 UTC m=+352.735803149" lastFinishedPulling="2025-10-14 13:12:05.353014812 +0000 UTC m=+354.597173991" observedRunningTime="2025-10-14 13:12:05.665243203 +0000 UTC m=+354.909402372" watchObservedRunningTime="2025-10-14 13:12:05.666921248 +0000 UTC m=+354.911080447" Oct 14 13:12:05.736284 master-2 kubenswrapper[4762]: I1014 13:12:05.736224 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-56cfb99cfd-rq5ck"] Oct 14 13:12:05.744374 master-2 kubenswrapper[4762]: W1014 13:12:05.744317 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10f29de4_fd52_45da_a0d9_b9cb67146af1.slice/crio-5fa1d42af7af58971b70d518787f54e6878842e22b5a3b27be7370a0308b47fa WatchSource:0}: Error finding container 5fa1d42af7af58971b70d518787f54e6878842e22b5a3b27be7370a0308b47fa: Status 404 returned error can't find the container with id 5fa1d42af7af58971b70d518787f54e6878842e22b5a3b27be7370a0308b47fa Oct 14 13:12:05.894946 master-2 kubenswrapper[4762]: I1014 13:12:05.894865 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5ddb89f76-887cs" Oct 14 13:12:05.900841 master-2 kubenswrapper[4762]: I1014 13:12:05.900785 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router 
namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:12:05.900841 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:12:05.900841 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:12:05.900841 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:12:05.901378 master-2 kubenswrapper[4762]: I1014 13:12:05.900854 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:12:06.506435 master-2 kubenswrapper[4762]: I1014 13:12:06.506341 4762 patch_prober.go:28] interesting pod/etcd-guard-master-2 container/guard namespace/openshift-etcd: Readiness probe status=failure output="Get \"https://192.168.34.12:9980/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 14 13:12:06.506700 master-2 kubenswrapper[4762]: I1014 13:12:06.506466 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-etcd/etcd-guard-master-2" podUID="1b6a1dbe-f753-4c92-8b36-47517010f2f3" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.12:9980/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 14 13:12:06.607879 master-2 kubenswrapper[4762]: I1014 13:12:06.607777 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-56cfb99cfd-rq5ck" event={"ID":"10f29de4-fd52-45da-a0d9-b9cb67146af1","Type":"ContainerStarted","Data":"5fa1d42af7af58971b70d518787f54e6878842e22b5a3b27be7370a0308b47fa"} Oct 14 13:12:06.610264 master-2 kubenswrapper[4762]: I1014 13:12:06.610219 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-2_c492168afa20f49cb6e3534e1871011b/etcd/1.log" Oct 14 13:12:06.614571 master-2 kubenswrapper[4762]: I1014 13:12:06.614513 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-2" event={"ID":"c492168afa20f49cb6e3534e1871011b","Type":"ContainerStarted","Data":"e17495a82cef8a0676e553bdfafdc70409c48a143d0e5a81d56563bbbf2398de"} Oct 14 13:12:06.619480 master-2 kubenswrapper[4762]: I1014 13:12:06.619294 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-67b9857c45-pxqsr" Oct 14 13:12:06.626199 master-2 kubenswrapper[4762]: I1014 13:12:06.626113 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-67b9857c45-pxqsr" Oct 14 13:12:06.655490 master-2 kubenswrapper[4762]: I1014 13:12:06.655358 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-master-2" podStartSLOduration=24.655338582 podStartE2EDuration="24.655338582s" podCreationTimestamp="2025-10-14 13:11:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:12:06.652517371 +0000 UTC m=+355.896676620" watchObservedRunningTime="2025-10-14 13:12:06.655338582 +0000 UTC m=+355.899497751" Oct 14 13:12:06.898209 master-2 kubenswrapper[4762]: I1014 13:12:06.898014 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router 
namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:12:06.898209 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:12:06.898209 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:12:06.898209 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:12:06.898209 master-2 kubenswrapper[4762]: I1014 13:12:06.898137 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:12:07.487666 master-2 kubenswrapper[4762]: I1014 13:12:07.487582 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-master-2" Oct 14 13:12:07.897224 master-2 kubenswrapper[4762]: I1014 13:12:07.897056 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:12:07.897224 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:12:07.897224 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:12:07.897224 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:12:07.897224 master-2 kubenswrapper[4762]: I1014 13:12:07.897163 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:12:08.233565 master-2 kubenswrapper[4762]: I1014 13:12:08.233254 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-guard-master-2" Oct 14 13:12:08.620979 master-2 kubenswrapper[4762]: I1014 13:12:08.620757 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-1-master-2"] Oct 14 13:12:08.621589 master-2 kubenswrapper[4762]: I1014 13:12:08.621530 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-2" Oct 14 13:12:08.625485 master-2 kubenswrapper[4762]: I1014 13:12:08.625437 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 14 13:12:08.625915 master-2 kubenswrapper[4762]: I1014 13:12:08.625862 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-56cfb99cfd-rq5ck" event={"ID":"10f29de4-fd52-45da-a0d9-b9cb67146af1","Type":"ContainerStarted","Data":"5766e6ab7e0fbe733c4c8f035d22a06c9a1742f998c96cf3b6ae13fa635e1fdd"} Oct 14 13:12:08.626315 master-2 kubenswrapper[4762]: I1014 13:12:08.626265 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-56cfb99cfd-rq5ck" Oct 14 13:12:08.634353 master-2 kubenswrapper[4762]: I1014 13:12:08.634282 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-56cfb99cfd-rq5ck" Oct 14 13:12:08.636640 master-2 kubenswrapper[4762]: I1014 13:12:08.636559 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-1-master-2"] Oct 14 13:12:08.686996 master-2 kubenswrapper[4762]: I1014 13:12:08.686852 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-56cfb99cfd-rq5ck" podStartSLOduration=5.122515247 podStartE2EDuration="7.686820156s" podCreationTimestamp="2025-10-14 13:12:01 +0000 UTC" firstStartedPulling="2025-10-14 13:12:05.746868158 +0000 UTC m=+354.991027317" lastFinishedPulling="2025-10-14 13:12:08.311173057 +0000 UTC m=+357.555332226" observedRunningTime="2025-10-14 13:12:08.682448776 +0000 UTC m=+357.926607945" watchObservedRunningTime="2025-10-14 13:12:08.686820156 +0000 UTC m=+357.930979345" Oct 14 13:12:08.695450 master-2 kubenswrapper[4762]: I1014 13:12:08.695319 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8d1479cd-b121-44d6-af25-3bc9b573c89f-kubelet-dir\") pod \"installer-1-master-2\" (UID: \"8d1479cd-b121-44d6-af25-3bc9b573c89f\") " pod="openshift-kube-apiserver/installer-1-master-2" Oct 14 13:12:08.696815 master-2 kubenswrapper[4762]: I1014 13:12:08.696762 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8d1479cd-b121-44d6-af25-3bc9b573c89f-var-lock\") pod \"installer-1-master-2\" (UID: \"8d1479cd-b121-44d6-af25-3bc9b573c89f\") " pod="openshift-kube-apiserver/installer-1-master-2" Oct 14 13:12:08.696931 master-2 kubenswrapper[4762]: I1014 13:12:08.696821 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8d1479cd-b121-44d6-af25-3bc9b573c89f-kube-api-access\") pod \"installer-1-master-2\" (UID: \"8d1479cd-b121-44d6-af25-3bc9b573c89f\") " pod="openshift-kube-apiserver/installer-1-master-2" Oct 14 13:12:08.797448 master-2 kubenswrapper[4762]: I1014 13:12:08.797318 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8d1479cd-b121-44d6-af25-3bc9b573c89f-var-lock\") pod \"installer-1-master-2\" (UID: \"8d1479cd-b121-44d6-af25-3bc9b573c89f\") " pod="openshift-kube-apiserver/installer-1-master-2" Oct 14 13:12:08.797779 
master-2 kubenswrapper[4762]: I1014 13:12:08.797459 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8d1479cd-b121-44d6-af25-3bc9b573c89f-kube-api-access\") pod \"installer-1-master-2\" (UID: \"8d1479cd-b121-44d6-af25-3bc9b573c89f\") " pod="openshift-kube-apiserver/installer-1-master-2" Oct 14 13:12:08.797779 master-2 kubenswrapper[4762]: I1014 13:12:08.797551 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8d1479cd-b121-44d6-af25-3bc9b573c89f-kubelet-dir\") pod \"installer-1-master-2\" (UID: \"8d1479cd-b121-44d6-af25-3bc9b573c89f\") " pod="openshift-kube-apiserver/installer-1-master-2" Oct 14 13:12:08.797779 master-2 kubenswrapper[4762]: I1014 13:12:08.797742 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8d1479cd-b121-44d6-af25-3bc9b573c89f-kubelet-dir\") pod \"installer-1-master-2\" (UID: \"8d1479cd-b121-44d6-af25-3bc9b573c89f\") " pod="openshift-kube-apiserver/installer-1-master-2" Oct 14 13:12:08.797999 master-2 kubenswrapper[4762]: I1014 13:12:08.797460 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8d1479cd-b121-44d6-af25-3bc9b573c89f-var-lock\") pod \"installer-1-master-2\" (UID: \"8d1479cd-b121-44d6-af25-3bc9b573c89f\") " pod="openshift-kube-apiserver/installer-1-master-2" Oct 14 13:12:08.832686 master-2 kubenswrapper[4762]: I1014 13:12:08.832610 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8d1479cd-b121-44d6-af25-3bc9b573c89f-kube-api-access\") pod \"installer-1-master-2\" (UID: \"8d1479cd-b121-44d6-af25-3bc9b573c89f\") " pod="openshift-kube-apiserver/installer-1-master-2" Oct 14 13:12:08.897680 master-2 kubenswrapper[4762]: I1014 13:12:08.897483 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:12:08.897680 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:12:08.897680 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:12:08.897680 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:12:08.897680 master-2 kubenswrapper[4762]: I1014 13:12:08.897563 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:12:08.946803 master-2 kubenswrapper[4762]: I1014 13:12:08.946721 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-2" Oct 14 13:12:09.447813 master-2 kubenswrapper[4762]: I1014 13:12:09.447742 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-1-master-2"] Oct 14 13:12:09.456296 master-2 kubenswrapper[4762]: W1014 13:12:09.456238 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod8d1479cd_b121_44d6_af25_3bc9b573c89f.slice/crio-8bec906cc285513f26ebd343f5fc00b8dcc58d2e1ae705f4bb443cc919ad4b29 WatchSource:0}: Error finding container 8bec906cc285513f26ebd343f5fc00b8dcc58d2e1ae705f4bb443cc919ad4b29: Status 404 returned error can't find the container with id 8bec906cc285513f26ebd343f5fc00b8dcc58d2e1ae705f4bb443cc919ad4b29 Oct 14 13:12:09.636654 master-2 kubenswrapper[4762]: I1014 13:12:09.636582 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-2" event={"ID":"8d1479cd-b121-44d6-af25-3bc9b573c89f","Type":"ContainerStarted","Data":"8bec906cc285513f26ebd343f5fc00b8dcc58d2e1ae705f4bb443cc919ad4b29"} Oct 14 13:12:09.848857 master-2 kubenswrapper[4762]: I1014 13:12:09.848246 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67b9857c45-pxqsr"] Oct 14 13:12:09.848857 master-2 kubenswrapper[4762]: I1014 13:12:09.848617 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-67b9857c45-pxqsr" podUID="d4118b6e-efca-4fe8-b8a5-9356e039c50b" containerName="route-controller-manager" containerID="cri-o://023d89f57e00998dafa3207f7ad9423999a8fc9588a40bc50b74d64112f21b89" gracePeriod=30 Oct 14 13:12:09.898965 master-2 kubenswrapper[4762]: I1014 13:12:09.898888 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:12:09.898965 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:12:09.898965 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:12:09.898965 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:12:09.899740 master-2 kubenswrapper[4762]: I1014 13:12:09.898967 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:12:10.068728 master-2 kubenswrapper[4762]: I1014 13:12:10.068677 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kpbmd"] Oct 14 13:12:10.069422 master-2 kubenswrapper[4762]: I1014 13:12:10.069354 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kpbmd" podUID="de57a213-4820-46c7-9506-4c3ea762d75f" containerName="registry-server" containerID="cri-o://63bcefd8ebc425f66b4390b63162f41940c95f62da6bcf268fa6a40f5511559d" gracePeriod=2 Oct 14 13:12:10.105167 master-2 kubenswrapper[4762]: I1014 13:12:10.105108 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_machine-config-server-zxkbj_063758c3-98fe-4ac4-b1c2-beef7ef6dcdc/machine-config-server/0.log" Oct 14 13:12:10.272501 master-2 kubenswrapper[4762]: I1014 13:12:10.272462 4762 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cf69d"] Oct 14 13:12:10.273304 master-2 kubenswrapper[4762]: I1014 13:12:10.273278 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cf69d" podUID="4ffcca20-3c44-4e24-92c8-15d3dc0625e4" containerName="registry-server" containerID="cri-o://81aec3b49a476404f7dc250f33c9194087a82b2bf5ea44822de93e2042a91529" gracePeriod=2 Oct 14 13:12:10.300624 master-2 kubenswrapper[4762]: I1014 13:12:10.300578 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67b9857c45-pxqsr" Oct 14 13:12:10.433220 master-2 kubenswrapper[4762]: I1014 13:12:10.419207 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zrk7\" (UniqueName: \"kubernetes.io/projected/d4118b6e-efca-4fe8-b8a5-9356e039c50b-kube-api-access-7zrk7\") pod \"d4118b6e-efca-4fe8-b8a5-9356e039c50b\" (UID: \"d4118b6e-efca-4fe8-b8a5-9356e039c50b\") " Oct 14 13:12:10.433220 master-2 kubenswrapper[4762]: I1014 13:12:10.419348 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4118b6e-efca-4fe8-b8a5-9356e039c50b-config\") pod \"d4118b6e-efca-4fe8-b8a5-9356e039c50b\" (UID: \"d4118b6e-efca-4fe8-b8a5-9356e039c50b\") " Oct 14 13:12:10.433220 master-2 kubenswrapper[4762]: I1014 13:12:10.419399 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4118b6e-efca-4fe8-b8a5-9356e039c50b-serving-cert\") pod \"d4118b6e-efca-4fe8-b8a5-9356e039c50b\" (UID: \"d4118b6e-efca-4fe8-b8a5-9356e039c50b\") " Oct 14 13:12:10.433220 master-2 kubenswrapper[4762]: I1014 13:12:10.419502 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d4118b6e-efca-4fe8-b8a5-9356e039c50b-client-ca\") pod \"d4118b6e-efca-4fe8-b8a5-9356e039c50b\" (UID: \"d4118b6e-efca-4fe8-b8a5-9356e039c50b\") " Oct 14 13:12:10.433220 master-2 kubenswrapper[4762]: I1014 13:12:10.420507 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4118b6e-efca-4fe8-b8a5-9356e039c50b-config" (OuterVolumeSpecName: "config") pod "d4118b6e-efca-4fe8-b8a5-9356e039c50b" (UID: "d4118b6e-efca-4fe8-b8a5-9356e039c50b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:12:10.433220 master-2 kubenswrapper[4762]: I1014 13:12:10.420566 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4118b6e-efca-4fe8-b8a5-9356e039c50b-client-ca" (OuterVolumeSpecName: "client-ca") pod "d4118b6e-efca-4fe8-b8a5-9356e039c50b" (UID: "d4118b6e-efca-4fe8-b8a5-9356e039c50b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:12:10.433220 master-2 kubenswrapper[4762]: I1014 13:12:10.429018 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4118b6e-efca-4fe8-b8a5-9356e039c50b-kube-api-access-7zrk7" (OuterVolumeSpecName: "kube-api-access-7zrk7") pod "d4118b6e-efca-4fe8-b8a5-9356e039c50b" (UID: "d4118b6e-efca-4fe8-b8a5-9356e039c50b"). InnerVolumeSpecName "kube-api-access-7zrk7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:12:10.433220 master-2 kubenswrapper[4762]: I1014 13:12:10.429615 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4118b6e-efca-4fe8-b8a5-9356e039c50b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d4118b6e-efca-4fe8-b8a5-9356e039c50b" (UID: "d4118b6e-efca-4fe8-b8a5-9356e039c50b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:12:10.480498 master-2 kubenswrapper[4762]: I1014 13:12:10.480407 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-629l7"] Oct 14 13:12:10.480643 master-2 kubenswrapper[4762]: E1014 13:12:10.480608 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4118b6e-efca-4fe8-b8a5-9356e039c50b" containerName="route-controller-manager" Oct 14 13:12:10.480643 master-2 kubenswrapper[4762]: I1014 13:12:10.480626 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4118b6e-efca-4fe8-b8a5-9356e039c50b" containerName="route-controller-manager" Oct 14 13:12:10.480737 master-2 kubenswrapper[4762]: I1014 13:12:10.480711 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4118b6e-efca-4fe8-b8a5-9356e039c50b" containerName="route-controller-manager" Oct 14 13:12:10.481294 master-2 kubenswrapper[4762]: I1014 13:12:10.481261 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-629l7" Oct 14 13:12:10.495476 master-2 kubenswrapper[4762]: I1014 13:12:10.487764 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-lrzch" Oct 14 13:12:10.498485 master-2 kubenswrapper[4762]: I1014 13:12:10.498428 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-629l7"] Oct 14 13:12:10.507482 master-2 kubenswrapper[4762]: I1014 13:12:10.507420 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kpbmd" Oct 14 13:12:10.520869 master-2 kubenswrapper[4762]: I1014 13:12:10.520835 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zrk7\" (UniqueName: \"kubernetes.io/projected/d4118b6e-efca-4fe8-b8a5-9356e039c50b-kube-api-access-7zrk7\") on node \"master-2\" DevicePath \"\"" Oct 14 13:12:10.520991 master-2 kubenswrapper[4762]: I1014 13:12:10.520880 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4118b6e-efca-4fe8-b8a5-9356e039c50b-config\") on node \"master-2\" DevicePath \"\"" Oct 14 13:12:10.520991 master-2 kubenswrapper[4762]: I1014 13:12:10.520891 4762 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4118b6e-efca-4fe8-b8a5-9356e039c50b-serving-cert\") on node \"master-2\" DevicePath \"\"" Oct 14 13:12:10.520991 master-2 kubenswrapper[4762]: I1014 13:12:10.520901 4762 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d4118b6e-efca-4fe8-b8a5-9356e039c50b-client-ca\") on node \"master-2\" DevicePath \"\"" Oct 14 13:12:10.622400 master-2 kubenswrapper[4762]: I1014 13:12:10.622329 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de57a213-4820-46c7-9506-4c3ea762d75f-utilities\") pod \"de57a213-4820-46c7-9506-4c3ea762d75f\" (UID: \"de57a213-4820-46c7-9506-4c3ea762d75f\") " Oct 14 13:12:10.622667 master-2 kubenswrapper[4762]: I1014 13:12:10.622636 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9f4f\" (UniqueName: \"kubernetes.io/projected/de57a213-4820-46c7-9506-4c3ea762d75f-kube-api-access-r9f4f\") pod \"de57a213-4820-46c7-9506-4c3ea762d75f\" (UID: \"de57a213-4820-46c7-9506-4c3ea762d75f\") " Oct 14 13:12:10.622743 master-2 kubenswrapper[4762]: I1014 13:12:10.622706 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de57a213-4820-46c7-9506-4c3ea762d75f-catalog-content\") pod \"de57a213-4820-46c7-9506-4c3ea762d75f\" (UID: \"de57a213-4820-46c7-9506-4c3ea762d75f\") " Oct 14 13:12:10.623122 master-2 kubenswrapper[4762]: I1014 13:12:10.623000 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c9dd1f9-3220-4d79-b376-598b14c8e5e7-catalog-content\") pod \"certified-operators-629l7\" (UID: \"6c9dd1f9-3220-4d79-b376-598b14c8e5e7\") " pod="openshift-marketplace/certified-operators-629l7" Oct 14 13:12:10.623258 master-2 kubenswrapper[4762]: I1014 13:12:10.623134 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhsx6\" (UniqueName: \"kubernetes.io/projected/6c9dd1f9-3220-4d79-b376-598b14c8e5e7-kube-api-access-zhsx6\") pod \"certified-operators-629l7\" (UID: \"6c9dd1f9-3220-4d79-b376-598b14c8e5e7\") " pod="openshift-marketplace/certified-operators-629l7" Oct 14 13:12:10.623320 master-2 kubenswrapper[4762]: I1014 13:12:10.623274 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c9dd1f9-3220-4d79-b376-598b14c8e5e7-utilities\") pod \"certified-operators-629l7\" (UID: \"6c9dd1f9-3220-4d79-b376-598b14c8e5e7\") " 
pod="openshift-marketplace/certified-operators-629l7" Oct 14 13:12:10.623947 master-2 kubenswrapper[4762]: I1014 13:12:10.623860 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de57a213-4820-46c7-9506-4c3ea762d75f-utilities" (OuterVolumeSpecName: "utilities") pod "de57a213-4820-46c7-9506-4c3ea762d75f" (UID: "de57a213-4820-46c7-9506-4c3ea762d75f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:12:10.642544 master-2 kubenswrapper[4762]: I1014 13:12:10.633614 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de57a213-4820-46c7-9506-4c3ea762d75f-kube-api-access-r9f4f" (OuterVolumeSpecName: "kube-api-access-r9f4f") pod "de57a213-4820-46c7-9506-4c3ea762d75f" (UID: "de57a213-4820-46c7-9506-4c3ea762d75f"). InnerVolumeSpecName "kube-api-access-r9f4f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:12:10.646251 master-2 kubenswrapper[4762]: I1014 13:12:10.644722 4762 generic.go:334] "Generic (PLEG): container finished" podID="de57a213-4820-46c7-9506-4c3ea762d75f" containerID="63bcefd8ebc425f66b4390b63162f41940c95f62da6bcf268fa6a40f5511559d" exitCode=0 Oct 14 13:12:10.646251 master-2 kubenswrapper[4762]: I1014 13:12:10.644838 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kpbmd" event={"ID":"de57a213-4820-46c7-9506-4c3ea762d75f","Type":"ContainerDied","Data":"63bcefd8ebc425f66b4390b63162f41940c95f62da6bcf268fa6a40f5511559d"} Oct 14 13:12:10.646251 master-2 kubenswrapper[4762]: I1014 13:12:10.644877 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kpbmd" event={"ID":"de57a213-4820-46c7-9506-4c3ea762d75f","Type":"ContainerDied","Data":"07e46b54a3e0906bcab79074e6145a6ef11a5bc4159bce600eaa3f419715d2d5"} Oct 14 13:12:10.646251 master-2 kubenswrapper[4762]: I1014 13:12:10.644923 4762 scope.go:117] "RemoveContainer" containerID="63bcefd8ebc425f66b4390b63162f41940c95f62da6bcf268fa6a40f5511559d" Oct 14 13:12:10.646251 master-2 kubenswrapper[4762]: I1014 13:12:10.645134 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kpbmd" Oct 14 13:12:10.649254 master-2 kubenswrapper[4762]: I1014 13:12:10.649204 4762 generic.go:334] "Generic (PLEG): container finished" podID="4ffcca20-3c44-4e24-92c8-15d3dc0625e4" containerID="81aec3b49a476404f7dc250f33c9194087a82b2bf5ea44822de93e2042a91529" exitCode=0 Oct 14 13:12:10.649602 master-2 kubenswrapper[4762]: I1014 13:12:10.649280 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cf69d" event={"ID":"4ffcca20-3c44-4e24-92c8-15d3dc0625e4","Type":"ContainerDied","Data":"81aec3b49a476404f7dc250f33c9194087a82b2bf5ea44822de93e2042a91529"} Oct 14 13:12:10.651058 master-2 kubenswrapper[4762]: I1014 13:12:10.650953 4762 generic.go:334] "Generic (PLEG): container finished" podID="d4118b6e-efca-4fe8-b8a5-9356e039c50b" containerID="023d89f57e00998dafa3207f7ad9423999a8fc9588a40bc50b74d64112f21b89" exitCode=0 Oct 14 13:12:10.651176 master-2 kubenswrapper[4762]: I1014 13:12:10.651073 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67b9857c45-pxqsr" event={"ID":"d4118b6e-efca-4fe8-b8a5-9356e039c50b","Type":"ContainerDied","Data":"023d89f57e00998dafa3207f7ad9423999a8fc9588a40bc50b74d64112f21b89"} Oct 14 13:12:10.651176 master-2 kubenswrapper[4762]: I1014 13:12:10.651136 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67b9857c45-pxqsr" event={"ID":"d4118b6e-efca-4fe8-b8a5-9356e039c50b","Type":"ContainerDied","Data":"63fc759a44ac6b15a34713c6563b51160527ff84c8d7a7e747ee648725a526b4"} Oct 14 13:12:10.651272 master-2 kubenswrapper[4762]: I1014 13:12:10.651232 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67b9857c45-pxqsr" Oct 14 13:12:10.656138 master-2 kubenswrapper[4762]: I1014 13:12:10.655068 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-2" event={"ID":"8d1479cd-b121-44d6-af25-3bc9b573c89f","Type":"ContainerStarted","Data":"97643b9390c296873cc51961bb7ec70a80fba35c5e5c0df557ef033fe557b704"} Oct 14 13:12:10.658685 master-2 kubenswrapper[4762]: I1014 13:12:10.658654 4762 scope.go:117] "RemoveContainer" containerID="0369989a7890d7012a8fa4ace4982cbc10434e9d603a534c21ab06c2c323b241" Oct 14 13:12:10.668939 master-2 kubenswrapper[4762]: I1014 13:12:10.668869 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de57a213-4820-46c7-9506-4c3ea762d75f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "de57a213-4820-46c7-9506-4c3ea762d75f" (UID: "de57a213-4820-46c7-9506-4c3ea762d75f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:12:10.682633 master-2 kubenswrapper[4762]: I1014 13:12:10.682563 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7flhc"] Oct 14 13:12:10.682926 master-2 kubenswrapper[4762]: E1014 13:12:10.682883 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de57a213-4820-46c7-9506-4c3ea762d75f" containerName="registry-server" Oct 14 13:12:10.682926 master-2 kubenswrapper[4762]: I1014 13:12:10.682919 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="de57a213-4820-46c7-9506-4c3ea762d75f" containerName="registry-server" Oct 14 13:12:10.683274 master-2 kubenswrapper[4762]: E1014 13:12:10.683182 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de57a213-4820-46c7-9506-4c3ea762d75f" containerName="extract-utilities" Oct 14 13:12:10.683274 master-2 kubenswrapper[4762]: I1014 13:12:10.683199 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="de57a213-4820-46c7-9506-4c3ea762d75f" containerName="extract-utilities" Oct 14 13:12:10.683274 master-2 kubenswrapper[4762]: E1014 13:12:10.683229 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de57a213-4820-46c7-9506-4c3ea762d75f" containerName="extract-content" Oct 14 13:12:10.683274 master-2 kubenswrapper[4762]: I1014 13:12:10.683260 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="de57a213-4820-46c7-9506-4c3ea762d75f" containerName="extract-content" Oct 14 13:12:10.683432 master-2 kubenswrapper[4762]: I1014 13:12:10.683375 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="de57a213-4820-46c7-9506-4c3ea762d75f" containerName="registry-server" Oct 14 13:12:10.684605 master-2 kubenswrapper[4762]: I1014 13:12:10.684565 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7flhc" Oct 14 13:12:10.685800 master-2 kubenswrapper[4762]: I1014 13:12:10.685722 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-1-master-2" podStartSLOduration=2.685701198 podStartE2EDuration="2.685701198s" podCreationTimestamp="2025-10-14 13:12:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:12:10.678442424 +0000 UTC m=+359.922601603" watchObservedRunningTime="2025-10-14 13:12:10.685701198 +0000 UTC m=+359.929860367" Oct 14 13:12:10.688254 master-2 kubenswrapper[4762]: I1014 13:12:10.688186 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-hqt82" Oct 14 13:12:10.706939 master-2 kubenswrapper[4762]: I1014 13:12:10.706845 4762 scope.go:117] "RemoveContainer" containerID="a1afc75f8784267a38b3ea0bb01ba75e05baaa4a0fd1249098b107d427546e91" Oct 14 13:12:10.707833 master-2 kubenswrapper[4762]: I1014 13:12:10.707746 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cf69d" Oct 14 13:12:10.709939 master-2 kubenswrapper[4762]: I1014 13:12:10.709822 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7flhc"] Oct 14 13:12:10.725284 master-2 kubenswrapper[4762]: I1014 13:12:10.725232 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ffcca20-3c44-4e24-92c8-15d3dc0625e4-utilities\") pod \"4ffcca20-3c44-4e24-92c8-15d3dc0625e4\" (UID: \"4ffcca20-3c44-4e24-92c8-15d3dc0625e4\") " Oct 14 13:12:10.725373 master-2 kubenswrapper[4762]: I1014 13:12:10.725304 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpx44\" (UniqueName: \"kubernetes.io/projected/4ffcca20-3c44-4e24-92c8-15d3dc0625e4-kube-api-access-qpx44\") pod \"4ffcca20-3c44-4e24-92c8-15d3dc0625e4\" (UID: \"4ffcca20-3c44-4e24-92c8-15d3dc0625e4\") " Oct 14 13:12:10.725373 master-2 kubenswrapper[4762]: I1014 13:12:10.725351 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ffcca20-3c44-4e24-92c8-15d3dc0625e4-catalog-content\") pod \"4ffcca20-3c44-4e24-92c8-15d3dc0625e4\" (UID: \"4ffcca20-3c44-4e24-92c8-15d3dc0625e4\") " Oct 14 13:12:10.725477 master-2 kubenswrapper[4762]: I1014 13:12:10.725448 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8758d788-dfc7-45cc-985e-3df1534fb2c9-catalog-content\") pod \"community-operators-7flhc\" (UID: \"8758d788-dfc7-45cc-985e-3df1534fb2c9\") " pod="openshift-marketplace/community-operators-7flhc" Oct 14 13:12:10.725543 master-2 kubenswrapper[4762]: I1014 13:12:10.725492 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c9dd1f9-3220-4d79-b376-598b14c8e5e7-catalog-content\") pod \"certified-operators-629l7\" (UID: \"6c9dd1f9-3220-4d79-b376-598b14c8e5e7\") " pod="openshift-marketplace/certified-operators-629l7" Oct 14 13:12:10.725543 master-2 kubenswrapper[4762]: I1014 13:12:10.725524 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpwkf\" (UniqueName: \"kubernetes.io/projected/8758d788-dfc7-45cc-985e-3df1534fb2c9-kube-api-access-mpwkf\") pod \"community-operators-7flhc\" (UID: \"8758d788-dfc7-45cc-985e-3df1534fb2c9\") " pod="openshift-marketplace/community-operators-7flhc" Oct 14 13:12:10.725679 master-2 kubenswrapper[4762]: I1014 13:12:10.725562 4762 scope.go:117] "RemoveContainer" containerID="63bcefd8ebc425f66b4390b63162f41940c95f62da6bcf268fa6a40f5511559d" Oct 14 13:12:10.725679 master-2 kubenswrapper[4762]: I1014 13:12:10.725596 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhsx6\" (UniqueName: \"kubernetes.io/projected/6c9dd1f9-3220-4d79-b376-598b14c8e5e7-kube-api-access-zhsx6\") pod \"certified-operators-629l7\" (UID: \"6c9dd1f9-3220-4d79-b376-598b14c8e5e7\") " pod="openshift-marketplace/certified-operators-629l7" Oct 14 13:12:10.725980 master-2 kubenswrapper[4762]: I1014 13:12:10.725916 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8758d788-dfc7-45cc-985e-3df1534fb2c9-utilities\") pod 
\"community-operators-7flhc\" (UID: \"8758d788-dfc7-45cc-985e-3df1534fb2c9\") " pod="openshift-marketplace/community-operators-7flhc" Oct 14 13:12:10.726052 master-2 kubenswrapper[4762]: I1014 13:12:10.725996 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c9dd1f9-3220-4d79-b376-598b14c8e5e7-utilities\") pod \"certified-operators-629l7\" (UID: \"6c9dd1f9-3220-4d79-b376-598b14c8e5e7\") " pod="openshift-marketplace/certified-operators-629l7" Oct 14 13:12:10.726542 master-2 kubenswrapper[4762]: I1014 13:12:10.726397 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9f4f\" (UniqueName: \"kubernetes.io/projected/de57a213-4820-46c7-9506-4c3ea762d75f-kube-api-access-r9f4f\") on node \"master-2\" DevicePath \"\"" Oct 14 13:12:10.726788 master-2 kubenswrapper[4762]: I1014 13:12:10.726663 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de57a213-4820-46c7-9506-4c3ea762d75f-catalog-content\") on node \"master-2\" DevicePath \"\"" Oct 14 13:12:10.726788 master-2 kubenswrapper[4762]: I1014 13:12:10.726708 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de57a213-4820-46c7-9506-4c3ea762d75f-utilities\") on node \"master-2\" DevicePath \"\"" Oct 14 13:12:10.727453 master-2 kubenswrapper[4762]: E1014 13:12:10.727351 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63bcefd8ebc425f66b4390b63162f41940c95f62da6bcf268fa6a40f5511559d\": container with ID starting with 63bcefd8ebc425f66b4390b63162f41940c95f62da6bcf268fa6a40f5511559d not found: ID does not exist" containerID="63bcefd8ebc425f66b4390b63162f41940c95f62da6bcf268fa6a40f5511559d" Oct 14 13:12:10.727453 master-2 kubenswrapper[4762]: I1014 13:12:10.727410 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63bcefd8ebc425f66b4390b63162f41940c95f62da6bcf268fa6a40f5511559d"} err="failed to get container status \"63bcefd8ebc425f66b4390b63162f41940c95f62da6bcf268fa6a40f5511559d\": rpc error: code = NotFound desc = could not find container \"63bcefd8ebc425f66b4390b63162f41940c95f62da6bcf268fa6a40f5511559d\": container with ID starting with 63bcefd8ebc425f66b4390b63162f41940c95f62da6bcf268fa6a40f5511559d not found: ID does not exist" Oct 14 13:12:10.727453 master-2 kubenswrapper[4762]: I1014 13:12:10.727448 4762 scope.go:117] "RemoveContainer" containerID="0369989a7890d7012a8fa4ace4982cbc10434e9d603a534c21ab06c2c323b241" Oct 14 13:12:10.727986 master-2 kubenswrapper[4762]: E1014 13:12:10.727913 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0369989a7890d7012a8fa4ace4982cbc10434e9d603a534c21ab06c2c323b241\": container with ID starting with 0369989a7890d7012a8fa4ace4982cbc10434e9d603a534c21ab06c2c323b241 not found: ID does not exist" containerID="0369989a7890d7012a8fa4ace4982cbc10434e9d603a534c21ab06c2c323b241" Oct 14 13:12:10.727986 master-2 kubenswrapper[4762]: I1014 13:12:10.727946 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0369989a7890d7012a8fa4ace4982cbc10434e9d603a534c21ab06c2c323b241"} err="failed to get container status \"0369989a7890d7012a8fa4ace4982cbc10434e9d603a534c21ab06c2c323b241\": rpc error: code = NotFound desc = could not find container 
\"0369989a7890d7012a8fa4ace4982cbc10434e9d603a534c21ab06c2c323b241\": container with ID starting with 0369989a7890d7012a8fa4ace4982cbc10434e9d603a534c21ab06c2c323b241 not found: ID does not exist" Oct 14 13:12:10.727986 master-2 kubenswrapper[4762]: I1014 13:12:10.727964 4762 scope.go:117] "RemoveContainer" containerID="a1afc75f8784267a38b3ea0bb01ba75e05baaa4a0fd1249098b107d427546e91" Oct 14 13:12:10.728434 master-2 kubenswrapper[4762]: E1014 13:12:10.728327 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1afc75f8784267a38b3ea0bb01ba75e05baaa4a0fd1249098b107d427546e91\": container with ID starting with a1afc75f8784267a38b3ea0bb01ba75e05baaa4a0fd1249098b107d427546e91 not found: ID does not exist" containerID="a1afc75f8784267a38b3ea0bb01ba75e05baaa4a0fd1249098b107d427546e91" Oct 14 13:12:10.728434 master-2 kubenswrapper[4762]: I1014 13:12:10.728367 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1afc75f8784267a38b3ea0bb01ba75e05baaa4a0fd1249098b107d427546e91"} err="failed to get container status \"a1afc75f8784267a38b3ea0bb01ba75e05baaa4a0fd1249098b107d427546e91\": rpc error: code = NotFound desc = could not find container \"a1afc75f8784267a38b3ea0bb01ba75e05baaa4a0fd1249098b107d427546e91\": container with ID starting with a1afc75f8784267a38b3ea0bb01ba75e05baaa4a0fd1249098b107d427546e91 not found: ID does not exist" Oct 14 13:12:10.728434 master-2 kubenswrapper[4762]: I1014 13:12:10.728387 4762 scope.go:117] "RemoveContainer" containerID="023d89f57e00998dafa3207f7ad9423999a8fc9588a40bc50b74d64112f21b89" Oct 14 13:12:10.736750 master-2 kubenswrapper[4762]: I1014 13:12:10.736701 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67b9857c45-pxqsr"] Oct 14 13:12:10.739538 master-2 kubenswrapper[4762]: I1014 13:12:10.739296 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ffcca20-3c44-4e24-92c8-15d3dc0625e4-utilities" (OuterVolumeSpecName: "utilities") pod "4ffcca20-3c44-4e24-92c8-15d3dc0625e4" (UID: "4ffcca20-3c44-4e24-92c8-15d3dc0625e4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:12:10.740175 master-2 kubenswrapper[4762]: I1014 13:12:10.740093 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c9dd1f9-3220-4d79-b376-598b14c8e5e7-catalog-content\") pod \"certified-operators-629l7\" (UID: \"6c9dd1f9-3220-4d79-b376-598b14c8e5e7\") " pod="openshift-marketplace/certified-operators-629l7" Oct 14 13:12:10.740175 master-2 kubenswrapper[4762]: I1014 13:12:10.740115 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c9dd1f9-3220-4d79-b376-598b14c8e5e7-utilities\") pod \"certified-operators-629l7\" (UID: \"6c9dd1f9-3220-4d79-b376-598b14c8e5e7\") " pod="openshift-marketplace/certified-operators-629l7" Oct 14 13:12:10.742829 master-2 kubenswrapper[4762]: I1014 13:12:10.742727 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ffcca20-3c44-4e24-92c8-15d3dc0625e4-kube-api-access-qpx44" (OuterVolumeSpecName: "kube-api-access-qpx44") pod "4ffcca20-3c44-4e24-92c8-15d3dc0625e4" (UID: "4ffcca20-3c44-4e24-92c8-15d3dc0625e4"). InnerVolumeSpecName "kube-api-access-qpx44". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:12:10.745124 master-2 kubenswrapper[4762]: I1014 13:12:10.745066 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67b9857c45-pxqsr"] Oct 14 13:12:10.749225 master-2 kubenswrapper[4762]: I1014 13:12:10.749138 4762 scope.go:117] "RemoveContainer" containerID="023d89f57e00998dafa3207f7ad9423999a8fc9588a40bc50b74d64112f21b89" Oct 14 13:12:10.749845 master-2 kubenswrapper[4762]: E1014 13:12:10.749791 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"023d89f57e00998dafa3207f7ad9423999a8fc9588a40bc50b74d64112f21b89\": container with ID starting with 023d89f57e00998dafa3207f7ad9423999a8fc9588a40bc50b74d64112f21b89 not found: ID does not exist" containerID="023d89f57e00998dafa3207f7ad9423999a8fc9588a40bc50b74d64112f21b89" Oct 14 13:12:10.749910 master-2 kubenswrapper[4762]: I1014 13:12:10.749852 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"023d89f57e00998dafa3207f7ad9423999a8fc9588a40bc50b74d64112f21b89"} err="failed to get container status \"023d89f57e00998dafa3207f7ad9423999a8fc9588a40bc50b74d64112f21b89\": rpc error: code = NotFound desc = could not find container \"023d89f57e00998dafa3207f7ad9423999a8fc9588a40bc50b74d64112f21b89\": container with ID starting with 023d89f57e00998dafa3207f7ad9423999a8fc9588a40bc50b74d64112f21b89 not found: ID does not exist" Oct 14 13:12:10.753443 master-2 kubenswrapper[4762]: I1014 13:12:10.753400 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhsx6\" (UniqueName: \"kubernetes.io/projected/6c9dd1f9-3220-4d79-b376-598b14c8e5e7-kube-api-access-zhsx6\") pod \"certified-operators-629l7\" (UID: \"6c9dd1f9-3220-4d79-b376-598b14c8e5e7\") " pod="openshift-marketplace/certified-operators-629l7" Oct 14 13:12:10.801531 master-2 kubenswrapper[4762]: I1014 13:12:10.801368 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ffcca20-3c44-4e24-92c8-15d3dc0625e4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4ffcca20-3c44-4e24-92c8-15d3dc0625e4" (UID: "4ffcca20-3c44-4e24-92c8-15d3dc0625e4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:12:10.820964 master-2 kubenswrapper[4762]: I1014 13:12:10.820800 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-629l7" Oct 14 13:12:10.827872 master-2 kubenswrapper[4762]: I1014 13:12:10.827810 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8758d788-dfc7-45cc-985e-3df1534fb2c9-catalog-content\") pod \"community-operators-7flhc\" (UID: \"8758d788-dfc7-45cc-985e-3df1534fb2c9\") " pod="openshift-marketplace/community-operators-7flhc" Oct 14 13:12:10.828001 master-2 kubenswrapper[4762]: I1014 13:12:10.827937 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpwkf\" (UniqueName: \"kubernetes.io/projected/8758d788-dfc7-45cc-985e-3df1534fb2c9-kube-api-access-mpwkf\") pod \"community-operators-7flhc\" (UID: \"8758d788-dfc7-45cc-985e-3df1534fb2c9\") " pod="openshift-marketplace/community-operators-7flhc" Oct 14 13:12:10.828180 master-2 kubenswrapper[4762]: I1014 13:12:10.828100 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8758d788-dfc7-45cc-985e-3df1534fb2c9-utilities\") pod \"community-operators-7flhc\" (UID: \"8758d788-dfc7-45cc-985e-3df1534fb2c9\") " pod="openshift-marketplace/community-operators-7flhc" Oct 14 13:12:10.828324 master-2 kubenswrapper[4762]: I1014 13:12:10.828294 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpx44\" (UniqueName: \"kubernetes.io/projected/4ffcca20-3c44-4e24-92c8-15d3dc0625e4-kube-api-access-qpx44\") on node \"master-2\" DevicePath \"\"" Oct 14 13:12:10.828388 master-2 kubenswrapper[4762]: I1014 13:12:10.828324 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ffcca20-3c44-4e24-92c8-15d3dc0625e4-catalog-content\") on node \"master-2\" DevicePath \"\"" Oct 14 13:12:10.828388 master-2 kubenswrapper[4762]: I1014 13:12:10.828346 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ffcca20-3c44-4e24-92c8-15d3dc0625e4-utilities\") on node \"master-2\" DevicePath \"\"" Oct 14 13:12:10.828688 master-2 kubenswrapper[4762]: I1014 13:12:10.828599 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8758d788-dfc7-45cc-985e-3df1534fb2c9-catalog-content\") pod \"community-operators-7flhc\" (UID: \"8758d788-dfc7-45cc-985e-3df1534fb2c9\") " pod="openshift-marketplace/community-operators-7flhc" Oct 14 13:12:10.829042 master-2 kubenswrapper[4762]: I1014 13:12:10.828717 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8758d788-dfc7-45cc-985e-3df1534fb2c9-utilities\") pod \"community-operators-7flhc\" (UID: \"8758d788-dfc7-45cc-985e-3df1534fb2c9\") " pod="openshift-marketplace/community-operators-7flhc" Oct 14 13:12:10.852377 master-2 kubenswrapper[4762]: I1014 13:12:10.852299 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpwkf\" (UniqueName: \"kubernetes.io/projected/8758d788-dfc7-45cc-985e-3df1534fb2c9-kube-api-access-mpwkf\") pod \"community-operators-7flhc\" (UID: \"8758d788-dfc7-45cc-985e-3df1534fb2c9\") " pod="openshift-marketplace/community-operators-7flhc" Oct 14 13:12:10.898192 master-2 kubenswrapper[4762]: I1014 13:12:10.897415 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router 
namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:12:10.898192 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:12:10.898192 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:12:10.898192 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:12:10.898192 master-2 kubenswrapper[4762]: I1014 13:12:10.897490 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:12:10.988144 master-2 kubenswrapper[4762]: I1014 13:12:10.987998 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kpbmd"] Oct 14 13:12:10.993804 master-2 kubenswrapper[4762]: I1014 13:12:10.993766 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kpbmd"] Oct 14 13:12:11.022189 master-2 kubenswrapper[4762]: I1014 13:12:11.022071 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7flhc" Oct 14 13:12:11.291210 master-2 kubenswrapper[4762]: I1014 13:12:11.291040 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-629l7"] Oct 14 13:12:11.300517 master-2 kubenswrapper[4762]: W1014 13:12:11.300435 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c9dd1f9_3220_4d79_b376_598b14c8e5e7.slice/crio-ce4f87cd32edd789b06cb840725aba5ce62163d2bd0ef666e1b40734bb15d3f9 WatchSource:0}: Error finding container ce4f87cd32edd789b06cb840725aba5ce62163d2bd0ef666e1b40734bb15d3f9: Status 404 returned error can't find the container with id ce4f87cd32edd789b06cb840725aba5ce62163d2bd0ef666e1b40734bb15d3f9 Oct 14 13:12:11.314294 master-2 kubenswrapper[4762]: I1014 13:12:11.313644 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7flhc"] Oct 14 13:12:11.539782 master-2 kubenswrapper[4762]: I1014 13:12:11.539727 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77674cffc8-gf5tz"] Oct 14 13:12:11.540045 master-2 kubenswrapper[4762]: E1014 13:12:11.539939 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ffcca20-3c44-4e24-92c8-15d3dc0625e4" containerName="extract-content" Oct 14 13:12:11.540045 master-2 kubenswrapper[4762]: I1014 13:12:11.539955 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ffcca20-3c44-4e24-92c8-15d3dc0625e4" containerName="extract-content" Oct 14 13:12:11.540045 master-2 kubenswrapper[4762]: E1014 13:12:11.539976 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ffcca20-3c44-4e24-92c8-15d3dc0625e4" containerName="registry-server" Oct 14 13:12:11.540045 master-2 kubenswrapper[4762]: I1014 13:12:11.539985 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ffcca20-3c44-4e24-92c8-15d3dc0625e4" containerName="registry-server" Oct 14 13:12:11.540045 master-2 kubenswrapper[4762]: E1014 13:12:11.539997 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ffcca20-3c44-4e24-92c8-15d3dc0625e4" containerName="extract-utilities" Oct 14 13:12:11.540045 master-2 kubenswrapper[4762]: I1014 
13:12:11.540005 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ffcca20-3c44-4e24-92c8-15d3dc0625e4" containerName="extract-utilities" Oct 14 13:12:11.540676 master-2 kubenswrapper[4762]: I1014 13:12:11.540097 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ffcca20-3c44-4e24-92c8-15d3dc0625e4" containerName="registry-server" Oct 14 13:12:11.540847 master-2 kubenswrapper[4762]: I1014 13:12:11.540813 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-77674cffc8-gf5tz" Oct 14 13:12:11.543330 master-2 kubenswrapper[4762]: I1014 13:12:11.543058 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 14 13:12:11.543330 master-2 kubenswrapper[4762]: I1014 13:12:11.543097 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 14 13:12:11.543330 master-2 kubenswrapper[4762]: I1014 13:12:11.543313 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 14 13:12:11.543627 master-2 kubenswrapper[4762]: I1014 13:12:11.543386 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 14 13:12:11.544057 master-2 kubenswrapper[4762]: I1014 13:12:11.543865 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 14 13:12:11.545479 master-2 kubenswrapper[4762]: I1014 13:12:11.545400 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77674cffc8-gf5tz"] Oct 14 13:12:11.555986 master-2 kubenswrapper[4762]: I1014 13:12:11.555919 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4118b6e-efca-4fe8-b8a5-9356e039c50b" path="/var/lib/kubelet/pods/d4118b6e-efca-4fe8-b8a5-9356e039c50b/volumes" Oct 14 13:12:11.561206 master-2 kubenswrapper[4762]: I1014 13:12:11.558958 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de57a213-4820-46c7-9506-4c3ea762d75f" path="/var/lib/kubelet/pods/de57a213-4820-46c7-9506-4c3ea762d75f/volumes" Oct 14 13:12:11.639418 master-2 kubenswrapper[4762]: I1014 13:12:11.639372 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/213dbdcb-5bd7-48a6-9365-1f643ea3bbea-config\") pod \"route-controller-manager-77674cffc8-gf5tz\" (UID: \"213dbdcb-5bd7-48a6-9365-1f643ea3bbea\") " pod="openshift-route-controller-manager/route-controller-manager-77674cffc8-gf5tz" Oct 14 13:12:11.639579 master-2 kubenswrapper[4762]: I1014 13:12:11.639491 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgz95\" (UniqueName: \"kubernetes.io/projected/213dbdcb-5bd7-48a6-9365-1f643ea3bbea-kube-api-access-dgz95\") pod \"route-controller-manager-77674cffc8-gf5tz\" (UID: \"213dbdcb-5bd7-48a6-9365-1f643ea3bbea\") " pod="openshift-route-controller-manager/route-controller-manager-77674cffc8-gf5tz" Oct 14 13:12:11.639579 master-2 kubenswrapper[4762]: I1014 13:12:11.639563 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/213dbdcb-5bd7-48a6-9365-1f643ea3bbea-client-ca\") pod 
\"route-controller-manager-77674cffc8-gf5tz\" (UID: \"213dbdcb-5bd7-48a6-9365-1f643ea3bbea\") " pod="openshift-route-controller-manager/route-controller-manager-77674cffc8-gf5tz" Oct 14 13:12:11.639691 master-2 kubenswrapper[4762]: I1014 13:12:11.639663 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/213dbdcb-5bd7-48a6-9365-1f643ea3bbea-serving-cert\") pod \"route-controller-manager-77674cffc8-gf5tz\" (UID: \"213dbdcb-5bd7-48a6-9365-1f643ea3bbea\") " pod="openshift-route-controller-manager/route-controller-manager-77674cffc8-gf5tz" Oct 14 13:12:11.660832 master-2 kubenswrapper[4762]: I1014 13:12:11.660787 4762 generic.go:334] "Generic (PLEG): container finished" podID="8758d788-dfc7-45cc-985e-3df1534fb2c9" containerID="e17d6fb08dfdb1d3dcc2f6aa3bd5bd2a446f6737fb1cd55f1b7b01e9c6ca4105" exitCode=0 Oct 14 13:12:11.661059 master-2 kubenswrapper[4762]: I1014 13:12:11.660877 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7flhc" event={"ID":"8758d788-dfc7-45cc-985e-3df1534fb2c9","Type":"ContainerDied","Data":"e17d6fb08dfdb1d3dcc2f6aa3bd5bd2a446f6737fb1cd55f1b7b01e9c6ca4105"} Oct 14 13:12:11.661059 master-2 kubenswrapper[4762]: I1014 13:12:11.660915 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7flhc" event={"ID":"8758d788-dfc7-45cc-985e-3df1534fb2c9","Type":"ContainerStarted","Data":"6e607a845b730b8ad59fec384dce28cb4cb0ca50ef5be26c2e4bbefa01c4eeff"} Oct 14 13:12:11.664701 master-2 kubenswrapper[4762]: I1014 13:12:11.664651 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cf69d" Oct 14 13:12:11.664835 master-2 kubenswrapper[4762]: I1014 13:12:11.664731 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cf69d" event={"ID":"4ffcca20-3c44-4e24-92c8-15d3dc0625e4","Type":"ContainerDied","Data":"7cb0f127c4c2430366fcaf657ff99060aa4fdc3db081f400c75c2dce7b111bfc"} Oct 14 13:12:11.664835 master-2 kubenswrapper[4762]: I1014 13:12:11.664795 4762 scope.go:117] "RemoveContainer" containerID="81aec3b49a476404f7dc250f33c9194087a82b2bf5ea44822de93e2042a91529" Oct 14 13:12:11.667104 master-2 kubenswrapper[4762]: I1014 13:12:11.667025 4762 generic.go:334] "Generic (PLEG): container finished" podID="6c9dd1f9-3220-4d79-b376-598b14c8e5e7" containerID="3912d46f0aa48a6cb11b1f8e7a82dfe6090c84a5299112da6af835608b8a895e" exitCode=0 Oct 14 13:12:11.667104 master-2 kubenswrapper[4762]: I1014 13:12:11.667090 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-629l7" event={"ID":"6c9dd1f9-3220-4d79-b376-598b14c8e5e7","Type":"ContainerDied","Data":"3912d46f0aa48a6cb11b1f8e7a82dfe6090c84a5299112da6af835608b8a895e"} Oct 14 13:12:11.667468 master-2 kubenswrapper[4762]: I1014 13:12:11.667129 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-629l7" event={"ID":"6c9dd1f9-3220-4d79-b376-598b14c8e5e7","Type":"ContainerStarted","Data":"ce4f87cd32edd789b06cb840725aba5ce62163d2bd0ef666e1b40734bb15d3f9"} Oct 14 13:12:11.681928 master-2 kubenswrapper[4762]: I1014 13:12:11.681888 4762 scope.go:117] "RemoveContainer" containerID="fd4f47c880f7459da4b54075d7d7b82efbefe1cf43c3d0c933cd64e3a0b2b32b" Oct 14 13:12:11.699215 master-2 kubenswrapper[4762]: I1014 13:12:11.698862 4762 scope.go:117] "RemoveContainer" 
containerID="a6622de65bab02e973869481309ca216d3dc68faab59179b0e0b5cb3eaa9812b" Oct 14 13:12:11.740672 master-2 kubenswrapper[4762]: I1014 13:12:11.740615 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/213dbdcb-5bd7-48a6-9365-1f643ea3bbea-client-ca\") pod \"route-controller-manager-77674cffc8-gf5tz\" (UID: \"213dbdcb-5bd7-48a6-9365-1f643ea3bbea\") " pod="openshift-route-controller-manager/route-controller-manager-77674cffc8-gf5tz" Oct 14 13:12:11.741011 master-2 kubenswrapper[4762]: I1014 13:12:11.740991 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/213dbdcb-5bd7-48a6-9365-1f643ea3bbea-serving-cert\") pod \"route-controller-manager-77674cffc8-gf5tz\" (UID: \"213dbdcb-5bd7-48a6-9365-1f643ea3bbea\") " pod="openshift-route-controller-manager/route-controller-manager-77674cffc8-gf5tz" Oct 14 13:12:11.741213 master-2 kubenswrapper[4762]: I1014 13:12:11.741188 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/213dbdcb-5bd7-48a6-9365-1f643ea3bbea-config\") pod \"route-controller-manager-77674cffc8-gf5tz\" (UID: \"213dbdcb-5bd7-48a6-9365-1f643ea3bbea\") " pod="openshift-route-controller-manager/route-controller-manager-77674cffc8-gf5tz" Oct 14 13:12:11.741420 master-2 kubenswrapper[4762]: I1014 13:12:11.741387 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgz95\" (UniqueName: \"kubernetes.io/projected/213dbdcb-5bd7-48a6-9365-1f643ea3bbea-kube-api-access-dgz95\") pod \"route-controller-manager-77674cffc8-gf5tz\" (UID: \"213dbdcb-5bd7-48a6-9365-1f643ea3bbea\") " pod="openshift-route-controller-manager/route-controller-manager-77674cffc8-gf5tz" Oct 14 13:12:11.744039 master-2 kubenswrapper[4762]: I1014 13:12:11.743982 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/213dbdcb-5bd7-48a6-9365-1f643ea3bbea-config\") pod \"route-controller-manager-77674cffc8-gf5tz\" (UID: \"213dbdcb-5bd7-48a6-9365-1f643ea3bbea\") " pod="openshift-route-controller-manager/route-controller-manager-77674cffc8-gf5tz" Oct 14 13:12:11.744700 master-2 kubenswrapper[4762]: I1014 13:12:11.744654 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/213dbdcb-5bd7-48a6-9365-1f643ea3bbea-client-ca\") pod \"route-controller-manager-77674cffc8-gf5tz\" (UID: \"213dbdcb-5bd7-48a6-9365-1f643ea3bbea\") " pod="openshift-route-controller-manager/route-controller-manager-77674cffc8-gf5tz" Oct 14 13:12:11.745499 master-2 kubenswrapper[4762]: I1014 13:12:11.745456 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cf69d"] Oct 14 13:12:11.747003 master-2 kubenswrapper[4762]: I1014 13:12:11.746980 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/213dbdcb-5bd7-48a6-9365-1f643ea3bbea-serving-cert\") pod \"route-controller-manager-77674cffc8-gf5tz\" (UID: \"213dbdcb-5bd7-48a6-9365-1f643ea3bbea\") " pod="openshift-route-controller-manager/route-controller-manager-77674cffc8-gf5tz" Oct 14 13:12:11.752342 master-2 kubenswrapper[4762]: I1014 13:12:11.752310 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cf69d"] Oct 14 
13:12:11.765245 master-2 kubenswrapper[4762]: I1014 13:12:11.765180 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgz95\" (UniqueName: \"kubernetes.io/projected/213dbdcb-5bd7-48a6-9365-1f643ea3bbea-kube-api-access-dgz95\") pod \"route-controller-manager-77674cffc8-gf5tz\" (UID: \"213dbdcb-5bd7-48a6-9365-1f643ea3bbea\") " pod="openshift-route-controller-manager/route-controller-manager-77674cffc8-gf5tz" Oct 14 13:12:11.898969 master-2 kubenswrapper[4762]: I1014 13:12:11.898800 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:12:11.898969 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:12:11.898969 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:12:11.898969 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:12:11.898969 master-2 kubenswrapper[4762]: I1014 13:12:11.898886 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:12:11.929112 master-2 kubenswrapper[4762]: I1014 13:12:11.929055 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-77674cffc8-gf5tz" Oct 14 13:12:12.405950 master-2 kubenswrapper[4762]: I1014 13:12:12.404569 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77674cffc8-gf5tz"] Oct 14 13:12:12.413901 master-2 kubenswrapper[4762]: W1014 13:12:12.413847 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod213dbdcb_5bd7_48a6_9365_1f643ea3bbea.slice/crio-7466ead3c472256701902904780dd7d9eb11f2bd98bf9d9fbc6f9e51d477a519 WatchSource:0}: Error finding container 7466ead3c472256701902904780dd7d9eb11f2bd98bf9d9fbc6f9e51d477a519: Status 404 returned error can't find the container with id 7466ead3c472256701902904780dd7d9eb11f2bd98bf9d9fbc6f9e51d477a519 Oct 14 13:12:12.488054 master-2 kubenswrapper[4762]: I1014 13:12:12.487992 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-master-2" Oct 14 13:12:12.673794 master-2 kubenswrapper[4762]: I1014 13:12:12.673722 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-77674cffc8-gf5tz" event={"ID":"213dbdcb-5bd7-48a6-9365-1f643ea3bbea","Type":"ContainerStarted","Data":"3f1909d8c7190ca600beb76aceba301114ff7f52320e80833aa4a063f90a9100"} Oct 14 13:12:12.673794 master-2 kubenswrapper[4762]: I1014 13:12:12.673790 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-77674cffc8-gf5tz" event={"ID":"213dbdcb-5bd7-48a6-9365-1f643ea3bbea","Type":"ContainerStarted","Data":"7466ead3c472256701902904780dd7d9eb11f2bd98bf9d9fbc6f9e51d477a519"} Oct 14 13:12:12.674135 master-2 kubenswrapper[4762]: I1014 13:12:12.674068 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-77674cffc8-gf5tz" Oct 14 13:12:12.676747 master-2 kubenswrapper[4762]: I1014 
13:12:12.676677 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-629l7" event={"ID":"6c9dd1f9-3220-4d79-b376-598b14c8e5e7","Type":"ContainerStarted","Data":"1be14ac63b2e3ce6b3a2765de5720927b750969d9a04d5195642cd0b3bc4c554"} Oct 14 13:12:12.679132 master-2 kubenswrapper[4762]: I1014 13:12:12.679068 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7flhc" event={"ID":"8758d788-dfc7-45cc-985e-3df1534fb2c9","Type":"ContainerStarted","Data":"58dcbbea0b65e6b7e3bc5a1a74014b8d5209218a360a4d8bcfad5680d119a513"} Oct 14 13:12:12.733998 master-2 kubenswrapper[4762]: I1014 13:12:12.733850 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-77674cffc8-gf5tz" podStartSLOduration=3.7338165500000002 podStartE2EDuration="3.73381655s" podCreationTimestamp="2025-10-14 13:12:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:12:12.704834044 +0000 UTC m=+361.948993273" watchObservedRunningTime="2025-10-14 13:12:12.73381655 +0000 UTC m=+361.977975749" Oct 14 13:12:12.869451 master-2 kubenswrapper[4762]: I1014 13:12:12.869011 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-frksz"] Oct 14 13:12:12.870640 master-2 kubenswrapper[4762]: I1014 13:12:12.870238 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-frksz" podUID="1458907f-e285-4301-8542-0b46ac67b02d" containerName="registry-server" containerID="cri-o://5b8b686446edf254dda47da390b1e2779b9622d1ec4218a592f3dc2901d90099" gracePeriod=2 Oct 14 13:12:12.900532 master-2 kubenswrapper[4762]: I1014 13:12:12.900458 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:12:12.900532 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:12:12.900532 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:12:12.900532 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:12:12.900942 master-2 kubenswrapper[4762]: I1014 13:12:12.900554 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:12:12.980262 master-2 kubenswrapper[4762]: I1014 13:12:12.976230 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-77674cffc8-gf5tz" Oct 14 13:12:13.073589 master-2 kubenswrapper[4762]: I1014 13:12:13.073533 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xl9gv"] Oct 14 13:12:13.074381 master-2 kubenswrapper[4762]: I1014 13:12:13.074332 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xl9gv" podUID="c551a119-e58d-46c3-9f81-7c0400c70c27" containerName="registry-server" containerID="cri-o://644d8dce504e051b037ec15b3d58aa4f2ac49928a6e8ee3ab30a4fb802324290" gracePeriod=2 Oct 14 13:12:13.300148 master-2 kubenswrapper[4762]: I1014 
13:12:13.299978 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2p79c"] Oct 14 13:12:13.301384 master-2 kubenswrapper[4762]: I1014 13:12:13.301354 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2p79c" Oct 14 13:12:13.304862 master-2 kubenswrapper[4762]: I1014 13:12:13.304823 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-g85rz" Oct 14 13:12:13.317837 master-2 kubenswrapper[4762]: I1014 13:12:13.317777 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2p79c"] Oct 14 13:12:13.362694 master-2 kubenswrapper[4762]: I1014 13:12:13.362124 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2911f54c-828f-410e-8b48-5d1837465fe2-catalog-content\") pod \"redhat-marketplace-2p79c\" (UID: \"2911f54c-828f-410e-8b48-5d1837465fe2\") " pod="openshift-marketplace/redhat-marketplace-2p79c" Oct 14 13:12:13.362694 master-2 kubenswrapper[4762]: I1014 13:12:13.362219 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2911f54c-828f-410e-8b48-5d1837465fe2-utilities\") pod \"redhat-marketplace-2p79c\" (UID: \"2911f54c-828f-410e-8b48-5d1837465fe2\") " pod="openshift-marketplace/redhat-marketplace-2p79c" Oct 14 13:12:13.365220 master-2 kubenswrapper[4762]: I1014 13:12:13.362283 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5db4l\" (UniqueName: \"kubernetes.io/projected/2911f54c-828f-410e-8b48-5d1837465fe2-kube-api-access-5db4l\") pod \"redhat-marketplace-2p79c\" (UID: \"2911f54c-828f-410e-8b48-5d1837465fe2\") " pod="openshift-marketplace/redhat-marketplace-2p79c" Oct 14 13:12:13.372101 master-2 kubenswrapper[4762]: I1014 13:12:13.372052 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-frksz" Oct 14 13:12:13.468259 master-2 kubenswrapper[4762]: I1014 13:12:13.466257 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1458907f-e285-4301-8542-0b46ac67b02d-catalog-content\") pod \"1458907f-e285-4301-8542-0b46ac67b02d\" (UID: \"1458907f-e285-4301-8542-0b46ac67b02d\") " Oct 14 13:12:13.468259 master-2 kubenswrapper[4762]: I1014 13:12:13.466373 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1458907f-e285-4301-8542-0b46ac67b02d-utilities\") pod \"1458907f-e285-4301-8542-0b46ac67b02d\" (UID: \"1458907f-e285-4301-8542-0b46ac67b02d\") " Oct 14 13:12:13.468259 master-2 kubenswrapper[4762]: I1014 13:12:13.466439 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ww97\" (UniqueName: \"kubernetes.io/projected/1458907f-e285-4301-8542-0b46ac67b02d-kube-api-access-6ww97\") pod \"1458907f-e285-4301-8542-0b46ac67b02d\" (UID: \"1458907f-e285-4301-8542-0b46ac67b02d\") " Oct 14 13:12:13.468259 master-2 kubenswrapper[4762]: I1014 13:12:13.466960 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2911f54c-828f-410e-8b48-5d1837465fe2-catalog-content\") pod \"redhat-marketplace-2p79c\" (UID: \"2911f54c-828f-410e-8b48-5d1837465fe2\") " pod="openshift-marketplace/redhat-marketplace-2p79c" Oct 14 13:12:13.468259 master-2 kubenswrapper[4762]: I1014 13:12:13.467030 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2911f54c-828f-410e-8b48-5d1837465fe2-utilities\") pod \"redhat-marketplace-2p79c\" (UID: \"2911f54c-828f-410e-8b48-5d1837465fe2\") " pod="openshift-marketplace/redhat-marketplace-2p79c" Oct 14 13:12:13.468259 master-2 kubenswrapper[4762]: I1014 13:12:13.467090 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5db4l\" (UniqueName: \"kubernetes.io/projected/2911f54c-828f-410e-8b48-5d1837465fe2-kube-api-access-5db4l\") pod \"redhat-marketplace-2p79c\" (UID: \"2911f54c-828f-410e-8b48-5d1837465fe2\") " pod="openshift-marketplace/redhat-marketplace-2p79c" Oct 14 13:12:13.468259 master-2 kubenswrapper[4762]: I1014 13:12:13.467915 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2911f54c-828f-410e-8b48-5d1837465fe2-catalog-content\") pod \"redhat-marketplace-2p79c\" (UID: \"2911f54c-828f-410e-8b48-5d1837465fe2\") " pod="openshift-marketplace/redhat-marketplace-2p79c" Oct 14 13:12:13.468259 master-2 kubenswrapper[4762]: I1014 13:12:13.467952 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2911f54c-828f-410e-8b48-5d1837465fe2-utilities\") pod \"redhat-marketplace-2p79c\" (UID: \"2911f54c-828f-410e-8b48-5d1837465fe2\") " pod="openshift-marketplace/redhat-marketplace-2p79c" Oct 14 13:12:13.469343 master-2 kubenswrapper[4762]: I1014 13:12:13.469271 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1458907f-e285-4301-8542-0b46ac67b02d-utilities" (OuterVolumeSpecName: "utilities") pod "1458907f-e285-4301-8542-0b46ac67b02d" (UID: "1458907f-e285-4301-8542-0b46ac67b02d"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:12:13.480963 master-2 kubenswrapper[4762]: I1014 13:12:13.480889 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1458907f-e285-4301-8542-0b46ac67b02d-kube-api-access-6ww97" (OuterVolumeSpecName: "kube-api-access-6ww97") pod "1458907f-e285-4301-8542-0b46ac67b02d" (UID: "1458907f-e285-4301-8542-0b46ac67b02d"). InnerVolumeSpecName "kube-api-access-6ww97". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:12:13.481445 master-2 kubenswrapper[4762]: I1014 13:12:13.481384 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1458907f-e285-4301-8542-0b46ac67b02d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1458907f-e285-4301-8542-0b46ac67b02d" (UID: "1458907f-e285-4301-8542-0b46ac67b02d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:12:13.490744 master-2 kubenswrapper[4762]: I1014 13:12:13.490691 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-m2vwm"] Oct 14 13:12:13.491012 master-2 kubenswrapper[4762]: E1014 13:12:13.490858 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1458907f-e285-4301-8542-0b46ac67b02d" containerName="extract-content" Oct 14 13:12:13.491012 master-2 kubenswrapper[4762]: I1014 13:12:13.490870 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="1458907f-e285-4301-8542-0b46ac67b02d" containerName="extract-content" Oct 14 13:12:13.491012 master-2 kubenswrapper[4762]: E1014 13:12:13.490880 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1458907f-e285-4301-8542-0b46ac67b02d" containerName="extract-utilities" Oct 14 13:12:13.491012 master-2 kubenswrapper[4762]: I1014 13:12:13.490886 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="1458907f-e285-4301-8542-0b46ac67b02d" containerName="extract-utilities" Oct 14 13:12:13.491012 master-2 kubenswrapper[4762]: E1014 13:12:13.490895 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1458907f-e285-4301-8542-0b46ac67b02d" containerName="registry-server" Oct 14 13:12:13.491012 master-2 kubenswrapper[4762]: I1014 13:12:13.490900 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="1458907f-e285-4301-8542-0b46ac67b02d" containerName="registry-server" Oct 14 13:12:13.491012 master-2 kubenswrapper[4762]: I1014 13:12:13.490973 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="1458907f-e285-4301-8542-0b46ac67b02d" containerName="registry-server" Oct 14 13:12:13.491588 master-2 kubenswrapper[4762]: I1014 13:12:13.491556 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-m2vwm" Oct 14 13:12:13.495568 master-2 kubenswrapper[4762]: I1014 13:12:13.495514 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-lds5w" Oct 14 13:12:13.505136 master-2 kubenswrapper[4762]: I1014 13:12:13.505083 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m2vwm"] Oct 14 13:12:13.508231 master-2 kubenswrapper[4762]: I1014 13:12:13.508179 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5db4l\" (UniqueName: \"kubernetes.io/projected/2911f54c-828f-410e-8b48-5d1837465fe2-kube-api-access-5db4l\") pod \"redhat-marketplace-2p79c\" (UID: \"2911f54c-828f-410e-8b48-5d1837465fe2\") " pod="openshift-marketplace/redhat-marketplace-2p79c" Oct 14 13:12:13.554230 master-2 kubenswrapper[4762]: I1014 13:12:13.554181 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xl9gv" Oct 14 13:12:13.557434 master-2 kubenswrapper[4762]: I1014 13:12:13.557389 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ffcca20-3c44-4e24-92c8-15d3dc0625e4" path="/var/lib/kubelet/pods/4ffcca20-3c44-4e24-92c8-15d3dc0625e4/volumes" Oct 14 13:12:13.568233 master-2 kubenswrapper[4762]: I1014 13:12:13.568148 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c551a119-e58d-46c3-9f81-7c0400c70c27-catalog-content\") pod \"c551a119-e58d-46c3-9f81-7c0400c70c27\" (UID: \"c551a119-e58d-46c3-9f81-7c0400c70c27\") " Oct 14 13:12:13.568558 master-2 kubenswrapper[4762]: I1014 13:12:13.568526 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5f4n\" (UniqueName: \"kubernetes.io/projected/c551a119-e58d-46c3-9f81-7c0400c70c27-kube-api-access-g5f4n\") pod \"c551a119-e58d-46c3-9f81-7c0400c70c27\" (UID: \"c551a119-e58d-46c3-9f81-7c0400c70c27\") " Oct 14 13:12:13.568778 master-2 kubenswrapper[4762]: I1014 13:12:13.568752 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c551a119-e58d-46c3-9f81-7c0400c70c27-utilities\") pod \"c551a119-e58d-46c3-9f81-7c0400c70c27\" (UID: \"c551a119-e58d-46c3-9f81-7c0400c70c27\") " Oct 14 13:12:13.569261 master-2 kubenswrapper[4762]: I1014 13:12:13.569221 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b63c8f4e-054a-446c-8b1c-c9fec73416f9-utilities\") pod \"redhat-operators-m2vwm\" (UID: \"b63c8f4e-054a-446c-8b1c-c9fec73416f9\") " pod="openshift-marketplace/redhat-operators-m2vwm" Oct 14 13:12:13.569503 master-2 kubenswrapper[4762]: I1014 13:12:13.569468 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b63c8f4e-054a-446c-8b1c-c9fec73416f9-catalog-content\") pod \"redhat-operators-m2vwm\" (UID: \"b63c8f4e-054a-446c-8b1c-c9fec73416f9\") " pod="openshift-marketplace/redhat-operators-m2vwm" Oct 14 13:12:13.570145 master-2 kubenswrapper[4762]: I1014 13:12:13.570105 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2pqs\" (UniqueName: 
\"kubernetes.io/projected/b63c8f4e-054a-446c-8b1c-c9fec73416f9-kube-api-access-n2pqs\") pod \"redhat-operators-m2vwm\" (UID: \"b63c8f4e-054a-446c-8b1c-c9fec73416f9\") " pod="openshift-marketplace/redhat-operators-m2vwm" Oct 14 13:12:13.570456 master-2 kubenswrapper[4762]: I1014 13:12:13.570420 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1458907f-e285-4301-8542-0b46ac67b02d-utilities\") on node \"master-2\" DevicePath \"\"" Oct 14 13:12:13.570648 master-2 kubenswrapper[4762]: I1014 13:12:13.570618 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ww97\" (UniqueName: \"kubernetes.io/projected/1458907f-e285-4301-8542-0b46ac67b02d-kube-api-access-6ww97\") on node \"master-2\" DevicePath \"\"" Oct 14 13:12:13.570799 master-2 kubenswrapper[4762]: I1014 13:12:13.570776 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1458907f-e285-4301-8542-0b46ac67b02d-catalog-content\") on node \"master-2\" DevicePath \"\"" Oct 14 13:12:13.570964 master-2 kubenswrapper[4762]: I1014 13:12:13.569660 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c551a119-e58d-46c3-9f81-7c0400c70c27-utilities" (OuterVolumeSpecName: "utilities") pod "c551a119-e58d-46c3-9f81-7c0400c70c27" (UID: "c551a119-e58d-46c3-9f81-7c0400c70c27"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:12:13.571460 master-2 kubenswrapper[4762]: I1014 13:12:13.571399 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c551a119-e58d-46c3-9f81-7c0400c70c27-kube-api-access-g5f4n" (OuterVolumeSpecName: "kube-api-access-g5f4n") pod "c551a119-e58d-46c3-9f81-7c0400c70c27" (UID: "c551a119-e58d-46c3-9f81-7c0400c70c27"). InnerVolumeSpecName "kube-api-access-g5f4n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:12:13.658755 master-2 kubenswrapper[4762]: I1014 13:12:13.658498 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c551a119-e58d-46c3-9f81-7c0400c70c27-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c551a119-e58d-46c3-9f81-7c0400c70c27" (UID: "c551a119-e58d-46c3-9f81-7c0400c70c27"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:12:13.663208 master-2 kubenswrapper[4762]: I1014 13:12:13.663115 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2p79c" Oct 14 13:12:13.671745 master-2 kubenswrapper[4762]: I1014 13:12:13.671681 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2pqs\" (UniqueName: \"kubernetes.io/projected/b63c8f4e-054a-446c-8b1c-c9fec73416f9-kube-api-access-n2pqs\") pod \"redhat-operators-m2vwm\" (UID: \"b63c8f4e-054a-446c-8b1c-c9fec73416f9\") " pod="openshift-marketplace/redhat-operators-m2vwm" Oct 14 13:12:13.671857 master-2 kubenswrapper[4762]: I1014 13:12:13.671779 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b63c8f4e-054a-446c-8b1c-c9fec73416f9-utilities\") pod \"redhat-operators-m2vwm\" (UID: \"b63c8f4e-054a-446c-8b1c-c9fec73416f9\") " pod="openshift-marketplace/redhat-operators-m2vwm" Oct 14 13:12:13.671857 master-2 kubenswrapper[4762]: I1014 13:12:13.671808 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b63c8f4e-054a-446c-8b1c-c9fec73416f9-catalog-content\") pod \"redhat-operators-m2vwm\" (UID: \"b63c8f4e-054a-446c-8b1c-c9fec73416f9\") " pod="openshift-marketplace/redhat-operators-m2vwm" Oct 14 13:12:13.671857 master-2 kubenswrapper[4762]: I1014 13:12:13.671838 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c551a119-e58d-46c3-9f81-7c0400c70c27-utilities\") on node \"master-2\" DevicePath \"\"" Oct 14 13:12:13.671857 master-2 kubenswrapper[4762]: I1014 13:12:13.671853 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c551a119-e58d-46c3-9f81-7c0400c70c27-catalog-content\") on node \"master-2\" DevicePath \"\"" Oct 14 13:12:13.672103 master-2 kubenswrapper[4762]: I1014 13:12:13.671867 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5f4n\" (UniqueName: \"kubernetes.io/projected/c551a119-e58d-46c3-9f81-7c0400c70c27-kube-api-access-g5f4n\") on node \"master-2\" DevicePath \"\"" Oct 14 13:12:13.672353 master-2 kubenswrapper[4762]: I1014 13:12:13.672315 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b63c8f4e-054a-446c-8b1c-c9fec73416f9-catalog-content\") pod \"redhat-operators-m2vwm\" (UID: \"b63c8f4e-054a-446c-8b1c-c9fec73416f9\") " pod="openshift-marketplace/redhat-operators-m2vwm" Oct 14 13:12:13.673026 master-2 kubenswrapper[4762]: I1014 13:12:13.672954 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b63c8f4e-054a-446c-8b1c-c9fec73416f9-utilities\") pod \"redhat-operators-m2vwm\" (UID: \"b63c8f4e-054a-446c-8b1c-c9fec73416f9\") " pod="openshift-marketplace/redhat-operators-m2vwm" Oct 14 13:12:13.689262 master-2 kubenswrapper[4762]: I1014 13:12:13.689200 4762 generic.go:334] "Generic (PLEG): container finished" podID="6c9dd1f9-3220-4d79-b376-598b14c8e5e7" containerID="1be14ac63b2e3ce6b3a2765de5720927b750969d9a04d5195642cd0b3bc4c554" exitCode=0 Oct 14 13:12:13.689411 master-2 kubenswrapper[4762]: I1014 13:12:13.689289 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-629l7" event={"ID":"6c9dd1f9-3220-4d79-b376-598b14c8e5e7","Type":"ContainerDied","Data":"1be14ac63b2e3ce6b3a2765de5720927b750969d9a04d5195642cd0b3bc4c554"} Oct 14 13:12:13.692761 
master-2 kubenswrapper[4762]: I1014 13:12:13.692644 4762 generic.go:334] "Generic (PLEG): container finished" podID="8758d788-dfc7-45cc-985e-3df1534fb2c9" containerID="58dcbbea0b65e6b7e3bc5a1a74014b8d5209218a360a4d8bcfad5680d119a513" exitCode=0 Oct 14 13:12:13.692761 master-2 kubenswrapper[4762]: I1014 13:12:13.692669 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7flhc" event={"ID":"8758d788-dfc7-45cc-985e-3df1534fb2c9","Type":"ContainerDied","Data":"58dcbbea0b65e6b7e3bc5a1a74014b8d5209218a360a4d8bcfad5680d119a513"} Oct 14 13:12:13.692761 master-2 kubenswrapper[4762]: I1014 13:12:13.692702 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7flhc" event={"ID":"8758d788-dfc7-45cc-985e-3df1534fb2c9","Type":"ContainerStarted","Data":"3cc1871d450213596f6aca2d17fe0b5d41b4448a3756d68ced9d7cfba5e1ede5"} Oct 14 13:12:13.694776 master-2 kubenswrapper[4762]: I1014 13:12:13.694702 4762 generic.go:334] "Generic (PLEG): container finished" podID="1458907f-e285-4301-8542-0b46ac67b02d" containerID="5b8b686446edf254dda47da390b1e2779b9622d1ec4218a592f3dc2901d90099" exitCode=0 Oct 14 13:12:13.694776 master-2 kubenswrapper[4762]: I1014 13:12:13.694756 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-frksz" event={"ID":"1458907f-e285-4301-8542-0b46ac67b02d","Type":"ContainerDied","Data":"5b8b686446edf254dda47da390b1e2779b9622d1ec4218a592f3dc2901d90099"} Oct 14 13:12:13.694776 master-2 kubenswrapper[4762]: I1014 13:12:13.694776 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-frksz" event={"ID":"1458907f-e285-4301-8542-0b46ac67b02d","Type":"ContainerDied","Data":"953d4365c58fad8ce4ced1af6306cf2b2fb1b3f6643be9c4a60973da8ff31def"} Oct 14 13:12:13.695022 master-2 kubenswrapper[4762]: I1014 13:12:13.694797 4762 scope.go:117] "RemoveContainer" containerID="5b8b686446edf254dda47da390b1e2779b9622d1ec4218a592f3dc2901d90099" Oct 14 13:12:13.695022 master-2 kubenswrapper[4762]: I1014 13:12:13.694932 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-frksz" Oct 14 13:12:13.702530 master-2 kubenswrapper[4762]: I1014 13:12:13.701765 4762 generic.go:334] "Generic (PLEG): container finished" podID="c551a119-e58d-46c3-9f81-7c0400c70c27" containerID="644d8dce504e051b037ec15b3d58aa4f2ac49928a6e8ee3ab30a4fb802324290" exitCode=0 Oct 14 13:12:13.702530 master-2 kubenswrapper[4762]: I1014 13:12:13.701838 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xl9gv" event={"ID":"c551a119-e58d-46c3-9f81-7c0400c70c27","Type":"ContainerDied","Data":"644d8dce504e051b037ec15b3d58aa4f2ac49928a6e8ee3ab30a4fb802324290"} Oct 14 13:12:13.702530 master-2 kubenswrapper[4762]: I1014 13:12:13.701940 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xl9gv" event={"ID":"c551a119-e58d-46c3-9f81-7c0400c70c27","Type":"ContainerDied","Data":"bf214f48ac34db90286eb9ca280e23dfb931c49f0756b8ace1917791aaf453a6"} Oct 14 13:12:13.702530 master-2 kubenswrapper[4762]: I1014 13:12:13.701964 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xl9gv" Oct 14 13:12:13.705455 master-2 kubenswrapper[4762]: I1014 13:12:13.705404 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2pqs\" (UniqueName: \"kubernetes.io/projected/b63c8f4e-054a-446c-8b1c-c9fec73416f9-kube-api-access-n2pqs\") pod \"redhat-operators-m2vwm\" (UID: \"b63c8f4e-054a-446c-8b1c-c9fec73416f9\") " pod="openshift-marketplace/redhat-operators-m2vwm" Oct 14 13:12:13.723575 master-2 kubenswrapper[4762]: I1014 13:12:13.723525 4762 scope.go:117] "RemoveContainer" containerID="bc6a81da645ff98ca01338184fa1f436d66e46d114b78b1bda18bb8ff1b1d2f1" Oct 14 13:12:13.756915 master-2 kubenswrapper[4762]: I1014 13:12:13.756866 4762 scope.go:117] "RemoveContainer" containerID="a7b00be1b07c6e363ee5deeb0588993ca6554ece822cca2235a5c4481a84835f" Oct 14 13:12:13.782694 master-2 kubenswrapper[4762]: I1014 13:12:13.780236 4762 scope.go:117] "RemoveContainer" containerID="5b8b686446edf254dda47da390b1e2779b9622d1ec4218a592f3dc2901d90099" Oct 14 13:12:13.782694 master-2 kubenswrapper[4762]: E1014 13:12:13.780893 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b8b686446edf254dda47da390b1e2779b9622d1ec4218a592f3dc2901d90099\": container with ID starting with 5b8b686446edf254dda47da390b1e2779b9622d1ec4218a592f3dc2901d90099 not found: ID does not exist" containerID="5b8b686446edf254dda47da390b1e2779b9622d1ec4218a592f3dc2901d90099" Oct 14 13:12:13.782694 master-2 kubenswrapper[4762]: I1014 13:12:13.780930 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b8b686446edf254dda47da390b1e2779b9622d1ec4218a592f3dc2901d90099"} err="failed to get container status \"5b8b686446edf254dda47da390b1e2779b9622d1ec4218a592f3dc2901d90099\": rpc error: code = NotFound desc = could not find container \"5b8b686446edf254dda47da390b1e2779b9622d1ec4218a592f3dc2901d90099\": container with ID starting with 5b8b686446edf254dda47da390b1e2779b9622d1ec4218a592f3dc2901d90099 not found: ID does not exist" Oct 14 13:12:13.782694 master-2 kubenswrapper[4762]: I1014 13:12:13.780959 4762 scope.go:117] "RemoveContainer" containerID="bc6a81da645ff98ca01338184fa1f436d66e46d114b78b1bda18bb8ff1b1d2f1" Oct 14 13:12:13.782694 master-2 kubenswrapper[4762]: E1014 13:12:13.781400 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc6a81da645ff98ca01338184fa1f436d66e46d114b78b1bda18bb8ff1b1d2f1\": container with ID starting with bc6a81da645ff98ca01338184fa1f436d66e46d114b78b1bda18bb8ff1b1d2f1 not found: ID does not exist" containerID="bc6a81da645ff98ca01338184fa1f436d66e46d114b78b1bda18bb8ff1b1d2f1" Oct 14 13:12:13.782694 master-2 kubenswrapper[4762]: I1014 13:12:13.781424 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc6a81da645ff98ca01338184fa1f436d66e46d114b78b1bda18bb8ff1b1d2f1"} err="failed to get container status \"bc6a81da645ff98ca01338184fa1f436d66e46d114b78b1bda18bb8ff1b1d2f1\": rpc error: code = NotFound desc = could not find container \"bc6a81da645ff98ca01338184fa1f436d66e46d114b78b1bda18bb8ff1b1d2f1\": container with ID starting with bc6a81da645ff98ca01338184fa1f436d66e46d114b78b1bda18bb8ff1b1d2f1 not found: ID does not exist" Oct 14 13:12:13.782694 master-2 kubenswrapper[4762]: I1014 13:12:13.781440 4762 scope.go:117] "RemoveContainer" 
containerID="a7b00be1b07c6e363ee5deeb0588993ca6554ece822cca2235a5c4481a84835f" Oct 14 13:12:13.782694 master-2 kubenswrapper[4762]: E1014 13:12:13.781705 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7b00be1b07c6e363ee5deeb0588993ca6554ece822cca2235a5c4481a84835f\": container with ID starting with a7b00be1b07c6e363ee5deeb0588993ca6554ece822cca2235a5c4481a84835f not found: ID does not exist" containerID="a7b00be1b07c6e363ee5deeb0588993ca6554ece822cca2235a5c4481a84835f" Oct 14 13:12:13.782694 master-2 kubenswrapper[4762]: I1014 13:12:13.781729 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7b00be1b07c6e363ee5deeb0588993ca6554ece822cca2235a5c4481a84835f"} err="failed to get container status \"a7b00be1b07c6e363ee5deeb0588993ca6554ece822cca2235a5c4481a84835f\": rpc error: code = NotFound desc = could not find container \"a7b00be1b07c6e363ee5deeb0588993ca6554ece822cca2235a5c4481a84835f\": container with ID starting with a7b00be1b07c6e363ee5deeb0588993ca6554ece822cca2235a5c4481a84835f not found: ID does not exist" Oct 14 13:12:13.782694 master-2 kubenswrapper[4762]: I1014 13:12:13.781747 4762 scope.go:117] "RemoveContainer" containerID="644d8dce504e051b037ec15b3d58aa4f2ac49928a6e8ee3ab30a4fb802324290" Oct 14 13:12:13.810172 master-2 kubenswrapper[4762]: I1014 13:12:13.809871 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m2vwm" Oct 14 13:12:13.827873 master-2 kubenswrapper[4762]: I1014 13:12:13.827843 4762 scope.go:117] "RemoveContainer" containerID="1f4e0dcb035cd290547efbf1e0284d8c2f5d9ee4ed21c117f62a6d8ab863d653" Oct 14 13:12:13.851870 master-2 kubenswrapper[4762]: I1014 13:12:13.851819 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-frksz"] Oct 14 13:12:13.856350 master-2 kubenswrapper[4762]: I1014 13:12:13.856318 4762 scope.go:117] "RemoveContainer" containerID="232ba9558afacb07a7fe8978115f3e82dfa3023983c71a558160c05a5e1df729" Oct 14 13:12:13.873789 master-2 kubenswrapper[4762]: I1014 13:12:13.873750 4762 scope.go:117] "RemoveContainer" containerID="644d8dce504e051b037ec15b3d58aa4f2ac49928a6e8ee3ab30a4fb802324290" Oct 14 13:12:13.874257 master-2 kubenswrapper[4762]: E1014 13:12:13.874222 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"644d8dce504e051b037ec15b3d58aa4f2ac49928a6e8ee3ab30a4fb802324290\": container with ID starting with 644d8dce504e051b037ec15b3d58aa4f2ac49928a6e8ee3ab30a4fb802324290 not found: ID does not exist" containerID="644d8dce504e051b037ec15b3d58aa4f2ac49928a6e8ee3ab30a4fb802324290" Oct 14 13:12:13.874306 master-2 kubenswrapper[4762]: I1014 13:12:13.874260 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"644d8dce504e051b037ec15b3d58aa4f2ac49928a6e8ee3ab30a4fb802324290"} err="failed to get container status \"644d8dce504e051b037ec15b3d58aa4f2ac49928a6e8ee3ab30a4fb802324290\": rpc error: code = NotFound desc = could not find container \"644d8dce504e051b037ec15b3d58aa4f2ac49928a6e8ee3ab30a4fb802324290\": container with ID starting with 644d8dce504e051b037ec15b3d58aa4f2ac49928a6e8ee3ab30a4fb802324290 not found: ID does not exist" Oct 14 13:12:13.874306 master-2 kubenswrapper[4762]: I1014 13:12:13.874286 4762 scope.go:117] "RemoveContainer" 
containerID="1f4e0dcb035cd290547efbf1e0284d8c2f5d9ee4ed21c117f62a6d8ab863d653" Oct 14 13:12:13.874687 master-2 kubenswrapper[4762]: E1014 13:12:13.874639 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f4e0dcb035cd290547efbf1e0284d8c2f5d9ee4ed21c117f62a6d8ab863d653\": container with ID starting with 1f4e0dcb035cd290547efbf1e0284d8c2f5d9ee4ed21c117f62a6d8ab863d653 not found: ID does not exist" containerID="1f4e0dcb035cd290547efbf1e0284d8c2f5d9ee4ed21c117f62a6d8ab863d653" Oct 14 13:12:13.874731 master-2 kubenswrapper[4762]: I1014 13:12:13.874700 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f4e0dcb035cd290547efbf1e0284d8c2f5d9ee4ed21c117f62a6d8ab863d653"} err="failed to get container status \"1f4e0dcb035cd290547efbf1e0284d8c2f5d9ee4ed21c117f62a6d8ab863d653\": rpc error: code = NotFound desc = could not find container \"1f4e0dcb035cd290547efbf1e0284d8c2f5d9ee4ed21c117f62a6d8ab863d653\": container with ID starting with 1f4e0dcb035cd290547efbf1e0284d8c2f5d9ee4ed21c117f62a6d8ab863d653 not found: ID does not exist" Oct 14 13:12:13.874766 master-2 kubenswrapper[4762]: I1014 13:12:13.874737 4762 scope.go:117] "RemoveContainer" containerID="232ba9558afacb07a7fe8978115f3e82dfa3023983c71a558160c05a5e1df729" Oct 14 13:12:13.875489 master-2 kubenswrapper[4762]: E1014 13:12:13.875396 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"232ba9558afacb07a7fe8978115f3e82dfa3023983c71a558160c05a5e1df729\": container with ID starting with 232ba9558afacb07a7fe8978115f3e82dfa3023983c71a558160c05a5e1df729 not found: ID does not exist" containerID="232ba9558afacb07a7fe8978115f3e82dfa3023983c71a558160c05a5e1df729" Oct 14 13:12:13.875549 master-2 kubenswrapper[4762]: I1014 13:12:13.875493 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"232ba9558afacb07a7fe8978115f3e82dfa3023983c71a558160c05a5e1df729"} err="failed to get container status \"232ba9558afacb07a7fe8978115f3e82dfa3023983c71a558160c05a5e1df729\": rpc error: code = NotFound desc = could not find container \"232ba9558afacb07a7fe8978115f3e82dfa3023983c71a558160c05a5e1df729\": container with ID starting with 232ba9558afacb07a7fe8978115f3e82dfa3023983c71a558160c05a5e1df729 not found: ID does not exist" Oct 14 13:12:13.887982 master-2 kubenswrapper[4762]: I1014 13:12:13.887920 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-frksz"] Oct 14 13:12:13.897230 master-2 kubenswrapper[4762]: I1014 13:12:13.896882 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:12:13.897230 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:12:13.897230 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:12:13.897230 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:12:13.897230 master-2 kubenswrapper[4762]: I1014 13:12:13.896942 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:12:13.919014 master-2 
kubenswrapper[4762]: I1014 13:12:13.918904 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7flhc" podStartSLOduration=2.4131628259999998 podStartE2EDuration="3.918876624s" podCreationTimestamp="2025-10-14 13:12:10 +0000 UTC" firstStartedPulling="2025-10-14 13:12:11.662306031 +0000 UTC m=+360.906465190" lastFinishedPulling="2025-10-14 13:12:13.168019819 +0000 UTC m=+362.412178988" observedRunningTime="2025-10-14 13:12:13.918042947 +0000 UTC m=+363.162202126" watchObservedRunningTime="2025-10-14 13:12:13.918876624 +0000 UTC m=+363.163035783" Oct 14 13:12:13.941742 master-2 kubenswrapper[4762]: I1014 13:12:13.941697 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xl9gv"] Oct 14 13:12:13.951227 master-2 kubenswrapper[4762]: I1014 13:12:13.950764 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xl9gv"] Oct 14 13:12:14.084231 master-2 kubenswrapper[4762]: I1014 13:12:14.083705 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2p79c"] Oct 14 13:12:14.088765 master-2 kubenswrapper[4762]: W1014 13:12:14.088703 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2911f54c_828f_410e_8b48_5d1837465fe2.slice/crio-c626197ddda022cfde23b0c81c849e124e86bbca12b13323478f48032f77ce4b WatchSource:0}: Error finding container c626197ddda022cfde23b0c81c849e124e86bbca12b13323478f48032f77ce4b: Status 404 returned error can't find the container with id c626197ddda022cfde23b0c81c849e124e86bbca12b13323478f48032f77ce4b Oct 14 13:12:14.234179 master-2 kubenswrapper[4762]: I1014 13:12:14.234037 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m2vwm"] Oct 14 13:12:14.243230 master-2 kubenswrapper[4762]: W1014 13:12:14.243148 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb63c8f4e_054a_446c_8b1c_c9fec73416f9.slice/crio-8b9bb58b61f01fb694fd293f08ae527937a6b93b04022060b97e0c8206afd65b WatchSource:0}: Error finding container 8b9bb58b61f01fb694fd293f08ae527937a6b93b04022060b97e0c8206afd65b: Status 404 returned error can't find the container with id 8b9bb58b61f01fb694fd293f08ae527937a6b93b04022060b97e0c8206afd65b Oct 14 13:12:14.711585 master-2 kubenswrapper[4762]: I1014 13:12:14.711498 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m2vwm" event={"ID":"b63c8f4e-054a-446c-8b1c-c9fec73416f9","Type":"ContainerStarted","Data":"78c87dfbc61cbf4d91441ca3209a60837bfa1499191ce6922ab70cfceb04ae87"} Oct 14 13:12:14.711585 master-2 kubenswrapper[4762]: I1014 13:12:14.711570 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m2vwm" event={"ID":"b63c8f4e-054a-446c-8b1c-c9fec73416f9","Type":"ContainerStarted","Data":"8b9bb58b61f01fb694fd293f08ae527937a6b93b04022060b97e0c8206afd65b"} Oct 14 13:12:14.713202 master-2 kubenswrapper[4762]: I1014 13:12:14.713115 4762 generic.go:334] "Generic (PLEG): container finished" podID="2911f54c-828f-410e-8b48-5d1837465fe2" containerID="c93e1678a27f57a71c07a1a7feba6010962d8178d39bed371f5949c11faba086" exitCode=0 Oct 14 13:12:14.713202 master-2 kubenswrapper[4762]: I1014 13:12:14.713191 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-2p79c" event={"ID":"2911f54c-828f-410e-8b48-5d1837465fe2","Type":"ContainerDied","Data":"c93e1678a27f57a71c07a1a7feba6010962d8178d39bed371f5949c11faba086"} Oct 14 13:12:14.713400 master-2 kubenswrapper[4762]: I1014 13:12:14.713237 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2p79c" event={"ID":"2911f54c-828f-410e-8b48-5d1837465fe2","Type":"ContainerStarted","Data":"c626197ddda022cfde23b0c81c849e124e86bbca12b13323478f48032f77ce4b"} Oct 14 13:12:14.897937 master-2 kubenswrapper[4762]: I1014 13:12:14.897862 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:12:14.897937 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:12:14.897937 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:12:14.897937 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:12:14.898344 master-2 kubenswrapper[4762]: I1014 13:12:14.897967 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:12:15.561231 master-2 kubenswrapper[4762]: I1014 13:12:15.561104 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1458907f-e285-4301-8542-0b46ac67b02d" path="/var/lib/kubelet/pods/1458907f-e285-4301-8542-0b46ac67b02d/volumes" Oct 14 13:12:15.562884 master-2 kubenswrapper[4762]: I1014 13:12:15.562788 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c551a119-e58d-46c3-9f81-7c0400c70c27" path="/var/lib/kubelet/pods/c551a119-e58d-46c3-9f81-7c0400c70c27/volumes" Oct 14 13:12:15.722894 master-2 kubenswrapper[4762]: I1014 13:12:15.722814 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-629l7" event={"ID":"6c9dd1f9-3220-4d79-b376-598b14c8e5e7","Type":"ContainerStarted","Data":"64d21b1891c51d417b266d6bb535489e6c7f814284dba4c51ca5e284d8164637"} Oct 14 13:12:15.725619 master-2 kubenswrapper[4762]: I1014 13:12:15.725558 4762 generic.go:334] "Generic (PLEG): container finished" podID="b63c8f4e-054a-446c-8b1c-c9fec73416f9" containerID="78c87dfbc61cbf4d91441ca3209a60837bfa1499191ce6922ab70cfceb04ae87" exitCode=0 Oct 14 13:12:15.725760 master-2 kubenswrapper[4762]: I1014 13:12:15.725669 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m2vwm" event={"ID":"b63c8f4e-054a-446c-8b1c-c9fec73416f9","Type":"ContainerDied","Data":"78c87dfbc61cbf4d91441ca3209a60837bfa1499191ce6922ab70cfceb04ae87"} Oct 14 13:12:15.751051 master-2 kubenswrapper[4762]: I1014 13:12:15.750968 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-629l7" podStartSLOduration=2.564199423 podStartE2EDuration="5.75094488s" podCreationTimestamp="2025-10-14 13:12:10 +0000 UTC" firstStartedPulling="2025-10-14 13:12:11.668729549 +0000 UTC m=+360.912888738" lastFinishedPulling="2025-10-14 13:12:14.855475006 +0000 UTC m=+364.099634195" observedRunningTime="2025-10-14 13:12:15.748525492 +0000 UTC m=+364.992684691" watchObservedRunningTime="2025-10-14 13:12:15.75094488 +0000 UTC m=+364.995104039" Oct 14 13:12:15.897906 
master-2 kubenswrapper[4762]: I1014 13:12:15.897725 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:12:15.897906 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:12:15.897906 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:12:15.897906 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:12:15.897906 master-2 kubenswrapper[4762]: I1014 13:12:15.897822 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:12:16.736469 master-2 kubenswrapper[4762]: I1014 13:12:16.736380 4762 generic.go:334] "Generic (PLEG): container finished" podID="2911f54c-828f-410e-8b48-5d1837465fe2" containerID="7cdd68801b5248e4300717d5b06b81dac87289320969c151c2eee229cc4b97c8" exitCode=0 Oct 14 13:12:16.737129 master-2 kubenswrapper[4762]: I1014 13:12:16.736511 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2p79c" event={"ID":"2911f54c-828f-410e-8b48-5d1837465fe2","Type":"ContainerDied","Data":"7cdd68801b5248e4300717d5b06b81dac87289320969c151c2eee229cc4b97c8"} Oct 14 13:12:16.899744 master-2 kubenswrapper[4762]: I1014 13:12:16.899593 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:12:16.899744 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:12:16.899744 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:12:16.899744 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:12:16.899744 master-2 kubenswrapper[4762]: I1014 13:12:16.899705 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:12:17.746559 master-2 kubenswrapper[4762]: I1014 13:12:17.746488 4762 generic.go:334] "Generic (PLEG): container finished" podID="b63c8f4e-054a-446c-8b1c-c9fec73416f9" containerID="642a4806a2b158262c52bbd7b468c5420c5d0796a95db519f6e3203a32c79aa9" exitCode=0 Oct 14 13:12:17.747106 master-2 kubenswrapper[4762]: I1014 13:12:17.746555 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m2vwm" event={"ID":"b63c8f4e-054a-446c-8b1c-c9fec73416f9","Type":"ContainerDied","Data":"642a4806a2b158262c52bbd7b468c5420c5d0796a95db519f6e3203a32c79aa9"} Oct 14 13:12:17.750311 master-2 kubenswrapper[4762]: I1014 13:12:17.749565 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2p79c" event={"ID":"2911f54c-828f-410e-8b48-5d1837465fe2","Type":"ContainerStarted","Data":"5bcb69f929522420772d34eba35985825f482c43c02ea729740827f5d87c6f99"} Oct 14 13:12:17.787757 master-2 kubenswrapper[4762]: I1014 13:12:17.787694 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2p79c" podStartSLOduration=3.270296688 podStartE2EDuration="4.787664512s" 
podCreationTimestamp="2025-10-14 13:12:13 +0000 UTC" firstStartedPulling="2025-10-14 13:12:15.727868924 +0000 UTC m=+364.972028123" lastFinishedPulling="2025-10-14 13:12:17.245236788 +0000 UTC m=+366.489395947" observedRunningTime="2025-10-14 13:12:17.786919109 +0000 UTC m=+367.031078288" watchObservedRunningTime="2025-10-14 13:12:17.787664512 +0000 UTC m=+367.031823671" Oct 14 13:12:17.896885 master-2 kubenswrapper[4762]: I1014 13:12:17.896807 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:12:17.896885 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:12:17.896885 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:12:17.896885 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:12:17.897221 master-2 kubenswrapper[4762]: I1014 13:12:17.896898 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:12:18.758472 master-2 kubenswrapper[4762]: I1014 13:12:18.758370 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m2vwm" event={"ID":"b63c8f4e-054a-446c-8b1c-c9fec73416f9","Type":"ContainerStarted","Data":"4f08736305b39d1da880a73e99562380da701bd27f08205891cee2e3466f103a"} Oct 14 13:12:18.787224 master-2 kubenswrapper[4762]: I1014 13:12:18.787084 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-m2vwm" podStartSLOduration=3.216056307 podStartE2EDuration="5.787052912s" podCreationTimestamp="2025-10-14 13:12:13 +0000 UTC" firstStartedPulling="2025-10-14 13:12:15.7277514 +0000 UTC m=+364.971910569" lastFinishedPulling="2025-10-14 13:12:18.298747975 +0000 UTC m=+367.542907174" observedRunningTime="2025-10-14 13:12:18.786614808 +0000 UTC m=+368.030774027" watchObservedRunningTime="2025-10-14 13:12:18.787052912 +0000 UTC m=+368.031212111" Oct 14 13:12:18.897186 master-2 kubenswrapper[4762]: I1014 13:12:18.897067 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:12:18.897186 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:12:18.897186 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:12:18.897186 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:12:18.897186 master-2 kubenswrapper[4762]: I1014 13:12:18.897140 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:12:19.897865 master-2 kubenswrapper[4762]: I1014 13:12:19.897783 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:12:19.897865 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 
13:12:19.897865 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:12:19.897865 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:12:19.897865 master-2 kubenswrapper[4762]: I1014 13:12:19.897857 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:12:20.821850 master-2 kubenswrapper[4762]: I1014 13:12:20.821760 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-629l7" Oct 14 13:12:20.821850 master-2 kubenswrapper[4762]: I1014 13:12:20.821852 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-629l7" Oct 14 13:12:20.889445 master-2 kubenswrapper[4762]: I1014 13:12:20.889312 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-629l7" Oct 14 13:12:20.897928 master-2 kubenswrapper[4762]: I1014 13:12:20.897850 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:12:20.897928 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:12:20.897928 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:12:20.897928 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:12:20.898694 master-2 kubenswrapper[4762]: I1014 13:12:20.897940 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:12:21.022632 master-2 kubenswrapper[4762]: I1014 13:12:21.022528 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7flhc" Oct 14 13:12:21.022632 master-2 kubenswrapper[4762]: I1014 13:12:21.022620 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7flhc" Oct 14 13:12:21.077879 master-2 kubenswrapper[4762]: I1014 13:12:21.077732 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7flhc" Oct 14 13:12:21.816273 master-2 kubenswrapper[4762]: I1014 13:12:21.816225 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-629l7" Oct 14 13:12:21.817129 master-2 kubenswrapper[4762]: I1014 13:12:21.817075 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7flhc" Oct 14 13:12:21.898071 master-2 kubenswrapper[4762]: I1014 13:12:21.898001 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:12:21.898071 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:12:21.898071 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:12:21.898071 master-2 kubenswrapper[4762]: healthz check failed Oct 14 
13:12:21.899053 master-2 kubenswrapper[4762]: I1014 13:12:21.899013 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:12:22.508695 master-2 kubenswrapper[4762]: I1014 13:12:22.508595 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-master-2" Oct 14 13:12:22.528028 master-2 kubenswrapper[4762]: I1014 13:12:22.527925 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-master-2" Oct 14 13:12:22.897822 master-2 kubenswrapper[4762]: I1014 13:12:22.897630 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:12:22.897822 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:12:22.897822 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:12:22.897822 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:12:22.897822 master-2 kubenswrapper[4762]: I1014 13:12:22.897706 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:12:23.664402 master-2 kubenswrapper[4762]: I1014 13:12:23.664333 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2p79c" Oct 14 13:12:23.664402 master-2 kubenswrapper[4762]: I1014 13:12:23.664415 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2p79c" Oct 14 13:12:23.704622 master-2 kubenswrapper[4762]: I1014 13:12:23.704543 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2p79c" Oct 14 13:12:23.810525 master-2 kubenswrapper[4762]: I1014 13:12:23.810457 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-m2vwm" Oct 14 13:12:23.810890 master-2 kubenswrapper[4762]: I1014 13:12:23.810766 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-m2vwm" Oct 14 13:12:23.852756 master-2 kubenswrapper[4762]: I1014 13:12:23.852709 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2p79c" Oct 14 13:12:23.898971 master-2 kubenswrapper[4762]: I1014 13:12:23.898922 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:12:23.898971 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:12:23.898971 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:12:23.898971 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:12:23.899655 master-2 kubenswrapper[4762]: I1014 13:12:23.899624 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" 
podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:12:24.868016 master-2 kubenswrapper[4762]: I1014 13:12:24.867939 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-m2vwm" podUID="b63c8f4e-054a-446c-8b1c-c9fec73416f9" containerName="registry-server" probeResult="failure" output=< Oct 14 13:12:24.868016 master-2 kubenswrapper[4762]: timeout: failed to connect service ":50051" within 1s Oct 14 13:12:24.868016 master-2 kubenswrapper[4762]: > Oct 14 13:12:24.897509 master-2 kubenswrapper[4762]: I1014 13:12:24.897444 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:12:24.897509 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:12:24.897509 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:12:24.897509 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:12:24.897787 master-2 kubenswrapper[4762]: I1014 13:12:24.897533 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:12:25.897523 master-2 kubenswrapper[4762]: I1014 13:12:25.897475 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:12:25.897523 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:12:25.897523 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:12:25.897523 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:12:25.898450 master-2 kubenswrapper[4762]: I1014 13:12:25.898408 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:12:26.897811 master-2 kubenswrapper[4762]: I1014 13:12:26.897709 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:12:26.897811 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:12:26.897811 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:12:26.897811 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:12:26.897811 master-2 kubenswrapper[4762]: I1014 13:12:26.897780 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:12:27.897681 master-2 kubenswrapper[4762]: I1014 13:12:27.897585 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http 
failed: reason withheld Oct 14 13:12:27.897681 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:12:27.897681 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:12:27.897681 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:12:27.898612 master-2 kubenswrapper[4762]: I1014 13:12:27.897707 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:12:28.897765 master-2 kubenswrapper[4762]: I1014 13:12:28.897706 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:12:28.897765 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:12:28.897765 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:12:28.897765 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:12:28.898368 master-2 kubenswrapper[4762]: I1014 13:12:28.897790 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:12:29.896479 master-2 kubenswrapper[4762]: I1014 13:12:29.896393 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:12:29.896479 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:12:29.896479 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:12:29.896479 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:12:29.896479 master-2 kubenswrapper[4762]: I1014 13:12:29.896471 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:12:30.899212 master-2 kubenswrapper[4762]: I1014 13:12:30.899130 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:12:30.899212 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:12:30.899212 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:12:30.899212 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:12:30.900130 master-2 kubenswrapper[4762]: I1014 13:12:30.899331 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:12:31.897202 master-2 kubenswrapper[4762]: I1014 13:12:31.897121 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http 
failed: reason withheld Oct 14 13:12:31.897202 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:12:31.897202 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:12:31.897202 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:12:31.897716 master-2 kubenswrapper[4762]: I1014 13:12:31.897686 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:12:32.897558 master-2 kubenswrapper[4762]: I1014 13:12:32.897449 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:12:32.897558 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:12:32.897558 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:12:32.897558 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:12:32.897558 master-2 kubenswrapper[4762]: I1014 13:12:32.897543 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:12:33.880723 master-2 kubenswrapper[4762]: I1014 13:12:33.880651 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-m2vwm" Oct 14 13:12:33.899290 master-2 kubenswrapper[4762]: I1014 13:12:33.898510 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:12:33.899290 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:12:33.899290 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:12:33.899290 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:12:33.899290 master-2 kubenswrapper[4762]: I1014 13:12:33.898578 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:12:33.924048 master-2 kubenswrapper[4762]: I1014 13:12:33.923999 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-m2vwm" Oct 14 13:12:34.898331 master-2 kubenswrapper[4762]: I1014 13:12:34.898224 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:12:34.898331 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:12:34.898331 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:12:34.898331 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:12:34.898331 master-2 kubenswrapper[4762]: I1014 13:12:34.898307 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" 
podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:12:35.477681 master-2 kubenswrapper[4762]: I1014 13:12:35.477579 4762 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-2"] Oct 14 13:12:35.478472 master-2 kubenswrapper[4762]: E1014 13:12:35.477865 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c551a119-e58d-46c3-9f81-7c0400c70c27" containerName="registry-server" Oct 14 13:12:35.478472 master-2 kubenswrapper[4762]: I1014 13:12:35.477887 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="c551a119-e58d-46c3-9f81-7c0400c70c27" containerName="registry-server" Oct 14 13:12:35.478472 master-2 kubenswrapper[4762]: E1014 13:12:35.477908 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c551a119-e58d-46c3-9f81-7c0400c70c27" containerName="extract-utilities" Oct 14 13:12:35.478472 master-2 kubenswrapper[4762]: I1014 13:12:35.477922 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="c551a119-e58d-46c3-9f81-7c0400c70c27" containerName="extract-utilities" Oct 14 13:12:35.478472 master-2 kubenswrapper[4762]: E1014 13:12:35.477943 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c551a119-e58d-46c3-9f81-7c0400c70c27" containerName="extract-content" Oct 14 13:12:35.478472 master-2 kubenswrapper[4762]: I1014 13:12:35.477957 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="c551a119-e58d-46c3-9f81-7c0400c70c27" containerName="extract-content" Oct 14 13:12:35.478472 master-2 kubenswrapper[4762]: I1014 13:12:35.478103 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="c551a119-e58d-46c3-9f81-7c0400c70c27" containerName="registry-server" Oct 14 13:12:35.479612 master-2 kubenswrapper[4762]: I1014 13:12:35.479558 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" Oct 14 13:12:35.525875 master-2 kubenswrapper[4762]: I1014 13:12:35.525724 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-2"] Oct 14 13:12:35.659806 master-2 kubenswrapper[4762]: I1014 13:12:35.659664 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/94dc80ceea2f1dff7b9e2ec4d1d6aead-cert-dir\") pod \"kube-controller-manager-master-2\" (UID: \"94dc80ceea2f1dff7b9e2ec4d1d6aead\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-2" Oct 14 13:12:35.660194 master-2 kubenswrapper[4762]: I1014 13:12:35.659943 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/94dc80ceea2f1dff7b9e2ec4d1d6aead-resource-dir\") pod \"kube-controller-manager-master-2\" (UID: \"94dc80ceea2f1dff7b9e2ec4d1d6aead\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-2" Oct 14 13:12:35.762083 master-2 kubenswrapper[4762]: I1014 13:12:35.761845 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/94dc80ceea2f1dff7b9e2ec4d1d6aead-resource-dir\") pod \"kube-controller-manager-master-2\" (UID: \"94dc80ceea2f1dff7b9e2ec4d1d6aead\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-2" Oct 14 13:12:35.762083 master-2 kubenswrapper[4762]: I1014 13:12:35.761982 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/94dc80ceea2f1dff7b9e2ec4d1d6aead-cert-dir\") pod \"kube-controller-manager-master-2\" (UID: \"94dc80ceea2f1dff7b9e2ec4d1d6aead\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-2" Oct 14 13:12:35.762083 master-2 kubenswrapper[4762]: I1014 13:12:35.762042 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/94dc80ceea2f1dff7b9e2ec4d1d6aead-resource-dir\") pod \"kube-controller-manager-master-2\" (UID: \"94dc80ceea2f1dff7b9e2ec4d1d6aead\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-2" Oct 14 13:12:35.762549 master-2 kubenswrapper[4762]: I1014 13:12:35.762126 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/94dc80ceea2f1dff7b9e2ec4d1d6aead-cert-dir\") pod \"kube-controller-manager-master-2\" (UID: \"94dc80ceea2f1dff7b9e2ec4d1d6aead\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-2" Oct 14 13:12:35.821148 master-2 kubenswrapper[4762]: I1014 13:12:35.821015 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" Oct 14 13:12:35.846819 master-2 kubenswrapper[4762]: W1014 13:12:35.846722 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94dc80ceea2f1dff7b9e2ec4d1d6aead.slice/crio-8a0cc7a1e60434ea8a5ba6e8991270c99a53fd407f44577efc6c6487ad12f546 WatchSource:0}: Error finding container 8a0cc7a1e60434ea8a5ba6e8991270c99a53fd407f44577efc6c6487ad12f546: Status 404 returned error can't find the container with id 8a0cc7a1e60434ea8a5ba6e8991270c99a53fd407f44577efc6c6487ad12f546 Oct 14 13:12:35.852956 master-2 kubenswrapper[4762]: I1014 13:12:35.852887 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-2" event={"ID":"df774fb4-463b-43ef-bdff-0525e6ca4c1a","Type":"ContainerDied","Data":"9c906c3a11adeb37d19047c2bdc2a82a58c61581ea5806d44d5e372fbe0dc4ef"} Oct 14 13:12:35.853569 master-2 kubenswrapper[4762]: I1014 13:12:35.852831 4762 generic.go:334] "Generic (PLEG): container finished" podID="df774fb4-463b-43ef-bdff-0525e6ca4c1a" containerID="9c906c3a11adeb37d19047c2bdc2a82a58c61581ea5806d44d5e372fbe0dc4ef" exitCode=0 Oct 14 13:12:35.897788 master-2 kubenswrapper[4762]: I1014 13:12:35.897682 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:12:35.897788 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:12:35.897788 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:12:35.897788 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:12:35.897788 master-2 kubenswrapper[4762]: I1014 13:12:35.897767 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:12:36.860370 master-2 kubenswrapper[4762]: I1014 13:12:36.860254 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" event={"ID":"94dc80ceea2f1dff7b9e2ec4d1d6aead","Type":"ContainerStarted","Data":"8a0cc7a1e60434ea8a5ba6e8991270c99a53fd407f44577efc6c6487ad12f546"} Oct 14 13:12:36.897813 master-2 kubenswrapper[4762]: I1014 13:12:36.897706 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:12:36.897813 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:12:36.897813 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:12:36.897813 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:12:36.897813 master-2 kubenswrapper[4762]: I1014 13:12:36.897794 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:12:37.225114 master-2 kubenswrapper[4762]: I1014 13:12:37.225049 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-2" Oct 14 13:12:37.282610 master-2 kubenswrapper[4762]: I1014 13:12:37.280856 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/df774fb4-463b-43ef-bdff-0525e6ca4c1a-var-lock\") pod \"df774fb4-463b-43ef-bdff-0525e6ca4c1a\" (UID: \"df774fb4-463b-43ef-bdff-0525e6ca4c1a\") " Oct 14 13:12:37.282610 master-2 kubenswrapper[4762]: I1014 13:12:37.280926 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/df774fb4-463b-43ef-bdff-0525e6ca4c1a-var-lock" (OuterVolumeSpecName: "var-lock") pod "df774fb4-463b-43ef-bdff-0525e6ca4c1a" (UID: "df774fb4-463b-43ef-bdff-0525e6ca4c1a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:12:37.282610 master-2 kubenswrapper[4762]: I1014 13:12:37.281209 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/df774fb4-463b-43ef-bdff-0525e6ca4c1a-kube-api-access\") pod \"df774fb4-463b-43ef-bdff-0525e6ca4c1a\" (UID: \"df774fb4-463b-43ef-bdff-0525e6ca4c1a\") " Oct 14 13:12:37.282610 master-2 kubenswrapper[4762]: I1014 13:12:37.281244 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/df774fb4-463b-43ef-bdff-0525e6ca4c1a-kubelet-dir\") pod \"df774fb4-463b-43ef-bdff-0525e6ca4c1a\" (UID: \"df774fb4-463b-43ef-bdff-0525e6ca4c1a\") " Oct 14 13:12:37.282610 master-2 kubenswrapper[4762]: I1014 13:12:37.282511 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/df774fb4-463b-43ef-bdff-0525e6ca4c1a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "df774fb4-463b-43ef-bdff-0525e6ca4c1a" (UID: "df774fb4-463b-43ef-bdff-0525e6ca4c1a"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:12:37.282988 master-2 kubenswrapper[4762]: I1014 13:12:37.282703 4762 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/df774fb4-463b-43ef-bdff-0525e6ca4c1a-kubelet-dir\") on node \"master-2\" DevicePath \"\"" Oct 14 13:12:37.282988 master-2 kubenswrapper[4762]: I1014 13:12:37.282721 4762 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/df774fb4-463b-43ef-bdff-0525e6ca4c1a-var-lock\") on node \"master-2\" DevicePath \"\"" Oct 14 13:12:37.286082 master-2 kubenswrapper[4762]: I1014 13:12:37.286009 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df774fb4-463b-43ef-bdff-0525e6ca4c1a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "df774fb4-463b-43ef-bdff-0525e6ca4c1a" (UID: "df774fb4-463b-43ef-bdff-0525e6ca4c1a"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:12:37.384036 master-2 kubenswrapper[4762]: I1014 13:12:37.383982 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/df774fb4-463b-43ef-bdff-0525e6ca4c1a-kube-api-access\") on node \"master-2\" DevicePath \"\"" Oct 14 13:12:37.868367 master-2 kubenswrapper[4762]: I1014 13:12:37.868230 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-2" event={"ID":"df774fb4-463b-43ef-bdff-0525e6ca4c1a","Type":"ContainerDied","Data":"98c42eb613c3fff56176f4ee2e5cf5c4251c13cd317be48027a636a3ed9b48e8"} Oct 14 13:12:37.868367 master-2 kubenswrapper[4762]: I1014 13:12:37.868281 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98c42eb613c3fff56176f4ee2e5cf5c4251c13cd317be48027a636a3ed9b48e8" Oct 14 13:12:37.868367 master-2 kubenswrapper[4762]: I1014 13:12:37.868349 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-2" Oct 14 13:12:37.897220 master-2 kubenswrapper[4762]: I1014 13:12:37.897128 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:12:37.897220 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:12:37.897220 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:12:37.897220 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:12:37.897220 master-2 kubenswrapper[4762]: I1014 13:12:37.897204 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:12:38.897027 master-2 kubenswrapper[4762]: I1014 13:12:38.896955 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:12:38.897027 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:12:38.897027 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:12:38.897027 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:12:38.897027 master-2 kubenswrapper[4762]: I1014 13:12:38.897021 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:12:39.899686 master-2 kubenswrapper[4762]: I1014 13:12:39.899635 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:12:39.899686 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:12:39.899686 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:12:39.899686 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:12:39.900367 master-2 kubenswrapper[4762]: I1014 13:12:39.899708 4762 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:12:40.101670 master-2 kubenswrapper[4762]: I1014 13:12:40.101576 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_machine-config-server-zxkbj_063758c3-98fe-4ac4-b1c2-beef7ef6dcdc/machine-config-server/0.log" Oct 14 13:12:40.897347 master-2 kubenswrapper[4762]: I1014 13:12:40.897279 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:12:40.897347 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:12:40.897347 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:12:40.897347 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:12:40.897347 master-2 kubenswrapper[4762]: I1014 13:12:40.897343 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:12:41.896641 master-2 kubenswrapper[4762]: I1014 13:12:41.896555 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:12:41.896641 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:12:41.896641 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:12:41.896641 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:12:41.897269 master-2 kubenswrapper[4762]: I1014 13:12:41.896658 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:12:42.897773 master-2 kubenswrapper[4762]: I1014 13:12:42.897713 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:12:42.897773 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:12:42.897773 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:12:42.897773 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:12:42.898410 master-2 kubenswrapper[4762]: I1014 13:12:42.897791 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:12:43.530278 master-2 kubenswrapper[4762]: I1014 13:12:43.530241 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-guard-master-2"] Oct 14 13:12:43.530655 master-2 kubenswrapper[4762]: E1014 13:12:43.530640 4762 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="df774fb4-463b-43ef-bdff-0525e6ca4c1a" containerName="installer" Oct 14 13:12:43.530722 master-2 kubenswrapper[4762]: I1014 13:12:43.530713 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="df774fb4-463b-43ef-bdff-0525e6ca4c1a" containerName="installer" Oct 14 13:12:43.530856 master-2 kubenswrapper[4762]: I1014 13:12:43.530844 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="df774fb4-463b-43ef-bdff-0525e6ca4c1a" containerName="installer" Oct 14 13:12:43.531334 master-2 kubenswrapper[4762]: I1014 13:12:43.531319 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-guard-master-2" Oct 14 13:12:43.534575 master-2 kubenswrapper[4762]: I1014 13:12:43.534513 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 14 13:12:43.534868 master-2 kubenswrapper[4762]: I1014 13:12:43.534810 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"openshift-service-ca.crt" Oct 14 13:12:43.535619 master-2 kubenswrapper[4762]: I1014 13:12:43.535581 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"default-dockercfg-vv6lx" Oct 14 13:12:43.543884 master-2 kubenswrapper[4762]: I1014 13:12:43.543834 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-guard-master-2"] Oct 14 13:12:43.557270 master-2 kubenswrapper[4762]: I1014 13:12:43.557211 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dkxf\" (UniqueName: \"kubernetes.io/projected/d43a34b7-69dd-43b0-8465-4e44cb687285-kube-api-access-2dkxf\") pod \"kube-controller-manager-guard-master-2\" (UID: \"d43a34b7-69dd-43b0-8465-4e44cb687285\") " pod="openshift-kube-controller-manager/kube-controller-manager-guard-master-2" Oct 14 13:12:43.659008 master-2 kubenswrapper[4762]: I1014 13:12:43.658945 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dkxf\" (UniqueName: \"kubernetes.io/projected/d43a34b7-69dd-43b0-8465-4e44cb687285-kube-api-access-2dkxf\") pod \"kube-controller-manager-guard-master-2\" (UID: \"d43a34b7-69dd-43b0-8465-4e44cb687285\") " pod="openshift-kube-controller-manager/kube-controller-manager-guard-master-2" Oct 14 13:12:43.686199 master-2 kubenswrapper[4762]: I1014 13:12:43.684378 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dkxf\" (UniqueName: \"kubernetes.io/projected/d43a34b7-69dd-43b0-8465-4e44cb687285-kube-api-access-2dkxf\") pod \"kube-controller-manager-guard-master-2\" (UID: \"d43a34b7-69dd-43b0-8465-4e44cb687285\") " pod="openshift-kube-controller-manager/kube-controller-manager-guard-master-2" Oct 14 13:12:43.846075 master-2 kubenswrapper[4762]: I1014 13:12:43.845966 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-guard-master-2" Oct 14 13:12:43.896932 master-2 kubenswrapper[4762]: I1014 13:12:43.896872 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:12:43.896932 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:12:43.896932 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:12:43.896932 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:12:43.897458 master-2 kubenswrapper[4762]: I1014 13:12:43.897408 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:12:44.845642 master-2 kubenswrapper[4762]: I1014 13:12:44.845574 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-guard-master-2"] Oct 14 13:12:44.854480 master-2 kubenswrapper[4762]: W1014 13:12:44.854411 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd43a34b7_69dd_43b0_8465_4e44cb687285.slice/crio-3306e75a9248a22d324417cecc7f872870dcdec18d41033d29f084b37346a96b WatchSource:0}: Error finding container 3306e75a9248a22d324417cecc7f872870dcdec18d41033d29f084b37346a96b: Status 404 returned error can't find the container with id 3306e75a9248a22d324417cecc7f872870dcdec18d41033d29f084b37346a96b Oct 14 13:12:44.897109 master-2 kubenswrapper[4762]: I1014 13:12:44.897023 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:12:44.897109 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:12:44.897109 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:12:44.897109 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:12:44.897502 master-2 kubenswrapper[4762]: I1014 13:12:44.897113 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:12:44.927145 master-2 kubenswrapper[4762]: I1014 13:12:44.927014 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" event={"ID":"94dc80ceea2f1dff7b9e2ec4d1d6aead","Type":"ContainerStarted","Data":"651a658d252f21b182795b06bd916191c32cb19c07c2f8bd03e20093679aa253"} Oct 14 13:12:44.928545 master-2 kubenswrapper[4762]: I1014 13:12:44.928488 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-guard-master-2" event={"ID":"d43a34b7-69dd-43b0-8465-4e44cb687285","Type":"ContainerStarted","Data":"3306e75a9248a22d324417cecc7f872870dcdec18d41033d29f084b37346a96b"} Oct 14 13:12:45.897627 master-2 kubenswrapper[4762]: I1014 13:12:45.897541 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup 
probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:12:45.897627 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:12:45.897627 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:12:45.897627 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:12:45.898692 master-2 kubenswrapper[4762]: I1014 13:12:45.897640 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:12:45.938627 master-2 kubenswrapper[4762]: I1014 13:12:45.938492 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-guard-master-2" event={"ID":"d43a34b7-69dd-43b0-8465-4e44cb687285","Type":"ContainerStarted","Data":"25eb4676b82ec8561241b932ae81cf8433069d8dd05a7678b1b9ae00d61e9644"} Oct 14 13:12:45.939504 master-2 kubenswrapper[4762]: I1014 13:12:45.939409 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-guard-master-2" Oct 14 13:12:45.948385 master-2 kubenswrapper[4762]: I1014 13:12:45.948334 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-guard-master-2" Oct 14 13:12:45.959629 master-2 kubenswrapper[4762]: I1014 13:12:45.959515 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-guard-master-2" podStartSLOduration=2.95948858 podStartE2EDuration="2.95948858s" podCreationTimestamp="2025-10-14 13:12:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:12:45.957192636 +0000 UTC m=+395.201351865" watchObservedRunningTime="2025-10-14 13:12:45.95948858 +0000 UTC m=+395.203647769" Oct 14 13:12:46.896863 master-2 kubenswrapper[4762]: I1014 13:12:46.896781 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:12:46.896863 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:12:46.896863 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:12:46.896863 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:12:46.898574 master-2 kubenswrapper[4762]: I1014 13:12:46.896889 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:12:46.953986 master-2 kubenswrapper[4762]: I1014 13:12:46.953905 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" event={"ID":"94dc80ceea2f1dff7b9e2ec4d1d6aead","Type":"ContainerStarted","Data":"e45f2b094e6806403cfa8da2bf527b04ff4b8ae6e1a18580c31fcc2301b38ee9"} Oct 14 13:12:47.774043 master-2 kubenswrapper[4762]: I1014 13:12:47.773967 4762 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-2"] Oct 14 
13:12:47.776398 master-2 kubenswrapper[4762]: I1014 13:12:47.776367 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-2" Oct 14 13:12:47.812692 master-2 kubenswrapper[4762]: I1014 13:12:47.812589 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/79535145b65a4e1d50292e1c2670257a-cert-dir\") pod \"kube-apiserver-master-2\" (UID: \"79535145b65a4e1d50292e1c2670257a\") " pod="openshift-kube-apiserver/kube-apiserver-master-2" Oct 14 13:12:47.812889 master-2 kubenswrapper[4762]: I1014 13:12:47.812742 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/79535145b65a4e1d50292e1c2670257a-resource-dir\") pod \"kube-apiserver-master-2\" (UID: \"79535145b65a4e1d50292e1c2670257a\") " pod="openshift-kube-apiserver/kube-apiserver-master-2" Oct 14 13:12:47.812889 master-2 kubenswrapper[4762]: I1014 13:12:47.812776 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/79535145b65a4e1d50292e1c2670257a-audit-dir\") pod \"kube-apiserver-master-2\" (UID: \"79535145b65a4e1d50292e1c2670257a\") " pod="openshift-kube-apiserver/kube-apiserver-master-2" Oct 14 13:12:47.820362 master-2 kubenswrapper[4762]: I1014 13:12:47.820289 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-2"] Oct 14 13:12:47.897965 master-2 kubenswrapper[4762]: I1014 13:12:47.897895 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:12:47.897965 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:12:47.897965 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:12:47.897965 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:12:47.898927 master-2 kubenswrapper[4762]: I1014 13:12:47.898005 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:12:47.913820 master-2 kubenswrapper[4762]: I1014 13:12:47.913764 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/79535145b65a4e1d50292e1c2670257a-resource-dir\") pod \"kube-apiserver-master-2\" (UID: \"79535145b65a4e1d50292e1c2670257a\") " pod="openshift-kube-apiserver/kube-apiserver-master-2" Oct 14 13:12:47.913940 master-2 kubenswrapper[4762]: I1014 13:12:47.913824 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/79535145b65a4e1d50292e1c2670257a-audit-dir\") pod \"kube-apiserver-master-2\" (UID: \"79535145b65a4e1d50292e1c2670257a\") " pod="openshift-kube-apiserver/kube-apiserver-master-2" Oct 14 13:12:47.913940 master-2 kubenswrapper[4762]: I1014 13:12:47.913897 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/79535145b65a4e1d50292e1c2670257a-cert-dir\") pod 
\"kube-apiserver-master-2\" (UID: \"79535145b65a4e1d50292e1c2670257a\") " pod="openshift-kube-apiserver/kube-apiserver-master-2" Oct 14 13:12:47.914069 master-2 kubenswrapper[4762]: I1014 13:12:47.913968 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/79535145b65a4e1d50292e1c2670257a-resource-dir\") pod \"kube-apiserver-master-2\" (UID: \"79535145b65a4e1d50292e1c2670257a\") " pod="openshift-kube-apiserver/kube-apiserver-master-2" Oct 14 13:12:47.914069 master-2 kubenswrapper[4762]: I1014 13:12:47.913994 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/79535145b65a4e1d50292e1c2670257a-cert-dir\") pod \"kube-apiserver-master-2\" (UID: \"79535145b65a4e1d50292e1c2670257a\") " pod="openshift-kube-apiserver/kube-apiserver-master-2" Oct 14 13:12:47.914283 master-2 kubenswrapper[4762]: I1014 13:12:47.914077 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/79535145b65a4e1d50292e1c2670257a-audit-dir\") pod \"kube-apiserver-master-2\" (UID: \"79535145b65a4e1d50292e1c2670257a\") " pod="openshift-kube-apiserver/kube-apiserver-master-2" Oct 14 13:12:47.963779 master-2 kubenswrapper[4762]: I1014 13:12:47.963713 4762 generic.go:334] "Generic (PLEG): container finished" podID="8d1479cd-b121-44d6-af25-3bc9b573c89f" containerID="97643b9390c296873cc51961bb7ec70a80fba35c5e5c0df557ef033fe557b704" exitCode=0 Oct 14 13:12:47.964089 master-2 kubenswrapper[4762]: I1014 13:12:47.963830 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-2" event={"ID":"8d1479cd-b121-44d6-af25-3bc9b573c89f","Type":"ContainerDied","Data":"97643b9390c296873cc51961bb7ec70a80fba35c5e5c0df557ef033fe557b704"} Oct 14 13:12:47.967505 master-2 kubenswrapper[4762]: I1014 13:12:47.967419 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" event={"ID":"94dc80ceea2f1dff7b9e2ec4d1d6aead","Type":"ContainerStarted","Data":"d7e365eb8e01212deb310e30a49986c32ca5d3a702e94995aab2751ca5e8f908"} Oct 14 13:12:47.967505 master-2 kubenswrapper[4762]: I1014 13:12:47.967506 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" event={"ID":"94dc80ceea2f1dff7b9e2ec4d1d6aead","Type":"ContainerStarted","Data":"67e3eecae682a65c0dea3a2495e130d1fb9f92e0a4de76a1793c299e38cffbf0"} Oct 14 13:12:48.034888 master-2 kubenswrapper[4762]: I1014 13:12:48.034637 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" podStartSLOduration=13.034598633 podStartE2EDuration="13.034598633s" podCreationTimestamp="2025-10-14 13:12:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:12:48.031589365 +0000 UTC m=+397.275748554" watchObservedRunningTime="2025-10-14 13:12:48.034598633 +0000 UTC m=+397.278757822" Oct 14 13:12:48.118430 master-2 kubenswrapper[4762]: I1014 13:12:48.118301 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-2" Oct 14 13:12:48.155671 master-2 kubenswrapper[4762]: W1014 13:12:48.155558 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79535145b65a4e1d50292e1c2670257a.slice/crio-13167792d7a8798e7a7b33998ae62bc2985cf6b8e7600447bc0f7a7940b92848 WatchSource:0}: Error finding container 13167792d7a8798e7a7b33998ae62bc2985cf6b8e7600447bc0f7a7940b92848: Status 404 returned error can't find the container with id 13167792d7a8798e7a7b33998ae62bc2985cf6b8e7600447bc0f7a7940b92848 Oct 14 13:12:48.897529 master-2 kubenswrapper[4762]: I1014 13:12:48.897453 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:12:48.897529 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:12:48.897529 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:12:48.897529 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:12:48.897894 master-2 kubenswrapper[4762]: I1014 13:12:48.897568 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:12:48.974493 master-2 kubenswrapper[4762]: I1014 13:12:48.974436 4762 generic.go:334] "Generic (PLEG): container finished" podID="79535145b65a4e1d50292e1c2670257a" containerID="03ee826fcc201dac1b360f7f1ba1f945c12a67d2f924513a88a5779ad17b7e69" exitCode=0 Oct 14 13:12:48.975447 master-2 kubenswrapper[4762]: I1014 13:12:48.975337 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-2" event={"ID":"79535145b65a4e1d50292e1c2670257a","Type":"ContainerDied","Data":"03ee826fcc201dac1b360f7f1ba1f945c12a67d2f924513a88a5779ad17b7e69"} Oct 14 13:12:48.975521 master-2 kubenswrapper[4762]: I1014 13:12:48.975448 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-2" event={"ID":"79535145b65a4e1d50292e1c2670257a","Type":"ContainerStarted","Data":"13167792d7a8798e7a7b33998ae62bc2985cf6b8e7600447bc0f7a7940b92848"} Oct 14 13:12:49.269902 master-2 kubenswrapper[4762]: I1014 13:12:49.269843 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-2" Oct 14 13:12:49.428603 master-2 kubenswrapper[4762]: I1014 13:12:49.428397 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8d1479cd-b121-44d6-af25-3bc9b573c89f-var-lock\") pod \"8d1479cd-b121-44d6-af25-3bc9b573c89f\" (UID: \"8d1479cd-b121-44d6-af25-3bc9b573c89f\") " Oct 14 13:12:49.428603 master-2 kubenswrapper[4762]: I1014 13:12:49.428502 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8d1479cd-b121-44d6-af25-3bc9b573c89f-kubelet-dir\") pod \"8d1479cd-b121-44d6-af25-3bc9b573c89f\" (UID: \"8d1479cd-b121-44d6-af25-3bc9b573c89f\") " Oct 14 13:12:49.428603 master-2 kubenswrapper[4762]: I1014 13:12:49.428570 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8d1479cd-b121-44d6-af25-3bc9b573c89f-kube-api-access\") pod \"8d1479cd-b121-44d6-af25-3bc9b573c89f\" (UID: \"8d1479cd-b121-44d6-af25-3bc9b573c89f\") " Oct 14 13:12:49.429373 master-2 kubenswrapper[4762]: I1014 13:12:49.429318 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8d1479cd-b121-44d6-af25-3bc9b573c89f-var-lock" (OuterVolumeSpecName: "var-lock") pod "8d1479cd-b121-44d6-af25-3bc9b573c89f" (UID: "8d1479cd-b121-44d6-af25-3bc9b573c89f"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:12:49.429512 master-2 kubenswrapper[4762]: I1014 13:12:49.429390 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8d1479cd-b121-44d6-af25-3bc9b573c89f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8d1479cd-b121-44d6-af25-3bc9b573c89f" (UID: "8d1479cd-b121-44d6-af25-3bc9b573c89f"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:12:49.433781 master-2 kubenswrapper[4762]: I1014 13:12:49.433734 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d1479cd-b121-44d6-af25-3bc9b573c89f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8d1479cd-b121-44d6-af25-3bc9b573c89f" (UID: "8d1479cd-b121-44d6-af25-3bc9b573c89f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:12:49.530575 master-2 kubenswrapper[4762]: I1014 13:12:49.530216 4762 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8d1479cd-b121-44d6-af25-3bc9b573c89f-var-lock\") on node \"master-2\" DevicePath \"\"" Oct 14 13:12:49.530575 master-2 kubenswrapper[4762]: I1014 13:12:49.530263 4762 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8d1479cd-b121-44d6-af25-3bc9b573c89f-kubelet-dir\") on node \"master-2\" DevicePath \"\"" Oct 14 13:12:49.530575 master-2 kubenswrapper[4762]: I1014 13:12:49.530277 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8d1479cd-b121-44d6-af25-3bc9b573c89f-kube-api-access\") on node \"master-2\" DevicePath \"\"" Oct 14 13:12:49.897409 master-2 kubenswrapper[4762]: I1014 13:12:49.897324 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:12:49.897409 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:12:49.897409 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:12:49.897409 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:12:49.897727 master-2 kubenswrapper[4762]: I1014 13:12:49.897447 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:12:49.990241 master-2 kubenswrapper[4762]: I1014 13:12:49.986676 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-2" event={"ID":"79535145b65a4e1d50292e1c2670257a","Type":"ContainerStarted","Data":"4a1bfa0a3404825bcf8ebab4cdda7490f674619956b402166636fd113314e8c2"} Oct 14 13:12:49.990241 master-2 kubenswrapper[4762]: I1014 13:12:49.986728 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-2" event={"ID":"79535145b65a4e1d50292e1c2670257a","Type":"ContainerStarted","Data":"5ce4974fc0875864b1435afc4e321ca895fa86762184409dc8aa8ab244db1a12"} Oct 14 13:12:49.990241 master-2 kubenswrapper[4762]: I1014 13:12:49.986746 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-2" event={"ID":"79535145b65a4e1d50292e1c2670257a","Type":"ContainerStarted","Data":"a00e67168756607c42fb7608d7f9c240e721a608698655834914631238bee1ba"} Oct 14 13:12:49.990241 master-2 kubenswrapper[4762]: I1014 13:12:49.989924 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-2" event={"ID":"8d1479cd-b121-44d6-af25-3bc9b573c89f","Type":"ContainerDied","Data":"8bec906cc285513f26ebd343f5fc00b8dcc58d2e1ae705f4bb443cc919ad4b29"} Oct 14 13:12:49.990241 master-2 kubenswrapper[4762]: I1014 13:12:49.989949 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8bec906cc285513f26ebd343f5fc00b8dcc58d2e1ae705f4bb443cc919ad4b29" Oct 14 13:12:49.990241 master-2 kubenswrapper[4762]: I1014 13:12:49.990030 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-2" Oct 14 13:12:50.534480 master-2 kubenswrapper[4762]: I1014 13:12:50.534381 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-guard-master-2"] Oct 14 13:12:50.897388 master-2 kubenswrapper[4762]: I1014 13:12:50.897237 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:12:50.897388 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:12:50.897388 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:12:50.897388 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:12:50.897388 master-2 kubenswrapper[4762]: I1014 13:12:50.897316 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:12:51.000183 master-2 kubenswrapper[4762]: I1014 13:12:51.000056 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-2" event={"ID":"79535145b65a4e1d50292e1c2670257a","Type":"ContainerStarted","Data":"6b10e6673b93c4a192c56727c3cb7534636f9b2a41ce2b6024c0ba9c263d8bdc"} Oct 14 13:12:51.000183 master-2 kubenswrapper[4762]: I1014 13:12:51.000136 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-2" event={"ID":"79535145b65a4e1d50292e1c2670257a","Type":"ContainerStarted","Data":"d9d2221fab10b9011cadcd36f834cf0835e5aaeb80376ffb7752afdf91447fd8"} Oct 14 13:12:51.001046 master-2 kubenswrapper[4762]: I1014 13:12:51.000292 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-2" Oct 14 13:12:51.030304 master-2 kubenswrapper[4762]: I1014 13:12:51.029874 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-master-2" podStartSLOduration=4.029847846 podStartE2EDuration="4.029847846s" podCreationTimestamp="2025-10-14 13:12:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:12:51.02811701 +0000 UTC m=+400.272276169" watchObservedRunningTime="2025-10-14 13:12:51.029847846 +0000 UTC m=+400.274007015" Oct 14 13:12:51.897038 master-2 kubenswrapper[4762]: I1014 13:12:51.896966 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:12:51.897038 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:12:51.897038 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:12:51.897038 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:12:51.897394 master-2 kubenswrapper[4762]: I1014 13:12:51.897053 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:12:52.897114 master-2 
kubenswrapper[4762]: I1014 13:12:52.897055 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:12:52.897114 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:12:52.897114 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:12:52.897114 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:12:52.898271 master-2 kubenswrapper[4762]: I1014 13:12:52.897142 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:12:53.119198 master-2 kubenswrapper[4762]: I1014 13:12:53.119078 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-2" Oct 14 13:12:53.119513 master-2 kubenswrapper[4762]: I1014 13:12:53.119232 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-master-2" Oct 14 13:12:53.130240 master-2 kubenswrapper[4762]: I1014 13:12:53.130199 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-master-2" Oct 14 13:12:53.224093 master-2 kubenswrapper[4762]: I1014 13:12:53.223950 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-apiserver/apiserver-8644c46667-cg62m"] Oct 14 13:12:53.224766 master-2 kubenswrapper[4762]: I1014 13:12:53.224706 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-apiserver/apiserver-8644c46667-cg62m" podUID="c6c635b4-3d81-46f5-8f71-18a213b49c55" containerName="openshift-apiserver" containerID="cri-o://5aba9dc8bc563f731a16be5c5ffc8a7c6a6c894b81a9fdd43f997659876c56af" gracePeriod=120 Oct 14 13:12:53.224975 master-2 kubenswrapper[4762]: I1014 13:12:53.224931 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-apiserver/apiserver-8644c46667-cg62m" podUID="c6c635b4-3d81-46f5-8f71-18a213b49c55" containerName="openshift-apiserver-check-endpoints" containerID="cri-o://3f4f509d7db8c2114d17e53c43465caae0d96295f9b964e656edd9cd443a5499" gracePeriod=120 Oct 14 13:12:53.897525 master-2 kubenswrapper[4762]: I1014 13:12:53.897447 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:12:53.897525 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:12:53.897525 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:12:53.897525 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:12:53.898574 master-2 kubenswrapper[4762]: I1014 13:12:53.897533 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:12:54.025833 master-2 kubenswrapper[4762]: I1014 13:12:54.025764 4762 generic.go:334] "Generic (PLEG): container finished" podID="c6c635b4-3d81-46f5-8f71-18a213b49c55" 
containerID="3f4f509d7db8c2114d17e53c43465caae0d96295f9b964e656edd9cd443a5499" exitCode=0 Oct 14 13:12:54.026082 master-2 kubenswrapper[4762]: I1014 13:12:54.025911 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-8644c46667-cg62m" event={"ID":"c6c635b4-3d81-46f5-8f71-18a213b49c55","Type":"ContainerDied","Data":"3f4f509d7db8c2114d17e53c43465caae0d96295f9b964e656edd9cd443a5499"} Oct 14 13:12:54.031986 master-2 kubenswrapper[4762]: I1014 13:12:54.031888 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-2" Oct 14 13:12:54.896920 master-2 kubenswrapper[4762]: I1014 13:12:54.896786 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:12:54.896920 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:12:54.896920 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:12:54.896920 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:12:54.896920 master-2 kubenswrapper[4762]: I1014 13:12:54.896883 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:12:54.983661 master-2 kubenswrapper[4762]: I1014 13:12:54.983552 4762 patch_prober.go:28] interesting pod/apiserver-8644c46667-cg62m container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:12:54.983661 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:12:54.983661 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:12:54.983661 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:12:54.983661 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:12:54.983661 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:12:54.983661 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:12:54.983661 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:12:54.983661 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:12:54.983661 master-2 kubenswrapper[4762]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 14 13:12:54.983661 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok Oct 14 13:12:54.983661 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Oct 14 13:12:54.983661 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectcache ok Oct 14 13:12:54.983661 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 14 13:12:54.983661 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startinformers ok Oct 14 13:12:54.983661 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 14 13:12:54.983661 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:12:54.983661 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:12:54.983661 master-2 
kubenswrapper[4762]: readyz check failed Oct 14 13:12:54.983661 master-2 kubenswrapper[4762]: I1014 13:12:54.983630 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-8644c46667-cg62m" podUID="c6c635b4-3d81-46f5-8f71-18a213b49c55" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:12:55.822324 master-2 kubenswrapper[4762]: I1014 13:12:55.822227 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" Oct 14 13:12:55.822567 master-2 kubenswrapper[4762]: I1014 13:12:55.822340 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" Oct 14 13:12:55.822567 master-2 kubenswrapper[4762]: I1014 13:12:55.822366 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" Oct 14 13:12:55.822567 master-2 kubenswrapper[4762]: I1014 13:12:55.822391 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" Oct 14 13:12:55.828706 master-2 kubenswrapper[4762]: I1014 13:12:55.828661 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" Oct 14 13:12:55.829312 master-2 kubenswrapper[4762]: I1014 13:12:55.829284 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" Oct 14 13:12:55.897888 master-2 kubenswrapper[4762]: I1014 13:12:55.897781 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:12:55.897888 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:12:55.897888 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:12:55.897888 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:12:55.898254 master-2 kubenswrapper[4762]: I1014 13:12:55.897917 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:12:56.043955 master-2 kubenswrapper[4762]: I1014 13:12:56.043869 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" Oct 14 13:12:56.044819 master-2 kubenswrapper[4762]: I1014 13:12:56.044656 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" Oct 14 13:12:56.898656 master-2 kubenswrapper[4762]: I1014 13:12:56.898550 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:12:56.898656 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:12:56.898656 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:12:56.898656 
master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:12:56.898656 master-2 kubenswrapper[4762]: I1014 13:12:56.898631 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:12:57.897547 master-2 kubenswrapper[4762]: I1014 13:12:57.897477 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:12:57.897547 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:12:57.897547 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:12:57.897547 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:12:57.898119 master-2 kubenswrapper[4762]: I1014 13:12:57.897572 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:12:58.520819 master-2 kubenswrapper[4762]: I1014 13:12:58.520737 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-guard-master-2"] Oct 14 13:12:58.521808 master-2 kubenswrapper[4762]: E1014 13:12:58.521767 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d1479cd-b121-44d6-af25-3bc9b573c89f" containerName="installer" Oct 14 13:12:58.522010 master-2 kubenswrapper[4762]: I1014 13:12:58.521984 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d1479cd-b121-44d6-af25-3bc9b573c89f" containerName="installer" Oct 14 13:12:58.522423 master-2 kubenswrapper[4762]: I1014 13:12:58.522388 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d1479cd-b121-44d6-af25-3bc9b573c89f" containerName="installer" Oct 14 13:12:58.523394 master-2 kubenswrapper[4762]: I1014 13:12:58.523354 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-guard-master-2" Oct 14 13:12:58.527227 master-2 kubenswrapper[4762]: I1014 13:12:58.527141 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"openshift-service-ca.crt" Oct 14 13:12:58.527469 master-2 kubenswrapper[4762]: I1014 13:12:58.527321 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"default-dockercfg-pd7qg" Oct 14 13:12:58.527599 master-2 kubenswrapper[4762]: I1014 13:12:58.527486 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 14 13:12:58.533532 master-2 kubenswrapper[4762]: I1014 13:12:58.533455 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-guard-master-2"] Oct 14 13:12:58.660855 master-2 kubenswrapper[4762]: I1014 13:12:58.660793 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8k6cb\" (UniqueName: \"kubernetes.io/projected/a1d6199c-769e-4363-8439-75d433c50528-kube-api-access-8k6cb\") pod \"kube-apiserver-guard-master-2\" (UID: \"a1d6199c-769e-4363-8439-75d433c50528\") " pod="openshift-kube-apiserver/kube-apiserver-guard-master-2" Oct 14 13:12:58.761841 master-2 kubenswrapper[4762]: I1014 13:12:58.761785 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8k6cb\" (UniqueName: \"kubernetes.io/projected/a1d6199c-769e-4363-8439-75d433c50528-kube-api-access-8k6cb\") pod \"kube-apiserver-guard-master-2\" (UID: \"a1d6199c-769e-4363-8439-75d433c50528\") " pod="openshift-kube-apiserver/kube-apiserver-guard-master-2" Oct 14 13:12:58.794057 master-2 kubenswrapper[4762]: I1014 13:12:58.793906 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8k6cb\" (UniqueName: \"kubernetes.io/projected/a1d6199c-769e-4363-8439-75d433c50528-kube-api-access-8k6cb\") pod \"kube-apiserver-guard-master-2\" (UID: \"a1d6199c-769e-4363-8439-75d433c50528\") " pod="openshift-kube-apiserver/kube-apiserver-guard-master-2" Oct 14 13:12:58.851937 master-2 kubenswrapper[4762]: I1014 13:12:58.851883 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-guard-master-2" Oct 14 13:12:58.897763 master-2 kubenswrapper[4762]: I1014 13:12:58.897691 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:12:58.897763 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:12:58.897763 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:12:58.897763 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:12:58.898279 master-2 kubenswrapper[4762]: I1014 13:12:58.897801 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:12:59.294762 master-2 kubenswrapper[4762]: I1014 13:12:59.294699 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-guard-master-2"] Oct 14 13:12:59.299990 master-2 kubenswrapper[4762]: W1014 13:12:59.299925 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1d6199c_769e_4363_8439_75d433c50528.slice/crio-42d422a5df206ec9af76e537c119ac982500054313bd9a18b5b7f9168025e604 WatchSource:0}: Error finding container 42d422a5df206ec9af76e537c119ac982500054313bd9a18b5b7f9168025e604: Status 404 returned error can't find the container with id 42d422a5df206ec9af76e537c119ac982500054313bd9a18b5b7f9168025e604 Oct 14 13:12:59.799632 master-2 kubenswrapper[4762]: I1014 13:12:59.799543 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-5-master-2"] Oct 14 13:12:59.800616 master-2 kubenswrapper[4762]: I1014 13:12:59.800569 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-2" Oct 14 13:12:59.804527 master-2 kubenswrapper[4762]: I1014 13:12:59.804062 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt" Oct 14 13:12:59.804527 master-2 kubenswrapper[4762]: I1014 13:12:59.804148 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler"/"installer-sa-dockercfg-bm6wx" Oct 14 13:12:59.811110 master-2 kubenswrapper[4762]: I1014 13:12:59.811044 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-5-master-2"] Oct 14 13:12:59.898518 master-2 kubenswrapper[4762]: I1014 13:12:59.898364 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:12:59.898518 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:12:59.898518 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:12:59.898518 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:12:59.898518 master-2 kubenswrapper[4762]: I1014 13:12:59.898466 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:12:59.977598 master-2 kubenswrapper[4762]: I1014 13:12:59.977507 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b849490b-3d5b-41e5-90b7-7ad12ab8182f-kubelet-dir\") pod \"installer-5-master-2\" (UID: \"b849490b-3d5b-41e5-90b7-7ad12ab8182f\") " pod="openshift-kube-scheduler/installer-5-master-2" Oct 14 13:12:59.977598 master-2 kubenswrapper[4762]: I1014 13:12:59.977592 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b849490b-3d5b-41e5-90b7-7ad12ab8182f-kube-api-access\") pod \"installer-5-master-2\" (UID: \"b849490b-3d5b-41e5-90b7-7ad12ab8182f\") " pod="openshift-kube-scheduler/installer-5-master-2" Oct 14 13:12:59.977966 master-2 kubenswrapper[4762]: I1014 13:12:59.977922 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b849490b-3d5b-41e5-90b7-7ad12ab8182f-var-lock\") pod \"installer-5-master-2\" (UID: \"b849490b-3d5b-41e5-90b7-7ad12ab8182f\") " pod="openshift-kube-scheduler/installer-5-master-2" Oct 14 13:12:59.982292 master-2 kubenswrapper[4762]: I1014 13:12:59.982204 4762 patch_prober.go:28] interesting pod/apiserver-8644c46667-cg62m container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:12:59.982292 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:12:59.982292 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:12:59.982292 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:12:59.982292 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:12:59.982292 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:12:59.982292 master-2 kubenswrapper[4762]: 
[+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:12:59.982292 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:12:59.982292 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:12:59.982292 master-2 kubenswrapper[4762]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 14 13:12:59.982292 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok Oct 14 13:12:59.982292 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Oct 14 13:12:59.982292 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectcache ok Oct 14 13:12:59.982292 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 14 13:12:59.982292 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startinformers ok Oct 14 13:12:59.982292 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 14 13:12:59.982292 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:12:59.982292 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:12:59.982292 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:12:59.983680 master-2 kubenswrapper[4762]: I1014 13:12:59.982329 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-8644c46667-cg62m" podUID="c6c635b4-3d81-46f5-8f71-18a213b49c55" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:00.073510 master-2 kubenswrapper[4762]: I1014 13:13:00.073422 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-guard-master-2" event={"ID":"a1d6199c-769e-4363-8439-75d433c50528","Type":"ContainerStarted","Data":"7e0f26480c8259a3d5d450067a5a4921e396fac543722157e39f3b52c8b0c8db"} Oct 14 13:13:00.073510 master-2 kubenswrapper[4762]: I1014 13:13:00.073495 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-guard-master-2" event={"ID":"a1d6199c-769e-4363-8439-75d433c50528","Type":"ContainerStarted","Data":"42d422a5df206ec9af76e537c119ac982500054313bd9a18b5b7f9168025e604"} Oct 14 13:13:00.073910 master-2 kubenswrapper[4762]: I1014 13:13:00.073837 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-guard-master-2" Oct 14 13:13:00.079187 master-2 kubenswrapper[4762]: I1014 13:13:00.079080 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b849490b-3d5b-41e5-90b7-7ad12ab8182f-kubelet-dir\") pod \"installer-5-master-2\" (UID: \"b849490b-3d5b-41e5-90b7-7ad12ab8182f\") " pod="openshift-kube-scheduler/installer-5-master-2" Oct 14 13:13:00.079282 master-2 kubenswrapper[4762]: I1014 13:13:00.079221 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b849490b-3d5b-41e5-90b7-7ad12ab8182f-kubelet-dir\") pod \"installer-5-master-2\" (UID: \"b849490b-3d5b-41e5-90b7-7ad12ab8182f\") " pod="openshift-kube-scheduler/installer-5-master-2" Oct 14 13:13:00.079363 master-2 kubenswrapper[4762]: I1014 13:13:00.079228 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/b849490b-3d5b-41e5-90b7-7ad12ab8182f-kube-api-access\") pod \"installer-5-master-2\" (UID: \"b849490b-3d5b-41e5-90b7-7ad12ab8182f\") " pod="openshift-kube-scheduler/installer-5-master-2" Oct 14 13:13:00.079524 master-2 kubenswrapper[4762]: I1014 13:13:00.079462 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b849490b-3d5b-41e5-90b7-7ad12ab8182f-var-lock\") pod \"installer-5-master-2\" (UID: \"b849490b-3d5b-41e5-90b7-7ad12ab8182f\") " pod="openshift-kube-scheduler/installer-5-master-2" Oct 14 13:13:00.079702 master-2 kubenswrapper[4762]: I1014 13:13:00.079637 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b849490b-3d5b-41e5-90b7-7ad12ab8182f-var-lock\") pod \"installer-5-master-2\" (UID: \"b849490b-3d5b-41e5-90b7-7ad12ab8182f\") " pod="openshift-kube-scheduler/installer-5-master-2" Oct 14 13:13:00.082119 master-2 kubenswrapper[4762]: I1014 13:13:00.082062 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-guard-master-2" Oct 14 13:13:00.099452 master-2 kubenswrapper[4762]: I1014 13:13:00.099350 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-guard-master-2" podStartSLOduration=2.099321739 podStartE2EDuration="2.099321739s" podCreationTimestamp="2025-10-14 13:12:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:13:00.09471768 +0000 UTC m=+409.338876879" watchObservedRunningTime="2025-10-14 13:13:00.099321739 +0000 UTC m=+409.343480898" Oct 14 13:13:00.101836 master-2 kubenswrapper[4762]: I1014 13:13:00.101772 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b849490b-3d5b-41e5-90b7-7ad12ab8182f-kube-api-access\") pod \"installer-5-master-2\" (UID: \"b849490b-3d5b-41e5-90b7-7ad12ab8182f\") " pod="openshift-kube-scheduler/installer-5-master-2" Oct 14 13:13:00.124919 master-2 kubenswrapper[4762]: I1014 13:13:00.124864 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-2" Oct 14 13:13:00.615886 master-2 kubenswrapper[4762]: I1014 13:13:00.615831 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-5-master-2"] Oct 14 13:13:00.620820 master-2 kubenswrapper[4762]: W1014 13:13:00.620766 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb849490b_3d5b_41e5_90b7_7ad12ab8182f.slice/crio-f2aa8db0a1c5c8e0b729c36258bdaf67a236794589436d269b88306db4bc6404 WatchSource:0}: Error finding container f2aa8db0a1c5c8e0b729c36258bdaf67a236794589436d269b88306db4bc6404: Status 404 returned error can't find the container with id f2aa8db0a1c5c8e0b729c36258bdaf67a236794589436d269b88306db4bc6404 Oct 14 13:13:00.898355 master-2 kubenswrapper[4762]: I1014 13:13:00.898116 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:13:00.898355 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:13:00.898355 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:13:00.898355 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:13:00.898355 master-2 kubenswrapper[4762]: I1014 13:13:00.898253 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:01.082625 master-2 kubenswrapper[4762]: I1014 13:13:01.082519 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-2" event={"ID":"b849490b-3d5b-41e5-90b7-7ad12ab8182f","Type":"ContainerStarted","Data":"f2aa8db0a1c5c8e0b729c36258bdaf67a236794589436d269b88306db4bc6404"} Oct 14 13:13:01.899461 master-2 kubenswrapper[4762]: I1014 13:13:01.898736 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:13:01.899461 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:13:01.899461 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:13:01.899461 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:13:01.899461 master-2 kubenswrapper[4762]: I1014 13:13:01.898827 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:02.725430 master-2 kubenswrapper[4762]: I1014 13:13:02.725335 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-2-master-2"] Oct 14 13:13:02.726370 master-2 kubenswrapper[4762]: I1014 13:13:02.726302 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-2" Oct 14 13:13:02.729345 master-2 kubenswrapper[4762]: I1014 13:13:02.729295 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-p7d8w" Oct 14 13:13:02.735344 master-2 kubenswrapper[4762]: I1014 13:13:02.735275 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-2-master-2"] Oct 14 13:13:02.811734 master-2 kubenswrapper[4762]: I1014 13:13:02.811674 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e13f3e45-aacc-4bcb-b326-7ea636019144-kubelet-dir\") pod \"installer-2-master-2\" (UID: \"e13f3e45-aacc-4bcb-b326-7ea636019144\") " pod="openshift-kube-apiserver/installer-2-master-2" Oct 14 13:13:02.811888 master-2 kubenswrapper[4762]: I1014 13:13:02.811769 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e13f3e45-aacc-4bcb-b326-7ea636019144-kube-api-access\") pod \"installer-2-master-2\" (UID: \"e13f3e45-aacc-4bcb-b326-7ea636019144\") " pod="openshift-kube-apiserver/installer-2-master-2" Oct 14 13:13:02.811888 master-2 kubenswrapper[4762]: I1014 13:13:02.811828 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e13f3e45-aacc-4bcb-b326-7ea636019144-var-lock\") pod \"installer-2-master-2\" (UID: \"e13f3e45-aacc-4bcb-b326-7ea636019144\") " pod="openshift-kube-apiserver/installer-2-master-2" Oct 14 13:13:02.896439 master-2 kubenswrapper[4762]: I1014 13:13:02.896370 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:13:02.896439 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:13:02.896439 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:13:02.896439 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:13:02.896439 master-2 kubenswrapper[4762]: I1014 13:13:02.896432 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:02.913181 master-2 kubenswrapper[4762]: I1014 13:13:02.913109 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e13f3e45-aacc-4bcb-b326-7ea636019144-kubelet-dir\") pod \"installer-2-master-2\" (UID: \"e13f3e45-aacc-4bcb-b326-7ea636019144\") " pod="openshift-kube-apiserver/installer-2-master-2" Oct 14 13:13:02.913821 master-2 kubenswrapper[4762]: I1014 13:13:02.913191 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e13f3e45-aacc-4bcb-b326-7ea636019144-kube-api-access\") pod \"installer-2-master-2\" (UID: \"e13f3e45-aacc-4bcb-b326-7ea636019144\") " pod="openshift-kube-apiserver/installer-2-master-2" Oct 14 13:13:02.913821 master-2 kubenswrapper[4762]: I1014 13:13:02.913267 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" 
(UniqueName: \"kubernetes.io/host-path/e13f3e45-aacc-4bcb-b326-7ea636019144-var-lock\") pod \"installer-2-master-2\" (UID: \"e13f3e45-aacc-4bcb-b326-7ea636019144\") " pod="openshift-kube-apiserver/installer-2-master-2" Oct 14 13:13:02.913821 master-2 kubenswrapper[4762]: I1014 13:13:02.913286 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e13f3e45-aacc-4bcb-b326-7ea636019144-kubelet-dir\") pod \"installer-2-master-2\" (UID: \"e13f3e45-aacc-4bcb-b326-7ea636019144\") " pod="openshift-kube-apiserver/installer-2-master-2" Oct 14 13:13:02.913821 master-2 kubenswrapper[4762]: I1014 13:13:02.913505 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e13f3e45-aacc-4bcb-b326-7ea636019144-var-lock\") pod \"installer-2-master-2\" (UID: \"e13f3e45-aacc-4bcb-b326-7ea636019144\") " pod="openshift-kube-apiserver/installer-2-master-2" Oct 14 13:13:02.945013 master-2 kubenswrapper[4762]: I1014 13:13:02.944926 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e13f3e45-aacc-4bcb-b326-7ea636019144-kube-api-access\") pod \"installer-2-master-2\" (UID: \"e13f3e45-aacc-4bcb-b326-7ea636019144\") " pod="openshift-kube-apiserver/installer-2-master-2" Oct 14 13:13:03.052242 master-2 kubenswrapper[4762]: I1014 13:13:03.052023 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-2" Oct 14 13:13:03.095357 master-2 kubenswrapper[4762]: I1014 13:13:03.095283 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-2" event={"ID":"b849490b-3d5b-41e5-90b7-7ad12ab8182f","Type":"ContainerStarted","Data":"97ba3a7b61aa0fcca55607f6bf9c5e80a06bdddde15e3219943b19c39a37b460"} Oct 14 13:13:03.123302 master-2 kubenswrapper[4762]: I1014 13:13:03.123181 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-5-master-2" podStartSLOduration=2.320326213 podStartE2EDuration="4.123139925s" podCreationTimestamp="2025-10-14 13:12:59 +0000 UTC" firstStartedPulling="2025-10-14 13:13:00.623677069 +0000 UTC m=+409.867836248" lastFinishedPulling="2025-10-14 13:13:02.426490791 +0000 UTC m=+411.670649960" observedRunningTime="2025-10-14 13:13:03.121022766 +0000 UTC m=+412.365182005" watchObservedRunningTime="2025-10-14 13:13:03.123139925 +0000 UTC m=+412.367299104" Oct 14 13:13:03.497957 master-2 kubenswrapper[4762]: I1014 13:13:03.497894 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-2-master-2"] Oct 14 13:13:03.503342 master-2 kubenswrapper[4762]: W1014 13:13:03.503281 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode13f3e45_aacc_4bcb_b326_7ea636019144.slice/crio-92bf8e0a95827d6d3b6a0cd1fa95e008f9fba1c31bf6bb60161fb040e9781fe3 WatchSource:0}: Error finding container 92bf8e0a95827d6d3b6a0cd1fa95e008f9fba1c31bf6bb60161fb040e9781fe3: Status 404 returned error can't find the container with id 92bf8e0a95827d6d3b6a0cd1fa95e008f9fba1c31bf6bb60161fb040e9781fe3 Oct 14 13:13:03.896746 master-2 kubenswrapper[4762]: I1014 13:13:03.896704 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Oct 14 13:13:03.896746 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:13:03.896746 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:13:03.896746 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:13:03.897191 master-2 kubenswrapper[4762]: I1014 13:13:03.896758 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:04.102915 master-2 kubenswrapper[4762]: I1014 13:13:04.102805 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-2" event={"ID":"e13f3e45-aacc-4bcb-b326-7ea636019144","Type":"ContainerStarted","Data":"471dd3f7e32698b467618a79e5a9208a0a0694b26a61ecac0d985b8a48506bbe"} Oct 14 13:13:04.103596 master-2 kubenswrapper[4762]: I1014 13:13:04.103530 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-2" event={"ID":"e13f3e45-aacc-4bcb-b326-7ea636019144","Type":"ContainerStarted","Data":"92bf8e0a95827d6d3b6a0cd1fa95e008f9fba1c31bf6bb60161fb040e9781fe3"} Oct 14 13:13:04.126641 master-2 kubenswrapper[4762]: I1014 13:13:04.126530 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-2-master-2" podStartSLOduration=2.126505642 podStartE2EDuration="2.126505642s" podCreationTimestamp="2025-10-14 13:13:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:13:04.124310522 +0000 UTC m=+413.368469741" watchObservedRunningTime="2025-10-14 13:13:04.126505642 +0000 UTC m=+413.370664801" Oct 14 13:13:04.898344 master-2 kubenswrapper[4762]: I1014 13:13:04.898259 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:13:04.898344 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:13:04.898344 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:13:04.898344 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:13:04.898781 master-2 kubenswrapper[4762]: I1014 13:13:04.898360 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:04.985230 master-2 kubenswrapper[4762]: I1014 13:13:04.985187 4762 patch_prober.go:28] interesting pod/apiserver-8644c46667-cg62m container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:13:04.985230 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:13:04.985230 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:13:04.985230 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:13:04.985230 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:13:04.985230 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:13:04.985230 master-2 kubenswrapper[4762]: 
[+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:13:04.985230 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:13:04.985230 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:13:04.985230 master-2 kubenswrapper[4762]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 14 13:13:04.985230 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok Oct 14 13:13:04.985230 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Oct 14 13:13:04.985230 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectcache ok Oct 14 13:13:04.985230 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 14 13:13:04.985230 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startinformers ok Oct 14 13:13:04.985230 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 14 13:13:04.985230 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:13:04.985230 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:13:04.985230 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:13:04.986076 master-2 kubenswrapper[4762]: I1014 13:13:04.985971 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-8644c46667-cg62m" podUID="c6c635b4-3d81-46f5-8f71-18a213b49c55" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:04.986139 master-2 kubenswrapper[4762]: I1014 13:13:04.986095 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-8644c46667-cg62m" Oct 14 13:13:05.897762 master-2 kubenswrapper[4762]: I1014 13:13:05.897636 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:13:05.897762 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:13:05.897762 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:13:05.897762 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:13:05.898809 master-2 kubenswrapper[4762]: I1014 13:13:05.897763 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:06.325803 master-2 kubenswrapper[4762]: I1014 13:13:06.325700 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-guard-master-2"] Oct 14 13:13:06.897198 master-2 kubenswrapper[4762]: I1014 13:13:06.897063 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:13:06.897198 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:13:06.897198 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:13:06.897198 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:13:06.897683 master-2 kubenswrapper[4762]: I1014 
13:13:06.897218 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:07.898358 master-2 kubenswrapper[4762]: I1014 13:13:07.898249 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:13:07.898358 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:13:07.898358 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:13:07.898358 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:13:07.899315 master-2 kubenswrapper[4762]: I1014 13:13:07.898354 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:08.126282 master-2 kubenswrapper[4762]: I1014 13:13:08.126208 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-2" Oct 14 13:13:08.897835 master-2 kubenswrapper[4762]: I1014 13:13:08.897689 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:13:08.897835 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:13:08.897835 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:13:08.897835 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:13:08.897835 master-2 kubenswrapper[4762]: I1014 13:13:08.897830 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:09.897059 master-2 kubenswrapper[4762]: I1014 13:13:09.896971 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:13:09.897059 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:13:09.897059 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:13:09.897059 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:13:09.897059 master-2 kubenswrapper[4762]: I1014 13:13:09.897030 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:09.982475 master-2 kubenswrapper[4762]: I1014 13:13:09.982396 4762 patch_prober.go:28] interesting pod/apiserver-8644c46667-cg62m container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:13:09.982475 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:13:09.982475 master-2 
kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:13:09.982475 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:13:09.982475 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:13:09.982475 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:13:09.982475 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:13:09.982475 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:13:09.982475 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:13:09.982475 master-2 kubenswrapper[4762]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 14 13:13:09.982475 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok Oct 14 13:13:09.982475 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Oct 14 13:13:09.982475 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectcache ok Oct 14 13:13:09.982475 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 14 13:13:09.982475 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startinformers ok Oct 14 13:13:09.982475 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 14 13:13:09.982475 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:13:09.982475 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:13:09.982475 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:13:09.982475 master-2 kubenswrapper[4762]: I1014 13:13:09.982471 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-8644c46667-cg62m" podUID="c6c635b4-3d81-46f5-8f71-18a213b49c55" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:10.110634 master-2 kubenswrapper[4762]: I1014 13:13:10.110534 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_machine-config-server-zxkbj_063758c3-98fe-4ac4-b1c2-beef7ef6dcdc/machine-config-server/0.log" Oct 14 13:13:10.897268 master-2 kubenswrapper[4762]: I1014 13:13:10.897183 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:13:10.897268 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:13:10.897268 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:13:10.897268 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:13:10.897268 master-2 kubenswrapper[4762]: I1014 13:13:10.897259 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:11.897097 master-2 kubenswrapper[4762]: I1014 13:13:11.896896 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:13:11.897097 master-2 kubenswrapper[4762]: 
[-]has-synced failed: reason withheld Oct 14 13:13:11.897097 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:13:11.897097 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:13:11.897097 master-2 kubenswrapper[4762]: I1014 13:13:11.896997 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:12.896824 master-2 kubenswrapper[4762]: I1014 13:13:12.896705 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:13:12.896824 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:13:12.896824 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:13:12.896824 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:13:12.897617 master-2 kubenswrapper[4762]: I1014 13:13:12.896876 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:13.897658 master-2 kubenswrapper[4762]: I1014 13:13:13.897559 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:13:13.897658 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:13:13.897658 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:13:13.897658 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:13:13.897658 master-2 kubenswrapper[4762]: I1014 13:13:13.897654 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:14.897586 master-2 kubenswrapper[4762]: I1014 13:13:14.897513 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:13:14.897586 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:13:14.897586 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:13:14.897586 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:13:14.898662 master-2 kubenswrapper[4762]: I1014 13:13:14.898150 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:14.986746 master-2 kubenswrapper[4762]: I1014 13:13:14.986645 4762 patch_prober.go:28] interesting pod/apiserver-8644c46667-cg62m container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:13:14.986746 master-2 kubenswrapper[4762]: [+]log ok Oct 14 
13:13:14.986746 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:13:14.986746 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:13:14.986746 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:13:14.986746 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:13:14.986746 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:13:14.986746 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:13:14.986746 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:13:14.986746 master-2 kubenswrapper[4762]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 14 13:13:14.986746 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok Oct 14 13:13:14.986746 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Oct 14 13:13:14.986746 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectcache ok Oct 14 13:13:14.986746 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 14 13:13:14.986746 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startinformers ok Oct 14 13:13:14.986746 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 14 13:13:14.986746 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:13:14.986746 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:13:14.986746 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:13:14.986746 master-2 kubenswrapper[4762]: I1014 13:13:14.986739 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-8644c46667-cg62m" podUID="c6c635b4-3d81-46f5-8f71-18a213b49c55" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:15.898323 master-2 kubenswrapper[4762]: I1014 13:13:15.898213 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:13:15.898323 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:13:15.898323 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:13:15.898323 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:13:15.898323 master-2 kubenswrapper[4762]: I1014 13:13:15.898308 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:16.897370 master-2 kubenswrapper[4762]: I1014 13:13:16.897315 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:13:16.897370 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:13:16.897370 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:13:16.897370 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:13:16.898410 master-2 kubenswrapper[4762]: I1014 
13:13:16.897395 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:17.898764 master-2 kubenswrapper[4762]: I1014 13:13:17.898701 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:13:17.898764 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:13:17.898764 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:13:17.898764 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:13:17.899928 master-2 kubenswrapper[4762]: I1014 13:13:17.899427 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:18.897592 master-2 kubenswrapper[4762]: I1014 13:13:18.897514 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:13:18.897592 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:13:18.897592 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:13:18.897592 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:13:18.898005 master-2 kubenswrapper[4762]: I1014 13:13:18.897599 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:19.898811 master-2 kubenswrapper[4762]: I1014 13:13:19.898671 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:13:19.898811 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:13:19.898811 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:13:19.898811 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:13:19.898811 master-2 kubenswrapper[4762]: I1014 13:13:19.898801 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:19.983415 master-2 kubenswrapper[4762]: I1014 13:13:19.983295 4762 patch_prober.go:28] interesting pod/apiserver-8644c46667-cg62m container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:13:19.983415 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:13:19.983415 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:13:19.983415 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:13:19.983415 master-2 kubenswrapper[4762]: 
[+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:13:19.983415 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:13:19.983415 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:13:19.983415 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:13:19.983415 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:13:19.983415 master-2 kubenswrapper[4762]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 14 13:13:19.983415 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok Oct 14 13:13:19.983415 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Oct 14 13:13:19.983415 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectcache ok Oct 14 13:13:19.983415 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 14 13:13:19.983415 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startinformers ok Oct 14 13:13:19.983415 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 14 13:13:19.983415 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:13:19.983415 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:13:19.983415 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:13:19.983415 master-2 kubenswrapper[4762]: I1014 13:13:19.983393 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-8644c46667-cg62m" podUID="c6c635b4-3d81-46f5-8f71-18a213b49c55" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:20.897841 master-2 kubenswrapper[4762]: I1014 13:13:20.897725 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:13:20.897841 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:13:20.897841 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:13:20.897841 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:13:20.898346 master-2 kubenswrapper[4762]: I1014 13:13:20.897881 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:21.897198 master-2 kubenswrapper[4762]: I1014 13:13:21.897073 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:13:21.897198 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:13:21.897198 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:13:21.897198 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:13:21.898525 master-2 kubenswrapper[4762]: I1014 13:13:21.897206 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:22.898953 master-2 kubenswrapper[4762]: I1014 13:13:22.898804 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:13:22.898953 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:13:22.898953 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:13:22.898953 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:13:22.900039 master-2 kubenswrapper[4762]: I1014 13:13:22.899221 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:23.898020 master-2 kubenswrapper[4762]: I1014 13:13:23.897874 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:13:23.898020 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:13:23.898020 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:13:23.898020 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:13:23.898020 master-2 kubenswrapper[4762]: I1014 13:13:23.897968 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:24.898238 master-2 kubenswrapper[4762]: I1014 13:13:24.898129 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:13:24.898238 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:13:24.898238 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:13:24.898238 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:13:24.899214 master-2 kubenswrapper[4762]: I1014 13:13:24.898256 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:24.985278 master-2 kubenswrapper[4762]: I1014 13:13:24.985138 4762 patch_prober.go:28] interesting pod/apiserver-8644c46667-cg62m container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:13:24.985278 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:13:24.985278 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:13:24.985278 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:13:24.985278 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:13:24.985278 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:13:24.985278 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok 
Oct 14 13:13:24.985278 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:13:24.985278 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:13:24.985278 master-2 kubenswrapper[4762]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 14 13:13:24.985278 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok Oct 14 13:13:24.985278 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Oct 14 13:13:24.985278 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectcache ok Oct 14 13:13:24.985278 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 14 13:13:24.985278 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startinformers ok Oct 14 13:13:24.985278 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 14 13:13:24.985278 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:13:24.985278 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:13:24.985278 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:13:24.985278 master-2 kubenswrapper[4762]: I1014 13:13:24.985269 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-8644c46667-cg62m" podUID="c6c635b4-3d81-46f5-8f71-18a213b49c55" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:25.514837 master-2 kubenswrapper[4762]: I1014 13:13:25.514727 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-oauth-apiserver/apiserver-96c4c446c-728v2"] Oct 14 13:13:25.515769 master-2 kubenswrapper[4762]: I1014 13:13:25.515227 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-oauth-apiserver/apiserver-96c4c446c-728v2" podUID="fdac5df3-de02-49f0-8b90-53464ca0b6dd" containerName="oauth-apiserver" containerID="cri-o://2a29595ee8fa59a1d5c479b5f3ce38436d4d22c41f1bcc1270e6b5ac70185385" gracePeriod=120 Oct 14 13:13:25.898836 master-2 kubenswrapper[4762]: I1014 13:13:25.898675 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:13:25.898836 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:13:25.898836 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:13:25.898836 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:13:25.898836 master-2 kubenswrapper[4762]: I1014 13:13:25.898767 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:26.897000 master-2 kubenswrapper[4762]: I1014 13:13:26.896910 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:13:26.897000 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:13:26.897000 master-2 kubenswrapper[4762]: [+]process-running ok 
Oct 14 13:13:26.897000 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:13:26.897495 master-2 kubenswrapper[4762]: I1014 13:13:26.896997 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:27.898469 master-2 kubenswrapper[4762]: I1014 13:13:27.898413 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:13:27.898469 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:13:27.898469 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:13:27.898469 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:13:27.898469 master-2 kubenswrapper[4762]: I1014 13:13:27.898482 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:28.897851 master-2 kubenswrapper[4762]: I1014 13:13:28.897758 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:13:28.897851 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:13:28.897851 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:13:28.897851 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:13:28.898308 master-2 kubenswrapper[4762]: I1014 13:13:28.897888 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:28.970123 master-2 kubenswrapper[4762]: I1014 13:13:28.970036 4762 patch_prober.go:28] interesting pod/apiserver-96c4c446c-728v2 container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:13:28.970123 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:13:28.970123 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:13:28.970123 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:13:28.970123 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:13:28.970123 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:13:28.970123 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:13:28.970123 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:13:28.970123 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:13:28.970123 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartUserInformer ok Oct 14 13:13:28.970123 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartOAuthInformer ok Oct 14 13:13:28.970123 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Oct 14 13:13:28.970123 master-2 
kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:13:28.970123 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:13:28.971207 master-2 kubenswrapper[4762]: I1014 13:13:28.970130 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-96c4c446c-728v2" podUID="fdac5df3-de02-49f0-8b90-53464ca0b6dd" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:29.897380 master-2 kubenswrapper[4762]: I1014 13:13:29.897303 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:13:29.897380 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:13:29.897380 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:13:29.897380 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:13:29.898053 master-2 kubenswrapper[4762]: I1014 13:13:29.897400 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:29.984459 master-2 kubenswrapper[4762]: I1014 13:13:29.984353 4762 patch_prober.go:28] interesting pod/apiserver-8644c46667-cg62m container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:13:29.984459 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:13:29.984459 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:13:29.984459 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:13:29.984459 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:13:29.984459 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:13:29.984459 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:13:29.984459 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:13:29.984459 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:13:29.984459 master-2 kubenswrapper[4762]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 14 13:13:29.984459 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok Oct 14 13:13:29.984459 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Oct 14 13:13:29.984459 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectcache ok Oct 14 13:13:29.984459 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 14 13:13:29.984459 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startinformers ok Oct 14 13:13:29.984459 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 14 13:13:29.984459 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:13:29.984459 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:13:29.984459 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:13:29.986448 master-2 kubenswrapper[4762]: I1014 13:13:29.984507 4762 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-apiserver/apiserver-8644c46667-cg62m" podUID="c6c635b4-3d81-46f5-8f71-18a213b49c55" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:30.897637 master-2 kubenswrapper[4762]: I1014 13:13:30.897567 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:13:30.897637 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:13:30.897637 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:13:30.897637 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:13:30.898196 master-2 kubenswrapper[4762]: I1014 13:13:30.897644 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:31.898837 master-2 kubenswrapper[4762]: I1014 13:13:31.898745 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:13:31.898837 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:13:31.898837 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:13:31.898837 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:13:31.899793 master-2 kubenswrapper[4762]: I1014 13:13:31.898845 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:32.897829 master-2 kubenswrapper[4762]: I1014 13:13:32.897744 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:13:32.897829 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:13:32.897829 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:13:32.897829 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:13:32.897829 master-2 kubenswrapper[4762]: I1014 13:13:32.897830 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:33.897574 master-2 kubenswrapper[4762]: I1014 13:13:33.897445 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:13:33.897574 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:13:33.897574 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:13:33.897574 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:13:33.898768 master-2 kubenswrapper[4762]: I1014 13:13:33.897585 4762 prober.go:107] 
"Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:33.968855 master-2 kubenswrapper[4762]: I1014 13:13:33.968770 4762 patch_prober.go:28] interesting pod/apiserver-96c4c446c-728v2 container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:13:33.968855 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:13:33.968855 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:13:33.968855 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:13:33.968855 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:13:33.968855 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:13:33.968855 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:13:33.968855 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:13:33.968855 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:13:33.968855 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartUserInformer ok Oct 14 13:13:33.968855 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartOAuthInformer ok Oct 14 13:13:33.968855 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Oct 14 13:13:33.968855 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:13:33.968855 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:13:33.969722 master-2 kubenswrapper[4762]: I1014 13:13:33.968864 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-96c4c446c-728v2" podUID="fdac5df3-de02-49f0-8b90-53464ca0b6dd" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:34.210921 master-2 kubenswrapper[4762]: I1014 13:13:34.210840 4762 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-2"] Oct 14 13:13:34.212636 master-2 kubenswrapper[4762]: I1014 13:13:34.212605 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-2" Oct 14 13:13:34.258130 master-2 kubenswrapper[4762]: I1014 13:13:34.258026 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-2"] Oct 14 13:13:34.337127 master-2 kubenswrapper[4762]: I1014 13:13:34.337051 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f26cf13b1c8c4f1b57c0ac506ef256a4-cert-dir\") pod \"openshift-kube-scheduler-master-2\" (UID: \"f26cf13b1c8c4f1b57c0ac506ef256a4\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-2" Oct 14 13:13:34.337127 master-2 kubenswrapper[4762]: I1014 13:13:34.337109 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f26cf13b1c8c4f1b57c0ac506ef256a4-resource-dir\") pod \"openshift-kube-scheduler-master-2\" (UID: \"f26cf13b1c8c4f1b57c0ac506ef256a4\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-2" Oct 14 13:13:34.439261 master-2 kubenswrapper[4762]: I1014 13:13:34.439197 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f26cf13b1c8c4f1b57c0ac506ef256a4-cert-dir\") pod \"openshift-kube-scheduler-master-2\" (UID: \"f26cf13b1c8c4f1b57c0ac506ef256a4\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-2" Oct 14 13:13:34.439450 master-2 kubenswrapper[4762]: I1014 13:13:34.439271 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f26cf13b1c8c4f1b57c0ac506ef256a4-resource-dir\") pod \"openshift-kube-scheduler-master-2\" (UID: \"f26cf13b1c8c4f1b57c0ac506ef256a4\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-2" Oct 14 13:13:34.439524 master-2 kubenswrapper[4762]: I1014 13:13:34.439450 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f26cf13b1c8c4f1b57c0ac506ef256a4-resource-dir\") pod \"openshift-kube-scheduler-master-2\" (UID: \"f26cf13b1c8c4f1b57c0ac506ef256a4\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-2" Oct 14 13:13:34.439754 master-2 kubenswrapper[4762]: I1014 13:13:34.439673 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f26cf13b1c8c4f1b57c0ac506ef256a4-cert-dir\") pod \"openshift-kube-scheduler-master-2\" (UID: \"f26cf13b1c8c4f1b57c0ac506ef256a4\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-2" Oct 14 13:13:34.554836 master-2 kubenswrapper[4762]: I1014 13:13:34.554646 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-2" Oct 14 13:13:34.583480 master-2 kubenswrapper[4762]: W1014 13:13:34.583405 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf26cf13b1c8c4f1b57c0ac506ef256a4.slice/crio-8e592b1bf9f432b9d2baa935fba33de5c17bffc7cb259c1dcdc83e0eb734bd8a WatchSource:0}: Error finding container 8e592b1bf9f432b9d2baa935fba33de5c17bffc7cb259c1dcdc83e0eb734bd8a: Status 404 returned error can't find the container with id 8e592b1bf9f432b9d2baa935fba33de5c17bffc7cb259c1dcdc83e0eb734bd8a Oct 14 13:13:34.896389 master-2 kubenswrapper[4762]: I1014 13:13:34.896294 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:13:34.896389 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:13:34.896389 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:13:34.896389 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:13:34.896884 master-2 kubenswrapper[4762]: I1014 13:13:34.896407 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:34.982189 master-2 kubenswrapper[4762]: I1014 13:13:34.982077 4762 patch_prober.go:28] interesting pod/apiserver-8644c46667-cg62m container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:13:34.982189 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:13:34.982189 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:13:34.982189 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:13:34.982189 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:13:34.982189 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:13:34.982189 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:13:34.982189 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:13:34.982189 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:13:34.982189 master-2 kubenswrapper[4762]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 14 13:13:34.982189 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok Oct 14 13:13:34.982189 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Oct 14 13:13:34.982189 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectcache ok Oct 14 13:13:34.982189 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 14 13:13:34.982189 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startinformers ok Oct 14 13:13:34.982189 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 14 13:13:34.982189 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:13:34.982189 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 
13:13:34.982189 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:13:34.982189 master-2 kubenswrapper[4762]: I1014 13:13:34.982180 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-8644c46667-cg62m" podUID="c6c635b4-3d81-46f5-8f71-18a213b49c55" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:35.320025 master-2 kubenswrapper[4762]: I1014 13:13:35.319948 4762 generic.go:334] "Generic (PLEG): container finished" podID="b849490b-3d5b-41e5-90b7-7ad12ab8182f" containerID="97ba3a7b61aa0fcca55607f6bf9c5e80a06bdddde15e3219943b19c39a37b460" exitCode=0 Oct 14 13:13:35.320394 master-2 kubenswrapper[4762]: I1014 13:13:35.320069 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-2" event={"ID":"b849490b-3d5b-41e5-90b7-7ad12ab8182f","Type":"ContainerDied","Data":"97ba3a7b61aa0fcca55607f6bf9c5e80a06bdddde15e3219943b19c39a37b460"} Oct 14 13:13:35.322568 master-2 kubenswrapper[4762]: I1014 13:13:35.322505 4762 generic.go:334] "Generic (PLEG): container finished" podID="f26cf13b1c8c4f1b57c0ac506ef256a4" containerID="8fcbe4e7b616f9e023c4c6ab447171662a89a0f5f8d78ea5663544b61a533bff" exitCode=0 Oct 14 13:13:35.322704 master-2 kubenswrapper[4762]: I1014 13:13:35.322580 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-2" event={"ID":"f26cf13b1c8c4f1b57c0ac506ef256a4","Type":"ContainerDied","Data":"8fcbe4e7b616f9e023c4c6ab447171662a89a0f5f8d78ea5663544b61a533bff"} Oct 14 13:13:35.322704 master-2 kubenswrapper[4762]: I1014 13:13:35.322658 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-2" event={"ID":"f26cf13b1c8c4f1b57c0ac506ef256a4","Type":"ContainerStarted","Data":"8e592b1bf9f432b9d2baa935fba33de5c17bffc7cb259c1dcdc83e0eb734bd8a"} Oct 14 13:13:35.897268 master-2 kubenswrapper[4762]: I1014 13:13:35.897214 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:13:35.897268 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:13:35.897268 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:13:35.897268 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:13:35.897458 master-2 kubenswrapper[4762]: I1014 13:13:35.897277 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:36.345870 master-2 kubenswrapper[4762]: I1014 13:13:36.345772 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-2" event={"ID":"f26cf13b1c8c4f1b57c0ac506ef256a4","Type":"ContainerStarted","Data":"2e6e16be70c882739cd9ce4e47fdaafa968150fcb22d8ff2377d2049b8b3beef"} Oct 14 13:13:36.346432 master-2 kubenswrapper[4762]: I1014 13:13:36.345890 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-2" event={"ID":"f26cf13b1c8c4f1b57c0ac506ef256a4","Type":"ContainerStarted","Data":"be6e28bd40aab1a8ebbfd27ebfe4eb591ba0b55eae92170c1b9f8a0a8a41306c"} Oct 14 13:13:36.346432 master-2 
kubenswrapper[4762]: I1014 13:13:36.345945 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-2" event={"ID":"f26cf13b1c8c4f1b57c0ac506ef256a4","Type":"ContainerStarted","Data":"0da1f39434f36e2d44c9684fdf09ed6e485b933407a1476b3fb79bad430550c2"} Oct 14 13:13:36.346432 master-2 kubenswrapper[4762]: I1014 13:13:36.345999 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-2" Oct 14 13:13:36.375237 master-2 kubenswrapper[4762]: I1014 13:13:36.374674 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-2" podStartSLOduration=2.374649158 podStartE2EDuration="2.374649158s" podCreationTimestamp="2025-10-14 13:13:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:13:36.369086139 +0000 UTC m=+445.613245338" watchObservedRunningTime="2025-10-14 13:13:36.374649158 +0000 UTC m=+445.618808327" Oct 14 13:13:36.670062 master-2 kubenswrapper[4762]: I1014 13:13:36.670003 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-2" Oct 14 13:13:36.764118 master-2 kubenswrapper[4762]: I1014 13:13:36.764029 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b849490b-3d5b-41e5-90b7-7ad12ab8182f-var-lock\") pod \"b849490b-3d5b-41e5-90b7-7ad12ab8182f\" (UID: \"b849490b-3d5b-41e5-90b7-7ad12ab8182f\") " Oct 14 13:13:36.764118 master-2 kubenswrapper[4762]: I1014 13:13:36.764111 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b849490b-3d5b-41e5-90b7-7ad12ab8182f-kubelet-dir\") pod \"b849490b-3d5b-41e5-90b7-7ad12ab8182f\" (UID: \"b849490b-3d5b-41e5-90b7-7ad12ab8182f\") " Oct 14 13:13:36.764429 master-2 kubenswrapper[4762]: I1014 13:13:36.764236 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b849490b-3d5b-41e5-90b7-7ad12ab8182f-kube-api-access\") pod \"b849490b-3d5b-41e5-90b7-7ad12ab8182f\" (UID: \"b849490b-3d5b-41e5-90b7-7ad12ab8182f\") " Oct 14 13:13:36.764429 master-2 kubenswrapper[4762]: I1014 13:13:36.764292 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b849490b-3d5b-41e5-90b7-7ad12ab8182f-var-lock" (OuterVolumeSpecName: "var-lock") pod "b849490b-3d5b-41e5-90b7-7ad12ab8182f" (UID: "b849490b-3d5b-41e5-90b7-7ad12ab8182f"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:13:36.764429 master-2 kubenswrapper[4762]: I1014 13:13:36.764337 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b849490b-3d5b-41e5-90b7-7ad12ab8182f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b849490b-3d5b-41e5-90b7-7ad12ab8182f" (UID: "b849490b-3d5b-41e5-90b7-7ad12ab8182f"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:13:36.764770 master-2 kubenswrapper[4762]: I1014 13:13:36.764724 4762 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b849490b-3d5b-41e5-90b7-7ad12ab8182f-var-lock\") on node \"master-2\" DevicePath \"\"" Oct 14 13:13:36.764770 master-2 kubenswrapper[4762]: I1014 13:13:36.764765 4762 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b849490b-3d5b-41e5-90b7-7ad12ab8182f-kubelet-dir\") on node \"master-2\" DevicePath \"\"" Oct 14 13:13:36.767821 master-2 kubenswrapper[4762]: I1014 13:13:36.767754 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b849490b-3d5b-41e5-90b7-7ad12ab8182f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b849490b-3d5b-41e5-90b7-7ad12ab8182f" (UID: "b849490b-3d5b-41e5-90b7-7ad12ab8182f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:13:36.865774 master-2 kubenswrapper[4762]: I1014 13:13:36.865716 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b849490b-3d5b-41e5-90b7-7ad12ab8182f-kube-api-access\") on node \"master-2\" DevicePath \"\"" Oct 14 13:13:36.897142 master-2 kubenswrapper[4762]: I1014 13:13:36.896939 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:13:36.897142 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:13:36.897142 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:13:36.897142 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:13:36.897142 master-2 kubenswrapper[4762]: I1014 13:13:36.897024 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:37.353831 master-2 kubenswrapper[4762]: I1014 13:13:37.353732 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-2" Oct 14 13:13:37.353831 master-2 kubenswrapper[4762]: I1014 13:13:37.353735 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-2" event={"ID":"b849490b-3d5b-41e5-90b7-7ad12ab8182f","Type":"ContainerDied","Data":"f2aa8db0a1c5c8e0b729c36258bdaf67a236794589436d269b88306db4bc6404"} Oct 14 13:13:37.353831 master-2 kubenswrapper[4762]: I1014 13:13:37.353818 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2aa8db0a1c5c8e0b729c36258bdaf67a236794589436d269b88306db4bc6404" Oct 14 13:13:37.898373 master-2 kubenswrapper[4762]: I1014 13:13:37.898281 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:13:37.898373 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:13:37.898373 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:13:37.898373 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:13:37.898895 master-2 kubenswrapper[4762]: I1014 13:13:37.898391 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:38.898097 master-2 kubenswrapper[4762]: I1014 13:13:38.897999 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:13:38.898097 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:13:38.898097 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:13:38.898097 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:13:38.899023 master-2 kubenswrapper[4762]: I1014 13:13:38.898120 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:38.967267 master-2 kubenswrapper[4762]: I1014 13:13:38.967129 4762 patch_prober.go:28] interesting pod/apiserver-96c4c446c-728v2 container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:13:38.967267 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:13:38.967267 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:13:38.967267 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:13:38.967267 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:13:38.967267 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:13:38.967267 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:13:38.967267 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:13:38.967267 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:13:38.967267 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartUserInformer ok Oct 14 
13:13:38.967267 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartOAuthInformer ok Oct 14 13:13:38.967267 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Oct 14 13:13:38.967267 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:13:38.967267 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:13:38.968555 master-2 kubenswrapper[4762]: I1014 13:13:38.967290 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-96c4c446c-728v2" podUID="fdac5df3-de02-49f0-8b90-53464ca0b6dd" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:38.968555 master-2 kubenswrapper[4762]: I1014 13:13:38.967442 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-96c4c446c-728v2" Oct 14 13:13:39.898206 master-2 kubenswrapper[4762]: I1014 13:13:39.898110 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:13:39.898206 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:13:39.898206 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:13:39.898206 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:13:39.899259 master-2 kubenswrapper[4762]: I1014 13:13:39.898227 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:39.982936 master-2 kubenswrapper[4762]: I1014 13:13:39.982798 4762 patch_prober.go:28] interesting pod/apiserver-8644c46667-cg62m container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:13:39.982936 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:13:39.982936 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:13:39.982936 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:13:39.982936 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:13:39.982936 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:13:39.982936 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:13:39.982936 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:13:39.982936 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:13:39.982936 master-2 kubenswrapper[4762]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 14 13:13:39.982936 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok Oct 14 13:13:39.982936 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Oct 14 13:13:39.982936 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectcache ok Oct 14 13:13:39.982936 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 14 13:13:39.982936 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startinformers ok Oct 14 13:13:39.982936 master-2 kubenswrapper[4762]: 
[+]poststarthook/openshift.io-restmapperupdater ok Oct 14 13:13:39.982936 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:13:39.982936 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:13:39.982936 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:13:39.985218 master-2 kubenswrapper[4762]: I1014 13:13:39.982930 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-8644c46667-cg62m" podUID="c6c635b4-3d81-46f5-8f71-18a213b49c55" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:40.121909 master-2 kubenswrapper[4762]: I1014 13:13:40.121764 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_machine-config-server-zxkbj_063758c3-98fe-4ac4-b1c2-beef7ef6dcdc/machine-config-server/0.log" Oct 14 13:13:40.899266 master-2 kubenswrapper[4762]: I1014 13:13:40.899193 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:13:40.899266 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:13:40.899266 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:13:40.899266 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:13:40.900400 master-2 kubenswrapper[4762]: I1014 13:13:40.899272 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:41.269655 master-2 kubenswrapper[4762]: I1014 13:13:41.269540 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-guard-master-2"] Oct 14 13:13:41.270191 master-2 kubenswrapper[4762]: E1014 13:13:41.269819 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b849490b-3d5b-41e5-90b7-7ad12ab8182f" containerName="installer" Oct 14 13:13:41.270191 master-2 kubenswrapper[4762]: I1014 13:13:41.269841 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="b849490b-3d5b-41e5-90b7-7ad12ab8182f" containerName="installer" Oct 14 13:13:41.270191 master-2 kubenswrapper[4762]: I1014 13:13:41.270004 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="b849490b-3d5b-41e5-90b7-7ad12ab8182f" containerName="installer" Oct 14 13:13:41.271253 master-2 kubenswrapper[4762]: I1014 13:13:41.271088 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-guard-master-2" Oct 14 13:13:41.276950 master-2 kubenswrapper[4762]: I1014 13:13:41.275557 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler"/"default-dockercfg-kszsm" Oct 14 13:13:41.276950 master-2 kubenswrapper[4762]: I1014 13:13:41.275945 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt" Oct 14 13:13:41.277087 master-2 kubenswrapper[4762]: I1014 13:13:41.276980 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"openshift-service-ca.crt" Oct 14 13:13:41.285804 master-2 kubenswrapper[4762]: I1014 13:13:41.285722 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-guard-master-2"] Oct 14 13:13:41.430777 master-2 kubenswrapper[4762]: I1014 13:13:41.430680 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jf86j\" (UniqueName: \"kubernetes.io/projected/da145675-d789-46f4-8036-694602b5efd6-kube-api-access-jf86j\") pod \"openshift-kube-scheduler-guard-master-2\" (UID: \"da145675-d789-46f4-8036-694602b5efd6\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-guard-master-2" Oct 14 13:13:41.531887 master-2 kubenswrapper[4762]: I1014 13:13:41.531737 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jf86j\" (UniqueName: \"kubernetes.io/projected/da145675-d789-46f4-8036-694602b5efd6-kube-api-access-jf86j\") pod \"openshift-kube-scheduler-guard-master-2\" (UID: \"da145675-d789-46f4-8036-694602b5efd6\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-guard-master-2" Oct 14 13:13:41.561190 master-2 kubenswrapper[4762]: I1014 13:13:41.561123 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jf86j\" (UniqueName: \"kubernetes.io/projected/da145675-d789-46f4-8036-694602b5efd6-kube-api-access-jf86j\") pod \"openshift-kube-scheduler-guard-master-2\" (UID: \"da145675-d789-46f4-8036-694602b5efd6\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-guard-master-2" Oct 14 13:13:41.604021 master-2 kubenswrapper[4762]: I1014 13:13:41.603952 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-guard-master-2" Oct 14 13:13:41.898152 master-2 kubenswrapper[4762]: I1014 13:13:41.897552 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:13:41.898152 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:13:41.898152 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:13:41.898152 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:13:41.898152 master-2 kubenswrapper[4762]: I1014 13:13:41.897657 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:41.993281 master-2 kubenswrapper[4762]: I1014 13:13:41.993212 4762 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-2"] Oct 14 13:13:41.994149 master-2 kubenswrapper[4762]: I1014 13:13:41.993766 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-2" podUID="79535145b65a4e1d50292e1c2670257a" containerName="kube-apiserver-check-endpoints" containerID="cri-o://6b10e6673b93c4a192c56727c3cb7534636f9b2a41ce2b6024c0ba9c263d8bdc" gracePeriod=135 Oct 14 13:13:41.994149 master-2 kubenswrapper[4762]: I1014 13:13:41.993878 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-2" podUID="79535145b65a4e1d50292e1c2670257a" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://4a1bfa0a3404825bcf8ebab4cdda7490f674619956b402166636fd113314e8c2" gracePeriod=135 Oct 14 13:13:41.994149 master-2 kubenswrapper[4762]: I1014 13:13:41.994071 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-2" podUID="79535145b65a4e1d50292e1c2670257a" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://d9d2221fab10b9011cadcd36f834cf0835e5aaeb80376ffb7752afdf91447fd8" gracePeriod=135 Oct 14 13:13:41.994492 master-2 kubenswrapper[4762]: I1014 13:13:41.994275 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-2" podUID="79535145b65a4e1d50292e1c2670257a" containerName="kube-apiserver-cert-syncer" containerID="cri-o://5ce4974fc0875864b1435afc4e321ca895fa86762184409dc8aa8ab244db1a12" gracePeriod=135 Oct 14 13:13:41.994492 master-2 kubenswrapper[4762]: I1014 13:13:41.994387 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-2" podUID="79535145b65a4e1d50292e1c2670257a" containerName="kube-apiserver" containerID="cri-o://a00e67168756607c42fb7608d7f9c240e721a608698655834914631238bee1ba" gracePeriod=135 Oct 14 13:13:41.994832 master-2 kubenswrapper[4762]: I1014 13:13:41.994793 4762 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-2"] Oct 14 13:13:41.995385 master-2 kubenswrapper[4762]: E1014 13:13:41.995120 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79535145b65a4e1d50292e1c2670257a" containerName="kube-apiserver" Oct 14 
13:13:41.995385 master-2 kubenswrapper[4762]: I1014 13:13:41.995141 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="79535145b65a4e1d50292e1c2670257a" containerName="kube-apiserver" Oct 14 13:13:41.995385 master-2 kubenswrapper[4762]: E1014 13:13:41.995186 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79535145b65a4e1d50292e1c2670257a" containerName="kube-apiserver-cert-regeneration-controller" Oct 14 13:13:41.995385 master-2 kubenswrapper[4762]: I1014 13:13:41.995200 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="79535145b65a4e1d50292e1c2670257a" containerName="kube-apiserver-cert-regeneration-controller" Oct 14 13:13:41.995385 master-2 kubenswrapper[4762]: E1014 13:13:41.995213 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79535145b65a4e1d50292e1c2670257a" containerName="kube-apiserver-check-endpoints" Oct 14 13:13:41.995385 master-2 kubenswrapper[4762]: I1014 13:13:41.995254 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="79535145b65a4e1d50292e1c2670257a" containerName="kube-apiserver-check-endpoints" Oct 14 13:13:41.995385 master-2 kubenswrapper[4762]: E1014 13:13:41.995275 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79535145b65a4e1d50292e1c2670257a" containerName="kube-apiserver-cert-syncer" Oct 14 13:13:41.995385 master-2 kubenswrapper[4762]: I1014 13:13:41.995285 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="79535145b65a4e1d50292e1c2670257a" containerName="kube-apiserver-cert-syncer" Oct 14 13:13:41.995385 master-2 kubenswrapper[4762]: E1014 13:13:41.995301 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79535145b65a4e1d50292e1c2670257a" containerName="setup" Oct 14 13:13:41.995385 master-2 kubenswrapper[4762]: I1014 13:13:41.995310 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="79535145b65a4e1d50292e1c2670257a" containerName="setup" Oct 14 13:13:41.995385 master-2 kubenswrapper[4762]: E1014 13:13:41.995328 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79535145b65a4e1d50292e1c2670257a" containerName="kube-apiserver-insecure-readyz" Oct 14 13:13:41.995385 master-2 kubenswrapper[4762]: I1014 13:13:41.995339 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="79535145b65a4e1d50292e1c2670257a" containerName="kube-apiserver-insecure-readyz" Oct 14 13:13:41.996411 master-2 kubenswrapper[4762]: I1014 13:13:41.995459 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="79535145b65a4e1d50292e1c2670257a" containerName="kube-apiserver-check-endpoints" Oct 14 13:13:41.996411 master-2 kubenswrapper[4762]: I1014 13:13:41.995484 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="79535145b65a4e1d50292e1c2670257a" containerName="kube-apiserver-insecure-readyz" Oct 14 13:13:41.996411 master-2 kubenswrapper[4762]: I1014 13:13:41.995500 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="79535145b65a4e1d50292e1c2670257a" containerName="kube-apiserver" Oct 14 13:13:41.996411 master-2 kubenswrapper[4762]: I1014 13:13:41.995512 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="79535145b65a4e1d50292e1c2670257a" containerName="kube-apiserver-cert-syncer" Oct 14 13:13:41.996411 master-2 kubenswrapper[4762]: I1014 13:13:41.995527 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="79535145b65a4e1d50292e1c2670257a" containerName="kube-apiserver-cert-regeneration-controller" Oct 14 13:13:42.057516 master-2 kubenswrapper[4762]: 
I1014 13:13:42.057442 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-guard-master-2"] Oct 14 13:13:42.066177 master-2 kubenswrapper[4762]: W1014 13:13:42.066097 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda145675_d789_46f4_8036_694602b5efd6.slice/crio-527131f5e9eb7dd84405d98235f03e5ac18cbfe32ada96894d4dd7435fb69e1f WatchSource:0}: Error finding container 527131f5e9eb7dd84405d98235f03e5ac18cbfe32ada96894d4dd7435fb69e1f: Status 404 returned error can't find the container with id 527131f5e9eb7dd84405d98235f03e5ac18cbfe32ada96894d4dd7435fb69e1f Oct 14 13:13:42.139299 master-2 kubenswrapper[4762]: I1014 13:13:42.138669 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9041570beb5002e8da158e70e12f0c16-audit-dir\") pod \"kube-apiserver-master-2\" (UID: \"9041570beb5002e8da158e70e12f0c16\") " pod="openshift-kube-apiserver/kube-apiserver-master-2" Oct 14 13:13:42.139299 master-2 kubenswrapper[4762]: I1014 13:13:42.138738 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/9041570beb5002e8da158e70e12f0c16-resource-dir\") pod \"kube-apiserver-master-2\" (UID: \"9041570beb5002e8da158e70e12f0c16\") " pod="openshift-kube-apiserver/kube-apiserver-master-2" Oct 14 13:13:42.139299 master-2 kubenswrapper[4762]: I1014 13:13:42.138768 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/9041570beb5002e8da158e70e12f0c16-cert-dir\") pod \"kube-apiserver-master-2\" (UID: \"9041570beb5002e8da158e70e12f0c16\") " pod="openshift-kube-apiserver/kube-apiserver-master-2" Oct 14 13:13:42.239784 master-2 kubenswrapper[4762]: I1014 13:13:42.239715 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/9041570beb5002e8da158e70e12f0c16-resource-dir\") pod \"kube-apiserver-master-2\" (UID: \"9041570beb5002e8da158e70e12f0c16\") " pod="openshift-kube-apiserver/kube-apiserver-master-2" Oct 14 13:13:42.239948 master-2 kubenswrapper[4762]: I1014 13:13:42.239801 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/9041570beb5002e8da158e70e12f0c16-cert-dir\") pod \"kube-apiserver-master-2\" (UID: \"9041570beb5002e8da158e70e12f0c16\") " pod="openshift-kube-apiserver/kube-apiserver-master-2" Oct 14 13:13:42.239948 master-2 kubenswrapper[4762]: I1014 13:13:42.239910 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9041570beb5002e8da158e70e12f0c16-audit-dir\") pod \"kube-apiserver-master-2\" (UID: \"9041570beb5002e8da158e70e12f0c16\") " pod="openshift-kube-apiserver/kube-apiserver-master-2" Oct 14 13:13:42.239948 master-2 kubenswrapper[4762]: I1014 13:13:42.239919 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/9041570beb5002e8da158e70e12f0c16-resource-dir\") pod \"kube-apiserver-master-2\" (UID: \"9041570beb5002e8da158e70e12f0c16\") " pod="openshift-kube-apiserver/kube-apiserver-master-2" Oct 14 13:13:42.240186 master-2 kubenswrapper[4762]: I1014 13:13:42.240104 
4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9041570beb5002e8da158e70e12f0c16-audit-dir\") pod \"kube-apiserver-master-2\" (UID: \"9041570beb5002e8da158e70e12f0c16\") " pod="openshift-kube-apiserver/kube-apiserver-master-2" Oct 14 13:13:42.240246 master-2 kubenswrapper[4762]: I1014 13:13:42.240108 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/9041570beb5002e8da158e70e12f0c16-cert-dir\") pod \"kube-apiserver-master-2\" (UID: \"9041570beb5002e8da158e70e12f0c16\") " pod="openshift-kube-apiserver/kube-apiserver-master-2" Oct 14 13:13:42.390070 master-2 kubenswrapper[4762]: I1014 13:13:42.389981 4762 generic.go:334] "Generic (PLEG): container finished" podID="e13f3e45-aacc-4bcb-b326-7ea636019144" containerID="471dd3f7e32698b467618a79e5a9208a0a0694b26a61ecac0d985b8a48506bbe" exitCode=0 Oct 14 13:13:42.390356 master-2 kubenswrapper[4762]: I1014 13:13:42.390102 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-2" event={"ID":"e13f3e45-aacc-4bcb-b326-7ea636019144","Type":"ContainerDied","Data":"471dd3f7e32698b467618a79e5a9208a0a0694b26a61ecac0d985b8a48506bbe"} Oct 14 13:13:42.392831 master-2 kubenswrapper[4762]: I1014 13:13:42.392737 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-guard-master-2" event={"ID":"da145675-d789-46f4-8036-694602b5efd6","Type":"ContainerStarted","Data":"81054e9f22eb62a4e63312e4c7c39c57b16eb8b9772cfa6db838e3d78a177f82"} Oct 14 13:13:42.392831 master-2 kubenswrapper[4762]: I1014 13:13:42.392793 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-guard-master-2" event={"ID":"da145675-d789-46f4-8036-694602b5efd6","Type":"ContainerStarted","Data":"527131f5e9eb7dd84405d98235f03e5ac18cbfe32ada96894d4dd7435fb69e1f"} Oct 14 13:13:42.393118 master-2 kubenswrapper[4762]: I1014 13:13:42.393074 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-guard-master-2" Oct 14 13:13:42.398730 master-2 kubenswrapper[4762]: I1014 13:13:42.398690 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-2_79535145b65a4e1d50292e1c2670257a/kube-apiserver-cert-syncer/0.log" Oct 14 13:13:42.398878 master-2 kubenswrapper[4762]: I1014 13:13:42.398723 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-guard-master-2" Oct 14 13:13:42.400052 master-2 kubenswrapper[4762]: I1014 13:13:42.400011 4762 generic.go:334] "Generic (PLEG): container finished" podID="79535145b65a4e1d50292e1c2670257a" containerID="6b10e6673b93c4a192c56727c3cb7534636f9b2a41ce2b6024c0ba9c263d8bdc" exitCode=0 Oct 14 13:13:42.400216 master-2 kubenswrapper[4762]: I1014 13:13:42.400054 4762 generic.go:334] "Generic (PLEG): container finished" podID="79535145b65a4e1d50292e1c2670257a" containerID="d9d2221fab10b9011cadcd36f834cf0835e5aaeb80376ffb7752afdf91447fd8" exitCode=0 Oct 14 13:13:42.400216 master-2 kubenswrapper[4762]: I1014 13:13:42.400079 4762 generic.go:334] "Generic (PLEG): container finished" podID="79535145b65a4e1d50292e1c2670257a" containerID="4a1bfa0a3404825bcf8ebab4cdda7490f674619956b402166636fd113314e8c2" exitCode=0 Oct 14 13:13:42.400216 master-2 kubenswrapper[4762]: I1014 13:13:42.400096 4762 
generic.go:334] "Generic (PLEG): container finished" podID="79535145b65a4e1d50292e1c2670257a" containerID="5ce4974fc0875864b1435afc4e321ca895fa86762184409dc8aa8ab244db1a12" exitCode=2 Oct 14 13:13:42.416340 master-2 kubenswrapper[4762]: I1014 13:13:42.416139 4762 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-2" oldPodUID="79535145b65a4e1d50292e1c2670257a" podUID="9041570beb5002e8da158e70e12f0c16" Oct 14 13:13:42.440202 master-2 kubenswrapper[4762]: I1014 13:13:42.439680 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-guard-master-2" podStartSLOduration=1.439652931 podStartE2EDuration="1.439652931s" podCreationTimestamp="2025-10-14 13:13:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:13:42.438773382 +0000 UTC m=+451.682932571" watchObservedRunningTime="2025-10-14 13:13:42.439652931 +0000 UTC m=+451.683812130" Oct 14 13:13:42.898076 master-2 kubenswrapper[4762]: I1014 13:13:42.897852 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:13:42.898076 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:13:42.898076 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:13:42.898076 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:13:42.898076 master-2 kubenswrapper[4762]: I1014 13:13:42.898021 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:43.800256 master-2 kubenswrapper[4762]: I1014 13:13:43.800194 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-2" Oct 14 13:13:43.860289 master-2 kubenswrapper[4762]: I1014 13:13:43.860240 4762 patch_prober.go:28] interesting pod/kube-apiserver-guard-master-2 container/guard namespace/openshift-kube-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:13:43.860289 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:13:43.860289 master-2 kubenswrapper[4762]: [+]api-openshift-apiserver-available ok Oct 14 13:13:43.860289 master-2 kubenswrapper[4762]: [+]api-openshift-oauth-apiserver-available ok Oct 14 13:13:43.860289 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:13:43.860289 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-api-request-count-filter ok Oct 14 13:13:43.860289 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startkubeinformers ok Oct 14 13:13:43.860289 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Oct 14 13:13:43.860289 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Oct 14 13:13:43.860289 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:13:43.860289 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:13:43.860289 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:13:43.860289 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-consumer ok Oct 14 13:13:43.860289 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-filter ok Oct 14 13:13:43.860289 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:13:43.860289 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiextensions-informers ok Oct 14 13:13:43.860289 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiextensions-controllers ok Oct 14 13:13:43.860289 master-2 kubenswrapper[4762]: [+]poststarthook/crd-informer-synced ok Oct 14 13:13:43.860289 master-2 kubenswrapper[4762]: [+]poststarthook/start-system-namespaces-controller ok Oct 14 13:13:43.860289 master-2 kubenswrapper[4762]: [+]poststarthook/start-cluster-authentication-info-controller ok Oct 14 13:13:43.860289 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Oct 14 13:13:43.860289 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Oct 14 13:13:43.860289 master-2 kubenswrapper[4762]: [+]poststarthook/start-legacy-token-tracking-controller ok Oct 14 13:13:43.860289 master-2 kubenswrapper[4762]: [+]poststarthook/start-service-ip-repair-controllers ok Oct 14 13:13:43.860289 master-2 kubenswrapper[4762]: [+]poststarthook/rbac/bootstrap-roles ok Oct 14 13:13:43.860289 master-2 kubenswrapper[4762]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Oct 14 13:13:43.860289 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-producer ok Oct 14 13:13:43.860289 master-2 kubenswrapper[4762]: [+]poststarthook/bootstrap-controller ok Oct 14 13:13:43.860289 master-2 kubenswrapper[4762]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Oct 14 13:13:43.860289 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-aggregator-informers ok Oct 14 13:13:43.860289 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-status-local-available-controller ok Oct 14 
13:13:43.860289 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-status-remote-available-controller ok Oct 14 13:13:43.860289 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-registration-controller ok Oct 14 13:13:43.860289 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-wait-for-first-sync ok Oct 14 13:13:43.860289 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-discovery-controller ok Oct 14 13:13:43.860289 master-2 kubenswrapper[4762]: [+]poststarthook/kube-apiserver-autoregistration ok Oct 14 13:13:43.860289 master-2 kubenswrapper[4762]: [+]autoregister-completion ok Oct 14 13:13:43.860289 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-openapi-controller ok Oct 14 13:13:43.860289 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-openapiv3-controller ok Oct 14 13:13:43.860289 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:13:43.860289 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:13:43.862221 master-2 kubenswrapper[4762]: I1014 13:13:43.862189 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-guard-master-2" podUID="a1d6199c-769e-4363-8439-75d433c50528" containerName="guard" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:43.897761 master-2 kubenswrapper[4762]: I1014 13:13:43.897693 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:13:43.897761 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:13:43.897761 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:13:43.897761 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:13:43.898138 master-2 kubenswrapper[4762]: I1014 13:13:43.897780 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:43.962556 master-2 kubenswrapper[4762]: I1014 13:13:43.962486 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e13f3e45-aacc-4bcb-b326-7ea636019144-kube-api-access\") pod \"e13f3e45-aacc-4bcb-b326-7ea636019144\" (UID: \"e13f3e45-aacc-4bcb-b326-7ea636019144\") " Oct 14 13:13:43.962972 master-2 kubenswrapper[4762]: I1014 13:13:43.962695 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e13f3e45-aacc-4bcb-b326-7ea636019144-var-lock\") pod \"e13f3e45-aacc-4bcb-b326-7ea636019144\" (UID: \"e13f3e45-aacc-4bcb-b326-7ea636019144\") " Oct 14 13:13:43.962972 master-2 kubenswrapper[4762]: I1014 13:13:43.962775 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e13f3e45-aacc-4bcb-b326-7ea636019144-kubelet-dir\") pod \"e13f3e45-aacc-4bcb-b326-7ea636019144\" (UID: \"e13f3e45-aacc-4bcb-b326-7ea636019144\") " Oct 14 13:13:43.962972 master-2 kubenswrapper[4762]: I1014 13:13:43.962866 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e13f3e45-aacc-4bcb-b326-7ea636019144-var-lock" (OuterVolumeSpecName: "var-lock") pod 
"e13f3e45-aacc-4bcb-b326-7ea636019144" (UID: "e13f3e45-aacc-4bcb-b326-7ea636019144"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:13:43.963483 master-2 kubenswrapper[4762]: I1014 13:13:43.963025 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e13f3e45-aacc-4bcb-b326-7ea636019144-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e13f3e45-aacc-4bcb-b326-7ea636019144" (UID: "e13f3e45-aacc-4bcb-b326-7ea636019144"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:13:43.963483 master-2 kubenswrapper[4762]: I1014 13:13:43.963147 4762 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e13f3e45-aacc-4bcb-b326-7ea636019144-var-lock\") on node \"master-2\" DevicePath \"\"" Oct 14 13:13:43.963483 master-2 kubenswrapper[4762]: I1014 13:13:43.963209 4762 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e13f3e45-aacc-4bcb-b326-7ea636019144-kubelet-dir\") on node \"master-2\" DevicePath \"\"" Oct 14 13:13:43.967135 master-2 kubenswrapper[4762]: I1014 13:13:43.967070 4762 patch_prober.go:28] interesting pod/apiserver-96c4c446c-728v2 container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:13:43.967135 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:13:43.967135 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:13:43.967135 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:13:43.967135 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:13:43.967135 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:13:43.967135 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:13:43.967135 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:13:43.967135 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:13:43.967135 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartUserInformer ok Oct 14 13:13:43.967135 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartOAuthInformer ok Oct 14 13:13:43.967135 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Oct 14 13:13:43.967135 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:13:43.967135 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:13:43.968201 master-2 kubenswrapper[4762]: I1014 13:13:43.967141 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-96c4c446c-728v2" podUID="fdac5df3-de02-49f0-8b90-53464ca0b6dd" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:43.968581 master-2 kubenswrapper[4762]: I1014 13:13:43.968470 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e13f3e45-aacc-4bcb-b326-7ea636019144-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e13f3e45-aacc-4bcb-b326-7ea636019144" (UID: "e13f3e45-aacc-4bcb-b326-7ea636019144"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:13:44.064821 master-2 kubenswrapper[4762]: I1014 13:13:44.064563 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e13f3e45-aacc-4bcb-b326-7ea636019144-kube-api-access\") on node \"master-2\" DevicePath \"\"" Oct 14 13:13:44.416594 master-2 kubenswrapper[4762]: I1014 13:13:44.416530 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-2" event={"ID":"e13f3e45-aacc-4bcb-b326-7ea636019144","Type":"ContainerDied","Data":"92bf8e0a95827d6d3b6a0cd1fa95e008f9fba1c31bf6bb60161fb040e9781fe3"} Oct 14 13:13:44.416762 master-2 kubenswrapper[4762]: I1014 13:13:44.416595 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92bf8e0a95827d6d3b6a0cd1fa95e008f9fba1c31bf6bb60161fb040e9781fe3" Oct 14 13:13:44.416762 master-2 kubenswrapper[4762]: I1014 13:13:44.416601 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-2" Oct 14 13:13:44.753670 master-2 kubenswrapper[4762]: I1014 13:13:44.753595 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-8644c46667-cg62m" Oct 14 13:13:44.874923 master-2 kubenswrapper[4762]: I1014 13:13:44.874824 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c6c635b4-3d81-46f5-8f71-18a213b49c55-audit\") pod \"c6c635b4-3d81-46f5-8f71-18a213b49c55\" (UID: \"c6c635b4-3d81-46f5-8f71-18a213b49c55\") " Oct 14 13:13:44.874923 master-2 kubenswrapper[4762]: I1014 13:13:44.874917 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6c635b4-3d81-46f5-8f71-18a213b49c55-config\") pod \"c6c635b4-3d81-46f5-8f71-18a213b49c55\" (UID: \"c6c635b4-3d81-46f5-8f71-18a213b49c55\") " Oct 14 13:13:44.875659 master-2 kubenswrapper[4762]: I1014 13:13:44.874971 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c6c635b4-3d81-46f5-8f71-18a213b49c55-audit-dir\") pod \"c6c635b4-3d81-46f5-8f71-18a213b49c55\" (UID: \"c6c635b4-3d81-46f5-8f71-18a213b49c55\") " Oct 14 13:13:44.875659 master-2 kubenswrapper[4762]: I1014 13:13:44.875029 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c6c635b4-3d81-46f5-8f71-18a213b49c55-image-import-ca\") pod \"c6c635b4-3d81-46f5-8f71-18a213b49c55\" (UID: \"c6c635b4-3d81-46f5-8f71-18a213b49c55\") " Oct 14 13:13:44.875659 master-2 kubenswrapper[4762]: I1014 13:13:44.875081 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c6c635b4-3d81-46f5-8f71-18a213b49c55-etcd-serving-ca\") pod \"c6c635b4-3d81-46f5-8f71-18a213b49c55\" (UID: \"c6c635b4-3d81-46f5-8f71-18a213b49c55\") " Oct 14 13:13:44.875659 master-2 kubenswrapper[4762]: I1014 13:13:44.875126 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c6c635b4-3d81-46f5-8f71-18a213b49c55-etcd-client\") pod \"c6c635b4-3d81-46f5-8f71-18a213b49c55\" (UID: \"c6c635b4-3d81-46f5-8f71-18a213b49c55\") " Oct 14 13:13:44.875659 master-2 kubenswrapper[4762]: I1014 13:13:44.875227 4762 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6c635b4-3d81-46f5-8f71-18a213b49c55-trusted-ca-bundle\") pod \"c6c635b4-3d81-46f5-8f71-18a213b49c55\" (UID: \"c6c635b4-3d81-46f5-8f71-18a213b49c55\") " Oct 14 13:13:44.875659 master-2 kubenswrapper[4762]: I1014 13:13:44.875282 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhtgk\" (UniqueName: \"kubernetes.io/projected/c6c635b4-3d81-46f5-8f71-18a213b49c55-kube-api-access-mhtgk\") pod \"c6c635b4-3d81-46f5-8f71-18a213b49c55\" (UID: \"c6c635b4-3d81-46f5-8f71-18a213b49c55\") " Oct 14 13:13:44.875659 master-2 kubenswrapper[4762]: I1014 13:13:44.875330 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c6c635b4-3d81-46f5-8f71-18a213b49c55-encryption-config\") pod \"c6c635b4-3d81-46f5-8f71-18a213b49c55\" (UID: \"c6c635b4-3d81-46f5-8f71-18a213b49c55\") " Oct 14 13:13:44.875659 master-2 kubenswrapper[4762]: I1014 13:13:44.875373 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c6c635b4-3d81-46f5-8f71-18a213b49c55-serving-cert\") pod \"c6c635b4-3d81-46f5-8f71-18a213b49c55\" (UID: \"c6c635b4-3d81-46f5-8f71-18a213b49c55\") " Oct 14 13:13:44.875659 master-2 kubenswrapper[4762]: I1014 13:13:44.875431 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c6c635b4-3d81-46f5-8f71-18a213b49c55-node-pullsecrets\") pod \"c6c635b4-3d81-46f5-8f71-18a213b49c55\" (UID: \"c6c635b4-3d81-46f5-8f71-18a213b49c55\") " Oct 14 13:13:44.875659 master-2 kubenswrapper[4762]: I1014 13:13:44.875493 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c6c635b4-3d81-46f5-8f71-18a213b49c55-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "c6c635b4-3d81-46f5-8f71-18a213b49c55" (UID: "c6c635b4-3d81-46f5-8f71-18a213b49c55"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:13:44.875659 master-2 kubenswrapper[4762]: I1014 13:13:44.875589 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6c635b4-3d81-46f5-8f71-18a213b49c55-audit" (OuterVolumeSpecName: "audit") pod "c6c635b4-3d81-46f5-8f71-18a213b49c55" (UID: "c6c635b4-3d81-46f5-8f71-18a213b49c55"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:13:44.875659 master-2 kubenswrapper[4762]: I1014 13:13:44.875592 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c6c635b4-3d81-46f5-8f71-18a213b49c55-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "c6c635b4-3d81-46f5-8f71-18a213b49c55" (UID: "c6c635b4-3d81-46f5-8f71-18a213b49c55"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:13:44.876097 master-2 kubenswrapper[4762]: I1014 13:13:44.875932 4762 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c6c635b4-3d81-46f5-8f71-18a213b49c55-audit\") on node \"master-2\" DevicePath \"\"" Oct 14 13:13:44.876097 master-2 kubenswrapper[4762]: I1014 13:13:44.875975 4762 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c6c635b4-3d81-46f5-8f71-18a213b49c55-audit-dir\") on node \"master-2\" DevicePath \"\"" Oct 14 13:13:44.876097 master-2 kubenswrapper[4762]: I1014 13:13:44.876005 4762 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c6c635b4-3d81-46f5-8f71-18a213b49c55-node-pullsecrets\") on node \"master-2\" DevicePath \"\"" Oct 14 13:13:44.876097 master-2 kubenswrapper[4762]: I1014 13:13:44.875932 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6c635b4-3d81-46f5-8f71-18a213b49c55-config" (OuterVolumeSpecName: "config") pod "c6c635b4-3d81-46f5-8f71-18a213b49c55" (UID: "c6c635b4-3d81-46f5-8f71-18a213b49c55"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:13:44.876868 master-2 kubenswrapper[4762]: I1014 13:13:44.876673 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6c635b4-3d81-46f5-8f71-18a213b49c55-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "c6c635b4-3d81-46f5-8f71-18a213b49c55" (UID: "c6c635b4-3d81-46f5-8f71-18a213b49c55"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:13:44.877219 master-2 kubenswrapper[4762]: I1014 13:13:44.876884 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6c635b4-3d81-46f5-8f71-18a213b49c55-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "c6c635b4-3d81-46f5-8f71-18a213b49c55" (UID: "c6c635b4-3d81-46f5-8f71-18a213b49c55"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:13:44.877545 master-2 kubenswrapper[4762]: I1014 13:13:44.877474 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6c635b4-3d81-46f5-8f71-18a213b49c55-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "c6c635b4-3d81-46f5-8f71-18a213b49c55" (UID: "c6c635b4-3d81-46f5-8f71-18a213b49c55"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:13:44.879901 master-2 kubenswrapper[4762]: I1014 13:13:44.879851 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6c635b4-3d81-46f5-8f71-18a213b49c55-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c6c635b4-3d81-46f5-8f71-18a213b49c55" (UID: "c6c635b4-3d81-46f5-8f71-18a213b49c55"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:13:44.880574 master-2 kubenswrapper[4762]: I1014 13:13:44.880481 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6c635b4-3d81-46f5-8f71-18a213b49c55-kube-api-access-mhtgk" (OuterVolumeSpecName: "kube-api-access-mhtgk") pod "c6c635b4-3d81-46f5-8f71-18a213b49c55" (UID: "c6c635b4-3d81-46f5-8f71-18a213b49c55"). InnerVolumeSpecName "kube-api-access-mhtgk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:13:44.881282 master-2 kubenswrapper[4762]: I1014 13:13:44.881229 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6c635b4-3d81-46f5-8f71-18a213b49c55-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "c6c635b4-3d81-46f5-8f71-18a213b49c55" (UID: "c6c635b4-3d81-46f5-8f71-18a213b49c55"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:13:44.883007 master-2 kubenswrapper[4762]: I1014 13:13:44.882933 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6c635b4-3d81-46f5-8f71-18a213b49c55-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "c6c635b4-3d81-46f5-8f71-18a213b49c55" (UID: "c6c635b4-3d81-46f5-8f71-18a213b49c55"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:13:44.897918 master-2 kubenswrapper[4762]: I1014 13:13:44.897848 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:13:44.897918 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:13:44.897918 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:13:44.897918 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:13:44.898215 master-2 kubenswrapper[4762]: I1014 13:13:44.897951 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:44.977537 master-2 kubenswrapper[4762]: I1014 13:13:44.977345 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhtgk\" (UniqueName: \"kubernetes.io/projected/c6c635b4-3d81-46f5-8f71-18a213b49c55-kube-api-access-mhtgk\") on node \"master-2\" DevicePath \"\"" Oct 14 13:13:44.977537 master-2 kubenswrapper[4762]: I1014 13:13:44.977402 4762 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c6c635b4-3d81-46f5-8f71-18a213b49c55-encryption-config\") on node \"master-2\" DevicePath \"\"" Oct 14 13:13:44.977537 master-2 kubenswrapper[4762]: I1014 13:13:44.977432 4762 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c6c635b4-3d81-46f5-8f71-18a213b49c55-serving-cert\") on node \"master-2\" DevicePath \"\"" Oct 14 13:13:44.977537 master-2 kubenswrapper[4762]: I1014 13:13:44.977456 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6c635b4-3d81-46f5-8f71-18a213b49c55-config\") on node \"master-2\" DevicePath \"\"" Oct 14 13:13:44.977537 master-2 kubenswrapper[4762]: I1014 13:13:44.977478 4762 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c6c635b4-3d81-46f5-8f71-18a213b49c55-image-import-ca\") on node \"master-2\" DevicePath \"\"" Oct 14 13:13:44.977537 master-2 kubenswrapper[4762]: I1014 13:13:44.977502 4762 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c6c635b4-3d81-46f5-8f71-18a213b49c55-etcd-serving-ca\") on node \"master-2\" DevicePath \"\"" Oct 14 
13:13:44.977537 master-2 kubenswrapper[4762]: I1014 13:13:44.977523 4762 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c6c635b4-3d81-46f5-8f71-18a213b49c55-etcd-client\") on node \"master-2\" DevicePath \"\"" Oct 14 13:13:44.977537 master-2 kubenswrapper[4762]: I1014 13:13:44.977544 4762 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6c635b4-3d81-46f5-8f71-18a213b49c55-trusted-ca-bundle\") on node \"master-2\" DevicePath \"\"" Oct 14 13:13:45.427665 master-2 kubenswrapper[4762]: I1014 13:13:45.427609 4762 generic.go:334] "Generic (PLEG): container finished" podID="c6c635b4-3d81-46f5-8f71-18a213b49c55" containerID="5aba9dc8bc563f731a16be5c5ffc8a7c6a6c894b81a9fdd43f997659876c56af" exitCode=0 Oct 14 13:13:45.427665 master-2 kubenswrapper[4762]: I1014 13:13:45.427669 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-8644c46667-cg62m" event={"ID":"c6c635b4-3d81-46f5-8f71-18a213b49c55","Type":"ContainerDied","Data":"5aba9dc8bc563f731a16be5c5ffc8a7c6a6c894b81a9fdd43f997659876c56af"} Oct 14 13:13:45.428125 master-2 kubenswrapper[4762]: I1014 13:13:45.427692 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-8644c46667-cg62m" Oct 14 13:13:45.428125 master-2 kubenswrapper[4762]: I1014 13:13:45.427713 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-8644c46667-cg62m" event={"ID":"c6c635b4-3d81-46f5-8f71-18a213b49c55","Type":"ContainerDied","Data":"39ae4750077f04a8863b9c67437d8ec63dca0f595765405514e0bf82ac1eb173"} Oct 14 13:13:45.428125 master-2 kubenswrapper[4762]: I1014 13:13:45.427737 4762 scope.go:117] "RemoveContainer" containerID="3f4f509d7db8c2114d17e53c43465caae0d96295f9b964e656edd9cd443a5499" Oct 14 13:13:45.446858 master-2 kubenswrapper[4762]: I1014 13:13:45.446814 4762 scope.go:117] "RemoveContainer" containerID="5aba9dc8bc563f731a16be5c5ffc8a7c6a6c894b81a9fdd43f997659876c56af" Oct 14 13:13:45.471356 master-2 kubenswrapper[4762]: I1014 13:13:45.471239 4762 scope.go:117] "RemoveContainer" containerID="23203ddb0a417acd84efc10e7b9f7baea6a844897d4ddd13d93194cd6740acaa" Oct 14 13:13:45.483291 master-2 kubenswrapper[4762]: I1014 13:13:45.483229 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-apiserver/apiserver-8644c46667-cg62m"] Oct 14 13:13:45.489215 master-2 kubenswrapper[4762]: I1014 13:13:45.489124 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-apiserver/apiserver-8644c46667-cg62m"] Oct 14 13:13:45.497695 master-2 kubenswrapper[4762]: I1014 13:13:45.497640 4762 scope.go:117] "RemoveContainer" containerID="3f4f509d7db8c2114d17e53c43465caae0d96295f9b964e656edd9cd443a5499" Oct 14 13:13:45.498346 master-2 kubenswrapper[4762]: E1014 13:13:45.498303 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f4f509d7db8c2114d17e53c43465caae0d96295f9b964e656edd9cd443a5499\": container with ID starting with 3f4f509d7db8c2114d17e53c43465caae0d96295f9b964e656edd9cd443a5499 not found: ID does not exist" containerID="3f4f509d7db8c2114d17e53c43465caae0d96295f9b964e656edd9cd443a5499" Oct 14 13:13:45.498395 master-2 kubenswrapper[4762]: I1014 13:13:45.498361 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f4f509d7db8c2114d17e53c43465caae0d96295f9b964e656edd9cd443a5499"} 
err="failed to get container status \"3f4f509d7db8c2114d17e53c43465caae0d96295f9b964e656edd9cd443a5499\": rpc error: code = NotFound desc = could not find container \"3f4f509d7db8c2114d17e53c43465caae0d96295f9b964e656edd9cd443a5499\": container with ID starting with 3f4f509d7db8c2114d17e53c43465caae0d96295f9b964e656edd9cd443a5499 not found: ID does not exist" Oct 14 13:13:45.498445 master-2 kubenswrapper[4762]: I1014 13:13:45.498395 4762 scope.go:117] "RemoveContainer" containerID="5aba9dc8bc563f731a16be5c5ffc8a7c6a6c894b81a9fdd43f997659876c56af" Oct 14 13:13:45.499006 master-2 kubenswrapper[4762]: E1014 13:13:45.498949 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5aba9dc8bc563f731a16be5c5ffc8a7c6a6c894b81a9fdd43f997659876c56af\": container with ID starting with 5aba9dc8bc563f731a16be5c5ffc8a7c6a6c894b81a9fdd43f997659876c56af not found: ID does not exist" containerID="5aba9dc8bc563f731a16be5c5ffc8a7c6a6c894b81a9fdd43f997659876c56af" Oct 14 13:13:45.499067 master-2 kubenswrapper[4762]: I1014 13:13:45.499008 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5aba9dc8bc563f731a16be5c5ffc8a7c6a6c894b81a9fdd43f997659876c56af"} err="failed to get container status \"5aba9dc8bc563f731a16be5c5ffc8a7c6a6c894b81a9fdd43f997659876c56af\": rpc error: code = NotFound desc = could not find container \"5aba9dc8bc563f731a16be5c5ffc8a7c6a6c894b81a9fdd43f997659876c56af\": container with ID starting with 5aba9dc8bc563f731a16be5c5ffc8a7c6a6c894b81a9fdd43f997659876c56af not found: ID does not exist" Oct 14 13:13:45.499067 master-2 kubenswrapper[4762]: I1014 13:13:45.499043 4762 scope.go:117] "RemoveContainer" containerID="23203ddb0a417acd84efc10e7b9f7baea6a844897d4ddd13d93194cd6740acaa" Oct 14 13:13:45.499501 master-2 kubenswrapper[4762]: E1014 13:13:45.499444 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23203ddb0a417acd84efc10e7b9f7baea6a844897d4ddd13d93194cd6740acaa\": container with ID starting with 23203ddb0a417acd84efc10e7b9f7baea6a844897d4ddd13d93194cd6740acaa not found: ID does not exist" containerID="23203ddb0a417acd84efc10e7b9f7baea6a844897d4ddd13d93194cd6740acaa" Oct 14 13:13:45.499561 master-2 kubenswrapper[4762]: I1014 13:13:45.499498 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23203ddb0a417acd84efc10e7b9f7baea6a844897d4ddd13d93194cd6740acaa"} err="failed to get container status \"23203ddb0a417acd84efc10e7b9f7baea6a844897d4ddd13d93194cd6740acaa\": rpc error: code = NotFound desc = could not find container \"23203ddb0a417acd84efc10e7b9f7baea6a844897d4ddd13d93194cd6740acaa\": container with ID starting with 23203ddb0a417acd84efc10e7b9f7baea6a844897d4ddd13d93194cd6740acaa not found: ID does not exist" Oct 14 13:13:45.560767 master-2 kubenswrapper[4762]: I1014 13:13:45.560648 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6c635b4-3d81-46f5-8f71-18a213b49c55" path="/var/lib/kubelet/pods/c6c635b4-3d81-46f5-8f71-18a213b49c55/volumes" Oct 14 13:13:45.896150 master-2 kubenswrapper[4762]: I1014 13:13:45.895975 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:13:45.896150 master-2 kubenswrapper[4762]: 
[-]has-synced failed: reason withheld Oct 14 13:13:45.896150 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:13:45.896150 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:13:45.896150 master-2 kubenswrapper[4762]: I1014 13:13:45.896045 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:46.670079 master-2 kubenswrapper[4762]: I1014 13:13:46.670009 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-guard-master-2"] Oct 14 13:13:46.750580 master-2 kubenswrapper[4762]: I1014 13:13:46.750500 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-595d5f74d8-ttb94"] Oct 14 13:13:46.750852 master-2 kubenswrapper[4762]: E1014 13:13:46.750704 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6c635b4-3d81-46f5-8f71-18a213b49c55" containerName="fix-audit-permissions" Oct 14 13:13:46.750852 master-2 kubenswrapper[4762]: I1014 13:13:46.750720 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6c635b4-3d81-46f5-8f71-18a213b49c55" containerName="fix-audit-permissions" Oct 14 13:13:46.750852 master-2 kubenswrapper[4762]: E1014 13:13:46.750729 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e13f3e45-aacc-4bcb-b326-7ea636019144" containerName="installer" Oct 14 13:13:46.750852 master-2 kubenswrapper[4762]: I1014 13:13:46.750737 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="e13f3e45-aacc-4bcb-b326-7ea636019144" containerName="installer" Oct 14 13:13:46.750852 master-2 kubenswrapper[4762]: E1014 13:13:46.750748 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6c635b4-3d81-46f5-8f71-18a213b49c55" containerName="openshift-apiserver-check-endpoints" Oct 14 13:13:46.750852 master-2 kubenswrapper[4762]: I1014 13:13:46.750756 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6c635b4-3d81-46f5-8f71-18a213b49c55" containerName="openshift-apiserver-check-endpoints" Oct 14 13:13:46.750852 master-2 kubenswrapper[4762]: E1014 13:13:46.750765 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6c635b4-3d81-46f5-8f71-18a213b49c55" containerName="openshift-apiserver" Oct 14 13:13:46.750852 master-2 kubenswrapper[4762]: I1014 13:13:46.750771 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6c635b4-3d81-46f5-8f71-18a213b49c55" containerName="openshift-apiserver" Oct 14 13:13:46.750852 master-2 kubenswrapper[4762]: I1014 13:13:46.750840 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6c635b4-3d81-46f5-8f71-18a213b49c55" containerName="openshift-apiserver-check-endpoints" Oct 14 13:13:46.750852 master-2 kubenswrapper[4762]: I1014 13:13:46.750853 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="e13f3e45-aacc-4bcb-b326-7ea636019144" containerName="installer" Oct 14 13:13:46.750852 master-2 kubenswrapper[4762]: I1014 13:13:46.750863 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6c635b4-3d81-46f5-8f71-18a213b49c55" containerName="openshift-apiserver" Oct 14 13:13:46.758293 master-2 kubenswrapper[4762]: I1014 13:13:46.758218 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" Oct 14 13:13:46.764146 master-2 kubenswrapper[4762]: I1014 13:13:46.764079 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 14 13:13:46.764344 master-2 kubenswrapper[4762]: I1014 13:13:46.764189 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 14 13:13:46.764540 master-2 kubenswrapper[4762]: I1014 13:13:46.764493 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 14 13:13:46.764699 master-2 kubenswrapper[4762]: I1014 13:13:46.764606 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 14 13:13:46.764973 master-2 kubenswrapper[4762]: I1014 13:13:46.764912 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 14 13:13:46.765332 master-2 kubenswrapper[4762]: I1014 13:13:46.765025 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 14 13:13:46.767065 master-2 kubenswrapper[4762]: I1014 13:13:46.766764 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 14 13:13:46.767065 master-2 kubenswrapper[4762]: I1014 13:13:46.766773 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 14 13:13:46.767065 master-2 kubenswrapper[4762]: I1014 13:13:46.766873 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-95k8q" Oct 14 13:13:46.767346 master-2 kubenswrapper[4762]: I1014 13:13:46.767274 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 14 13:13:46.769194 master-2 kubenswrapper[4762]: I1014 13:13:46.769122 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-595d5f74d8-ttb94"] Oct 14 13:13:46.775303 master-2 kubenswrapper[4762]: I1014 13:13:46.775175 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Oct 14 13:13:46.799774 master-2 kubenswrapper[4762]: I1014 13:13:46.799695 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/32e55f97-d971-46dd-b6b2-cdab1dc766df-etcd-serving-ca\") pod \"apiserver-595d5f74d8-ttb94\" (UID: \"32e55f97-d971-46dd-b6b2-cdab1dc766df\") " pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" Oct 14 13:13:46.799774 master-2 kubenswrapper[4762]: I1014 13:13:46.799765 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/32e55f97-d971-46dd-b6b2-cdab1dc766df-etcd-client\") pod \"apiserver-595d5f74d8-ttb94\" (UID: \"32e55f97-d971-46dd-b6b2-cdab1dc766df\") " pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" Oct 14 13:13:46.800126 master-2 kubenswrapper[4762]: I1014 13:13:46.799808 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32e55f97-d971-46dd-b6b2-cdab1dc766df-serving-cert\") pod \"apiserver-595d5f74d8-ttb94\" (UID: \"32e55f97-d971-46dd-b6b2-cdab1dc766df\") " 
pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" Oct 14 13:13:46.800126 master-2 kubenswrapper[4762]: I1014 13:13:46.799860 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/32e55f97-d971-46dd-b6b2-cdab1dc766df-audit\") pod \"apiserver-595d5f74d8-ttb94\" (UID: \"32e55f97-d971-46dd-b6b2-cdab1dc766df\") " pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" Oct 14 13:13:46.800126 master-2 kubenswrapper[4762]: I1014 13:13:46.799879 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/32e55f97-d971-46dd-b6b2-cdab1dc766df-audit-dir\") pod \"apiserver-595d5f74d8-ttb94\" (UID: \"32e55f97-d971-46dd-b6b2-cdab1dc766df\") " pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" Oct 14 13:13:46.800126 master-2 kubenswrapper[4762]: I1014 13:13:46.800015 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdjv6\" (UniqueName: \"kubernetes.io/projected/32e55f97-d971-46dd-b6b2-cdab1dc766df-kube-api-access-kdjv6\") pod \"apiserver-595d5f74d8-ttb94\" (UID: \"32e55f97-d971-46dd-b6b2-cdab1dc766df\") " pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" Oct 14 13:13:46.800126 master-2 kubenswrapper[4762]: I1014 13:13:46.800057 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/32e55f97-d971-46dd-b6b2-cdab1dc766df-node-pullsecrets\") pod \"apiserver-595d5f74d8-ttb94\" (UID: \"32e55f97-d971-46dd-b6b2-cdab1dc766df\") " pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" Oct 14 13:13:46.800126 master-2 kubenswrapper[4762]: I1014 13:13:46.800083 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32e55f97-d971-46dd-b6b2-cdab1dc766df-config\") pod \"apiserver-595d5f74d8-ttb94\" (UID: \"32e55f97-d971-46dd-b6b2-cdab1dc766df\") " pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" Oct 14 13:13:46.800347 master-2 kubenswrapper[4762]: I1014 13:13:46.800183 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/32e55f97-d971-46dd-b6b2-cdab1dc766df-image-import-ca\") pod \"apiserver-595d5f74d8-ttb94\" (UID: \"32e55f97-d971-46dd-b6b2-cdab1dc766df\") " pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" Oct 14 13:13:46.800347 master-2 kubenswrapper[4762]: I1014 13:13:46.800229 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32e55f97-d971-46dd-b6b2-cdab1dc766df-trusted-ca-bundle\") pod \"apiserver-595d5f74d8-ttb94\" (UID: \"32e55f97-d971-46dd-b6b2-cdab1dc766df\") " pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" Oct 14 13:13:46.800347 master-2 kubenswrapper[4762]: I1014 13:13:46.800273 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/32e55f97-d971-46dd-b6b2-cdab1dc766df-encryption-config\") pod \"apiserver-595d5f74d8-ttb94\" (UID: \"32e55f97-d971-46dd-b6b2-cdab1dc766df\") " pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" Oct 14 13:13:46.898337 master-2 kubenswrapper[4762]: I1014 13:13:46.898233 4762 patch_prober.go:28] 
interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:13:46.898337 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:13:46.898337 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:13:46.898337 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:13:46.899252 master-2 kubenswrapper[4762]: I1014 13:13:46.898353 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:46.901045 master-2 kubenswrapper[4762]: I1014 13:13:46.900968 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/32e55f97-d971-46dd-b6b2-cdab1dc766df-image-import-ca\") pod \"apiserver-595d5f74d8-ttb94\" (UID: \"32e55f97-d971-46dd-b6b2-cdab1dc766df\") " pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" Oct 14 13:13:46.901120 master-2 kubenswrapper[4762]: I1014 13:13:46.901060 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32e55f97-d971-46dd-b6b2-cdab1dc766df-trusted-ca-bundle\") pod \"apiserver-595d5f74d8-ttb94\" (UID: \"32e55f97-d971-46dd-b6b2-cdab1dc766df\") " pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" Oct 14 13:13:46.901120 master-2 kubenswrapper[4762]: I1014 13:13:46.901108 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/32e55f97-d971-46dd-b6b2-cdab1dc766df-encryption-config\") pod \"apiserver-595d5f74d8-ttb94\" (UID: \"32e55f97-d971-46dd-b6b2-cdab1dc766df\") " pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" Oct 14 13:13:46.901249 master-2 kubenswrapper[4762]: I1014 13:13:46.901189 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/32e55f97-d971-46dd-b6b2-cdab1dc766df-etcd-serving-ca\") pod \"apiserver-595d5f74d8-ttb94\" (UID: \"32e55f97-d971-46dd-b6b2-cdab1dc766df\") " pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" Oct 14 13:13:46.901249 master-2 kubenswrapper[4762]: I1014 13:13:46.901226 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/32e55f97-d971-46dd-b6b2-cdab1dc766df-etcd-client\") pod \"apiserver-595d5f74d8-ttb94\" (UID: \"32e55f97-d971-46dd-b6b2-cdab1dc766df\") " pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" Oct 14 13:13:46.901320 master-2 kubenswrapper[4762]: I1014 13:13:46.901268 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32e55f97-d971-46dd-b6b2-cdab1dc766df-serving-cert\") pod \"apiserver-595d5f74d8-ttb94\" (UID: \"32e55f97-d971-46dd-b6b2-cdab1dc766df\") " pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" Oct 14 13:13:46.901354 master-2 kubenswrapper[4762]: I1014 13:13:46.901310 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/32e55f97-d971-46dd-b6b2-cdab1dc766df-audit\") pod \"apiserver-595d5f74d8-ttb94\" (UID: 
\"32e55f97-d971-46dd-b6b2-cdab1dc766df\") " pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" Oct 14 13:13:46.901387 master-2 kubenswrapper[4762]: I1014 13:13:46.901357 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/32e55f97-d971-46dd-b6b2-cdab1dc766df-audit-dir\") pod \"apiserver-595d5f74d8-ttb94\" (UID: \"32e55f97-d971-46dd-b6b2-cdab1dc766df\") " pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" Oct 14 13:13:46.901601 master-2 kubenswrapper[4762]: I1014 13:13:46.901554 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdjv6\" (UniqueName: \"kubernetes.io/projected/32e55f97-d971-46dd-b6b2-cdab1dc766df-kube-api-access-kdjv6\") pod \"apiserver-595d5f74d8-ttb94\" (UID: \"32e55f97-d971-46dd-b6b2-cdab1dc766df\") " pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" Oct 14 13:13:46.901640 master-2 kubenswrapper[4762]: I1014 13:13:46.901612 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/32e55f97-d971-46dd-b6b2-cdab1dc766df-node-pullsecrets\") pod \"apiserver-595d5f74d8-ttb94\" (UID: \"32e55f97-d971-46dd-b6b2-cdab1dc766df\") " pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" Oct 14 13:13:46.901675 master-2 kubenswrapper[4762]: I1014 13:13:46.901649 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32e55f97-d971-46dd-b6b2-cdab1dc766df-config\") pod \"apiserver-595d5f74d8-ttb94\" (UID: \"32e55f97-d971-46dd-b6b2-cdab1dc766df\") " pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" Oct 14 13:13:46.901781 master-2 kubenswrapper[4762]: I1014 13:13:46.901650 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/32e55f97-d971-46dd-b6b2-cdab1dc766df-audit-dir\") pod \"apiserver-595d5f74d8-ttb94\" (UID: \"32e55f97-d971-46dd-b6b2-cdab1dc766df\") " pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" Oct 14 13:13:46.902194 master-2 kubenswrapper[4762]: I1014 13:13:46.902113 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/32e55f97-d971-46dd-b6b2-cdab1dc766df-image-import-ca\") pod \"apiserver-595d5f74d8-ttb94\" (UID: \"32e55f97-d971-46dd-b6b2-cdab1dc766df\") " pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" Oct 14 13:13:46.902522 master-2 kubenswrapper[4762]: I1014 13:13:46.902474 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/32e55f97-d971-46dd-b6b2-cdab1dc766df-node-pullsecrets\") pod \"apiserver-595d5f74d8-ttb94\" (UID: \"32e55f97-d971-46dd-b6b2-cdab1dc766df\") " pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" Oct 14 13:13:46.902696 master-2 kubenswrapper[4762]: I1014 13:13:46.902622 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/32e55f97-d971-46dd-b6b2-cdab1dc766df-etcd-serving-ca\") pod \"apiserver-595d5f74d8-ttb94\" (UID: \"32e55f97-d971-46dd-b6b2-cdab1dc766df\") " pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" Oct 14 13:13:46.903339 master-2 kubenswrapper[4762]: I1014 13:13:46.903288 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/32e55f97-d971-46dd-b6b2-cdab1dc766df-audit\") 
pod \"apiserver-595d5f74d8-ttb94\" (UID: \"32e55f97-d971-46dd-b6b2-cdab1dc766df\") " pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" Oct 14 13:13:46.903556 master-2 kubenswrapper[4762]: I1014 13:13:46.903511 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32e55f97-d971-46dd-b6b2-cdab1dc766df-config\") pod \"apiserver-595d5f74d8-ttb94\" (UID: \"32e55f97-d971-46dd-b6b2-cdab1dc766df\") " pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" Oct 14 13:13:46.903871 master-2 kubenswrapper[4762]: I1014 13:13:46.903815 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32e55f97-d971-46dd-b6b2-cdab1dc766df-trusted-ca-bundle\") pod \"apiserver-595d5f74d8-ttb94\" (UID: \"32e55f97-d971-46dd-b6b2-cdab1dc766df\") " pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" Oct 14 13:13:46.904936 master-2 kubenswrapper[4762]: I1014 13:13:46.904887 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/32e55f97-d971-46dd-b6b2-cdab1dc766df-etcd-client\") pod \"apiserver-595d5f74d8-ttb94\" (UID: \"32e55f97-d971-46dd-b6b2-cdab1dc766df\") " pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" Oct 14 13:13:46.906717 master-2 kubenswrapper[4762]: I1014 13:13:46.906662 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/32e55f97-d971-46dd-b6b2-cdab1dc766df-encryption-config\") pod \"apiserver-595d5f74d8-ttb94\" (UID: \"32e55f97-d971-46dd-b6b2-cdab1dc766df\") " pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" Oct 14 13:13:46.906995 master-2 kubenswrapper[4762]: I1014 13:13:46.906939 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32e55f97-d971-46dd-b6b2-cdab1dc766df-serving-cert\") pod \"apiserver-595d5f74d8-ttb94\" (UID: \"32e55f97-d971-46dd-b6b2-cdab1dc766df\") " pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" Oct 14 13:13:46.924775 master-2 kubenswrapper[4762]: I1014 13:13:46.924585 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdjv6\" (UniqueName: \"kubernetes.io/projected/32e55f97-d971-46dd-b6b2-cdab1dc766df-kube-api-access-kdjv6\") pod \"apiserver-595d5f74d8-ttb94\" (UID: \"32e55f97-d971-46dd-b6b2-cdab1dc766df\") " pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" Oct 14 13:13:47.079475 master-2 kubenswrapper[4762]: I1014 13:13:47.079397 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" Oct 14 13:13:47.543474 master-2 kubenswrapper[4762]: I1014 13:13:47.543416 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-595d5f74d8-ttb94"] Oct 14 13:13:47.549204 master-2 kubenswrapper[4762]: W1014 13:13:47.549131 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32e55f97_d971_46dd_b6b2_cdab1dc766df.slice/crio-62be16fcf39ee2382a9e4c402c51b828b8370c4a3b58b7817b8a36cb87501988 WatchSource:0}: Error finding container 62be16fcf39ee2382a9e4c402c51b828b8370c4a3b58b7817b8a36cb87501988: Status 404 returned error can't find the container with id 62be16fcf39ee2382a9e4c402c51b828b8370c4a3b58b7817b8a36cb87501988 Oct 14 13:13:47.898384 master-2 kubenswrapper[4762]: I1014 13:13:47.898312 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:13:47.898384 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:13:47.898384 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:13:47.898384 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:13:47.899603 master-2 kubenswrapper[4762]: I1014 13:13:47.898429 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:48.453520 master-2 kubenswrapper[4762]: I1014 13:13:48.453449 4762 generic.go:334] "Generic (PLEG): container finished" podID="32e55f97-d971-46dd-b6b2-cdab1dc766df" containerID="38275d8be284d8541f9671a3e61d6f7cd701a29cf8a0e5b5642bff4e6f23d6c1" exitCode=0 Oct 14 13:13:48.453841 master-2 kubenswrapper[4762]: I1014 13:13:48.453543 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" event={"ID":"32e55f97-d971-46dd-b6b2-cdab1dc766df","Type":"ContainerDied","Data":"38275d8be284d8541f9671a3e61d6f7cd701a29cf8a0e5b5642bff4e6f23d6c1"} Oct 14 13:13:48.453841 master-2 kubenswrapper[4762]: I1014 13:13:48.453625 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" event={"ID":"32e55f97-d971-46dd-b6b2-cdab1dc766df","Type":"ContainerStarted","Data":"62be16fcf39ee2382a9e4c402c51b828b8370c4a3b58b7817b8a36cb87501988"} Oct 14 13:13:48.858415 master-2 kubenswrapper[4762]: I1014 13:13:48.858350 4762 patch_prober.go:28] interesting pod/kube-apiserver-guard-master-2 container/guard namespace/openshift-kube-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:13:48.858415 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:13:48.858415 master-2 kubenswrapper[4762]: [+]api-openshift-apiserver-available ok Oct 14 13:13:48.858415 master-2 kubenswrapper[4762]: [+]api-openshift-oauth-apiserver-available ok Oct 14 13:13:48.858415 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:13:48.858415 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-api-request-count-filter ok Oct 14 13:13:48.858415 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startkubeinformers ok Oct 14 13:13:48.858415 master-2 kubenswrapper[4762]: 
[+]poststarthook/openshift.io-openshift-apiserver-reachable ok Oct 14 13:13:48.858415 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Oct 14 13:13:48.858415 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:13:48.858415 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:13:48.858415 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:13:48.858415 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-consumer ok Oct 14 13:13:48.858415 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-filter ok Oct 14 13:13:48.858415 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:13:48.858415 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiextensions-informers ok Oct 14 13:13:48.858415 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiextensions-controllers ok Oct 14 13:13:48.858415 master-2 kubenswrapper[4762]: [+]poststarthook/crd-informer-synced ok Oct 14 13:13:48.858415 master-2 kubenswrapper[4762]: [+]poststarthook/start-system-namespaces-controller ok Oct 14 13:13:48.858415 master-2 kubenswrapper[4762]: [+]poststarthook/start-cluster-authentication-info-controller ok Oct 14 13:13:48.858415 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Oct 14 13:13:48.858415 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Oct 14 13:13:48.858415 master-2 kubenswrapper[4762]: [+]poststarthook/start-legacy-token-tracking-controller ok Oct 14 13:13:48.858415 master-2 kubenswrapper[4762]: [+]poststarthook/start-service-ip-repair-controllers ok Oct 14 13:13:48.858415 master-2 kubenswrapper[4762]: [+]poststarthook/rbac/bootstrap-roles ok Oct 14 13:13:48.858415 master-2 kubenswrapper[4762]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Oct 14 13:13:48.858415 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-producer ok Oct 14 13:13:48.858415 master-2 kubenswrapper[4762]: [+]poststarthook/bootstrap-controller ok Oct 14 13:13:48.858415 master-2 kubenswrapper[4762]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Oct 14 13:13:48.858415 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-aggregator-informers ok Oct 14 13:13:48.858415 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-status-local-available-controller ok Oct 14 13:13:48.858415 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-status-remote-available-controller ok Oct 14 13:13:48.858415 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-registration-controller ok Oct 14 13:13:48.858415 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-wait-for-first-sync ok Oct 14 13:13:48.858415 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-discovery-controller ok Oct 14 13:13:48.858415 master-2 kubenswrapper[4762]: [+]poststarthook/kube-apiserver-autoregistration ok Oct 14 13:13:48.858415 master-2 kubenswrapper[4762]: [+]autoregister-completion ok Oct 14 13:13:48.858415 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-openapi-controller ok Oct 14 13:13:48.858415 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-openapiv3-controller ok Oct 14 13:13:48.858415 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:13:48.858415 master-2 kubenswrapper[4762]: readyz 
check failed Oct 14 13:13:48.859671 master-2 kubenswrapper[4762]: I1014 13:13:48.858437 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-guard-master-2" podUID="a1d6199c-769e-4363-8439-75d433c50528" containerName="guard" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:48.899609 master-2 kubenswrapper[4762]: I1014 13:13:48.899553 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:13:48.899609 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:13:48.899609 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:13:48.899609 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:13:48.899609 master-2 kubenswrapper[4762]: I1014 13:13:48.899610 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:48.967191 master-2 kubenswrapper[4762]: I1014 13:13:48.967114 4762 patch_prober.go:28] interesting pod/apiserver-96c4c446c-728v2 container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:13:48.967191 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:13:48.967191 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:13:48.967191 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:13:48.967191 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:13:48.967191 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:13:48.967191 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:13:48.967191 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:13:48.967191 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:13:48.967191 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartUserInformer ok Oct 14 13:13:48.967191 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartOAuthInformer ok Oct 14 13:13:48.967191 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Oct 14 13:13:48.967191 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:13:48.967191 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:13:48.967780 master-2 kubenswrapper[4762]: I1014 13:13:48.967224 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-96c4c446c-728v2" podUID="fdac5df3-de02-49f0-8b90-53464ca0b6dd" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:49.460184 master-2 kubenswrapper[4762]: I1014 13:13:49.460133 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" event={"ID":"32e55f97-d971-46dd-b6b2-cdab1dc766df","Type":"ContainerStarted","Data":"475c7fe94c7689d429a499bdcf69b6cc227826fdedac1115b9592805f384a109"} Oct 14 13:13:49.460184 master-2 kubenswrapper[4762]: I1014 13:13:49.460185 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" event={"ID":"32e55f97-d971-46dd-b6b2-cdab1dc766df","Type":"ContainerStarted","Data":"eb59a26421ed95409972305df3c5daa73b20d2bede001f3c9ed71c3c125f3dc5"} Oct 14 13:13:49.897645 master-2 kubenswrapper[4762]: I1014 13:13:49.897499 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:13:49.897645 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:13:49.897645 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:13:49.897645 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:13:49.897645 master-2 kubenswrapper[4762]: I1014 13:13:49.897575 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:50.898707 master-2 kubenswrapper[4762]: I1014 13:13:50.898602 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:13:50.898707 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:13:50.898707 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:13:50.898707 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:13:50.899881 master-2 kubenswrapper[4762]: I1014 13:13:50.898737 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:51.898008 master-2 kubenswrapper[4762]: I1014 13:13:51.897951 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:13:51.898008 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:13:51.898008 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:13:51.898008 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:13:51.898776 master-2 kubenswrapper[4762]: I1014 13:13:51.898730 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:52.080075 master-2 kubenswrapper[4762]: I1014 13:13:52.079957 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" Oct 14 13:13:52.080075 master-2 kubenswrapper[4762]: I1014 13:13:52.080041 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" Oct 14 13:13:52.091052 master-2 kubenswrapper[4762]: I1014 13:13:52.090949 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" Oct 14 13:13:52.123072 master-2 kubenswrapper[4762]: I1014 13:13:52.122897 4762 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" podStartSLOduration=59.122816933 podStartE2EDuration="59.122816933s" podCreationTimestamp="2025-10-14 13:12:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:13:49.494624987 +0000 UTC m=+458.738784166" watchObservedRunningTime="2025-10-14 13:13:52.122816933 +0000 UTC m=+461.366976162" Oct 14 13:13:52.486377 master-2 kubenswrapper[4762]: I1014 13:13:52.486316 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" Oct 14 13:13:52.897945 master-2 kubenswrapper[4762]: I1014 13:13:52.897774 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:13:52.897945 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:13:52.897945 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:13:52.897945 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:13:52.897945 master-2 kubenswrapper[4762]: I1014 13:13:52.897858 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:53.861612 master-2 kubenswrapper[4762]: I1014 13:13:53.861532 4762 patch_prober.go:28] interesting pod/kube-apiserver-guard-master-2 container/guard namespace/openshift-kube-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:13:53.861612 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:13:53.861612 master-2 kubenswrapper[4762]: [+]api-openshift-apiserver-available ok Oct 14 13:13:53.861612 master-2 kubenswrapper[4762]: [+]api-openshift-oauth-apiserver-available ok Oct 14 13:13:53.861612 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:13:53.861612 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-api-request-count-filter ok Oct 14 13:13:53.861612 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startkubeinformers ok Oct 14 13:13:53.861612 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Oct 14 13:13:53.861612 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Oct 14 13:13:53.861612 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:13:53.861612 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:13:53.861612 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:13:53.861612 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-consumer ok Oct 14 13:13:53.861612 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-filter ok Oct 14 13:13:53.861612 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:13:53.861612 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiextensions-informers ok Oct 14 13:13:53.861612 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiextensions-controllers ok Oct 14 13:13:53.861612 
master-2 kubenswrapper[4762]: [+]poststarthook/crd-informer-synced ok Oct 14 13:13:53.861612 master-2 kubenswrapper[4762]: [+]poststarthook/start-system-namespaces-controller ok Oct 14 13:13:53.861612 master-2 kubenswrapper[4762]: [+]poststarthook/start-cluster-authentication-info-controller ok Oct 14 13:13:53.861612 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Oct 14 13:13:53.861612 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Oct 14 13:13:53.861612 master-2 kubenswrapper[4762]: [+]poststarthook/start-legacy-token-tracking-controller ok Oct 14 13:13:53.861612 master-2 kubenswrapper[4762]: [+]poststarthook/start-service-ip-repair-controllers ok Oct 14 13:13:53.861612 master-2 kubenswrapper[4762]: [+]poststarthook/rbac/bootstrap-roles ok Oct 14 13:13:53.861612 master-2 kubenswrapper[4762]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Oct 14 13:13:53.861612 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-producer ok Oct 14 13:13:53.861612 master-2 kubenswrapper[4762]: [+]poststarthook/bootstrap-controller ok Oct 14 13:13:53.861612 master-2 kubenswrapper[4762]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Oct 14 13:13:53.861612 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-aggregator-informers ok Oct 14 13:13:53.861612 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-status-local-available-controller ok Oct 14 13:13:53.861612 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-status-remote-available-controller ok Oct 14 13:13:53.861612 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-registration-controller ok Oct 14 13:13:53.861612 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-wait-for-first-sync ok Oct 14 13:13:53.861612 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-discovery-controller ok Oct 14 13:13:53.861612 master-2 kubenswrapper[4762]: [+]poststarthook/kube-apiserver-autoregistration ok Oct 14 13:13:53.861612 master-2 kubenswrapper[4762]: [+]autoregister-completion ok Oct 14 13:13:53.861612 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-openapi-controller ok Oct 14 13:13:53.861612 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-openapiv3-controller ok Oct 14 13:13:53.861612 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:13:53.861612 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:13:53.865892 master-2 kubenswrapper[4762]: I1014 13:13:53.861614 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-guard-master-2" podUID="a1d6199c-769e-4363-8439-75d433c50528" containerName="guard" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:53.865892 master-2 kubenswrapper[4762]: I1014 13:13:53.861724 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-guard-master-2" Oct 14 13:13:53.867907 master-2 kubenswrapper[4762]: I1014 13:13:53.867849 4762 patch_prober.go:28] interesting pod/kube-apiserver-guard-master-2 container/guard namespace/openshift-kube-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:13:53.867907 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:13:53.867907 master-2 kubenswrapper[4762]: [+]api-openshift-apiserver-available ok Oct 14 13:13:53.867907 master-2 kubenswrapper[4762]: 
[+]api-openshift-oauth-apiserver-available ok Oct 14 13:13:53.867907 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:13:53.867907 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-api-request-count-filter ok Oct 14 13:13:53.867907 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startkubeinformers ok Oct 14 13:13:53.867907 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Oct 14 13:13:53.867907 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Oct 14 13:13:53.867907 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:13:53.867907 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:13:53.867907 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:13:53.867907 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-consumer ok Oct 14 13:13:53.867907 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-filter ok Oct 14 13:13:53.867907 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:13:53.867907 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiextensions-informers ok Oct 14 13:13:53.867907 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiextensions-controllers ok Oct 14 13:13:53.867907 master-2 kubenswrapper[4762]: [+]poststarthook/crd-informer-synced ok Oct 14 13:13:53.867907 master-2 kubenswrapper[4762]: [+]poststarthook/start-system-namespaces-controller ok Oct 14 13:13:53.867907 master-2 kubenswrapper[4762]: [+]poststarthook/start-cluster-authentication-info-controller ok Oct 14 13:13:53.867907 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Oct 14 13:13:53.867907 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Oct 14 13:13:53.867907 master-2 kubenswrapper[4762]: [+]poststarthook/start-legacy-token-tracking-controller ok Oct 14 13:13:53.867907 master-2 kubenswrapper[4762]: [+]poststarthook/start-service-ip-repair-controllers ok Oct 14 13:13:53.867907 master-2 kubenswrapper[4762]: [+]poststarthook/rbac/bootstrap-roles ok Oct 14 13:13:53.867907 master-2 kubenswrapper[4762]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Oct 14 13:13:53.867907 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-producer ok Oct 14 13:13:53.867907 master-2 kubenswrapper[4762]: [+]poststarthook/bootstrap-controller ok Oct 14 13:13:53.867907 master-2 kubenswrapper[4762]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Oct 14 13:13:53.867907 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-aggregator-informers ok Oct 14 13:13:53.867907 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-status-local-available-controller ok Oct 14 13:13:53.867907 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-status-remote-available-controller ok Oct 14 13:13:53.867907 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-registration-controller ok Oct 14 13:13:53.867907 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-wait-for-first-sync ok Oct 14 13:13:53.867907 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-discovery-controller ok Oct 14 13:13:53.867907 master-2 kubenswrapper[4762]: [+]poststarthook/kube-apiserver-autoregistration ok Oct 14 13:13:53.867907 master-2 kubenswrapper[4762]: 
[+]autoregister-completion ok Oct 14 13:13:53.867907 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-openapi-controller ok Oct 14 13:13:53.867907 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-openapiv3-controller ok Oct 14 13:13:53.867907 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:13:53.867907 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:13:53.869255 master-2 kubenswrapper[4762]: I1014 13:13:53.867929 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-guard-master-2" podUID="a1d6199c-769e-4363-8439-75d433c50528" containerName="guard" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:53.897268 master-2 kubenswrapper[4762]: I1014 13:13:53.897210 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:13:53.897268 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:13:53.897268 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:13:53.897268 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:13:53.897527 master-2 kubenswrapper[4762]: I1014 13:13:53.897286 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:53.969217 master-2 kubenswrapper[4762]: I1014 13:13:53.969110 4762 patch_prober.go:28] interesting pod/apiserver-96c4c446c-728v2 container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:13:53.969217 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:13:53.969217 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:13:53.969217 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:13:53.969217 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:13:53.969217 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:13:53.969217 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:13:53.969217 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:13:53.969217 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:13:53.969217 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartUserInformer ok Oct 14 13:13:53.969217 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartOAuthInformer ok Oct 14 13:13:53.969217 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Oct 14 13:13:53.969217 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:13:53.969217 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:13:53.969217 master-2 kubenswrapper[4762]: I1014 13:13:53.969210 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-96c4c446c-728v2" podUID="fdac5df3-de02-49f0-8b90-53464ca0b6dd" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:54.897854 master-2 kubenswrapper[4762]: I1014 13:13:54.897781 4762 
patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:13:54.897854 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:13:54.897854 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:13:54.897854 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:13:54.898910 master-2 kubenswrapper[4762]: I1014 13:13:54.897877 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:55.897447 master-2 kubenswrapper[4762]: I1014 13:13:55.897380 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:13:55.897447 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:13:55.897447 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:13:55.897447 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:13:55.897857 master-2 kubenswrapper[4762]: I1014 13:13:55.897479 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:56.897709 master-2 kubenswrapper[4762]: I1014 13:13:56.897579 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:13:56.897709 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:13:56.897709 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:13:56.897709 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:13:56.898735 master-2 kubenswrapper[4762]: I1014 13:13:56.897708 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:57.896908 master-2 kubenswrapper[4762]: I1014 13:13:57.896851 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:13:57.896908 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:13:57.896908 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:13:57.896908 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:13:57.896908 master-2 kubenswrapper[4762]: I1014 13:13:57.896935 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:58.857118 master-2 kubenswrapper[4762]: I1014 13:13:58.857052 4762 
patch_prober.go:28] interesting pod/kube-apiserver-guard-master-2 container/guard namespace/openshift-kube-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:13:58.857118 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:13:58.857118 master-2 kubenswrapper[4762]: [+]api-openshift-apiserver-available ok Oct 14 13:13:58.857118 master-2 kubenswrapper[4762]: [+]api-openshift-oauth-apiserver-available ok Oct 14 13:13:58.857118 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:13:58.857118 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-api-request-count-filter ok Oct 14 13:13:58.857118 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startkubeinformers ok Oct 14 13:13:58.857118 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Oct 14 13:13:58.857118 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Oct 14 13:13:58.857118 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:13:58.857118 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:13:58.857118 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:13:58.857118 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-consumer ok Oct 14 13:13:58.857118 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-filter ok Oct 14 13:13:58.857118 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:13:58.857118 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiextensions-informers ok Oct 14 13:13:58.857118 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiextensions-controllers ok Oct 14 13:13:58.857118 master-2 kubenswrapper[4762]: [+]poststarthook/crd-informer-synced ok Oct 14 13:13:58.857118 master-2 kubenswrapper[4762]: [+]poststarthook/start-system-namespaces-controller ok Oct 14 13:13:58.857118 master-2 kubenswrapper[4762]: [+]poststarthook/start-cluster-authentication-info-controller ok Oct 14 13:13:58.857118 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Oct 14 13:13:58.857118 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Oct 14 13:13:58.857118 master-2 kubenswrapper[4762]: [+]poststarthook/start-legacy-token-tracking-controller ok Oct 14 13:13:58.857118 master-2 kubenswrapper[4762]: [+]poststarthook/start-service-ip-repair-controllers ok Oct 14 13:13:58.857118 master-2 kubenswrapper[4762]: [+]poststarthook/rbac/bootstrap-roles ok Oct 14 13:13:58.857118 master-2 kubenswrapper[4762]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Oct 14 13:13:58.857118 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-producer ok Oct 14 13:13:58.857118 master-2 kubenswrapper[4762]: [+]poststarthook/bootstrap-controller ok Oct 14 13:13:58.857118 master-2 kubenswrapper[4762]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Oct 14 13:13:58.857118 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-aggregator-informers ok Oct 14 13:13:58.857118 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-status-local-available-controller ok Oct 14 13:13:58.857118 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-status-remote-available-controller ok Oct 14 13:13:58.857118 master-2 
kubenswrapper[4762]: [+]poststarthook/apiservice-registration-controller ok Oct 14 13:13:58.857118 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-wait-for-first-sync ok Oct 14 13:13:58.857118 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-discovery-controller ok Oct 14 13:13:58.857118 master-2 kubenswrapper[4762]: [+]poststarthook/kube-apiserver-autoregistration ok Oct 14 13:13:58.857118 master-2 kubenswrapper[4762]: [+]autoregister-completion ok Oct 14 13:13:58.857118 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-openapi-controller ok Oct 14 13:13:58.857118 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-openapiv3-controller ok Oct 14 13:13:58.857118 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:13:58.857118 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:13:58.857118 master-2 kubenswrapper[4762]: I1014 13:13:58.857110 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-guard-master-2" podUID="a1d6199c-769e-4363-8439-75d433c50528" containerName="guard" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:58.896901 master-2 kubenswrapper[4762]: I1014 13:13:58.896800 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:13:58.896901 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:13:58.896901 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:13:58.896901 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:13:58.896901 master-2 kubenswrapper[4762]: I1014 13:13:58.896863 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:58.966104 master-2 kubenswrapper[4762]: I1014 13:13:58.965998 4762 patch_prober.go:28] interesting pod/apiserver-96c4c446c-728v2 container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:13:58.966104 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:13:58.966104 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:13:58.966104 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:13:58.966104 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:13:58.966104 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:13:58.966104 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:13:58.966104 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:13:58.966104 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:13:58.966104 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartUserInformer ok Oct 14 13:13:58.966104 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartOAuthInformer ok Oct 14 13:13:58.966104 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Oct 14 13:13:58.966104 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:13:58.966104 master-2 kubenswrapper[4762]: readyz 
check failed Oct 14 13:13:58.966104 master-2 kubenswrapper[4762]: I1014 13:13:58.966055 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-96c4c446c-728v2" podUID="fdac5df3-de02-49f0-8b90-53464ca0b6dd" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:13:59.898581 master-2 kubenswrapper[4762]: I1014 13:13:59.898468 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:13:59.898581 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:13:59.898581 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:13:59.898581 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:13:59.898581 master-2 kubenswrapper[4762]: I1014 13:13:59.898567 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:14:00.898014 master-2 kubenswrapper[4762]: I1014 13:14:00.897794 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:14:00.898014 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:14:00.898014 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:14:00.898014 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:14:00.898014 master-2 kubenswrapper[4762]: I1014 13:14:00.897866 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:14:01.897303 master-2 kubenswrapper[4762]: I1014 13:14:01.897228 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:14:01.897303 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:14:01.897303 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:14:01.897303 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:14:01.897965 master-2 kubenswrapper[4762]: I1014 13:14:01.897322 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:14:02.898785 master-2 kubenswrapper[4762]: I1014 13:14:02.898643 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:14:02.898785 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:14:02.898785 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:14:02.898785 master-2 
kubenswrapper[4762]: healthz check failed Oct 14 13:14:02.898785 master-2 kubenswrapper[4762]: I1014 13:14:02.898772 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:14:03.860823 master-2 kubenswrapper[4762]: I1014 13:14:03.860736 4762 patch_prober.go:28] interesting pod/kube-apiserver-guard-master-2 container/guard namespace/openshift-kube-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:14:03.860823 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:14:03.860823 master-2 kubenswrapper[4762]: [+]api-openshift-apiserver-available ok Oct 14 13:14:03.860823 master-2 kubenswrapper[4762]: [+]api-openshift-oauth-apiserver-available ok Oct 14 13:14:03.860823 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:14:03.860823 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-api-request-count-filter ok Oct 14 13:14:03.860823 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startkubeinformers ok Oct 14 13:14:03.860823 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Oct 14 13:14:03.860823 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Oct 14 13:14:03.860823 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:14:03.860823 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:14:03.860823 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:14:03.860823 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-consumer ok Oct 14 13:14:03.860823 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-filter ok Oct 14 13:14:03.860823 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:14:03.860823 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiextensions-informers ok Oct 14 13:14:03.860823 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiextensions-controllers ok Oct 14 13:14:03.860823 master-2 kubenswrapper[4762]: [+]poststarthook/crd-informer-synced ok Oct 14 13:14:03.860823 master-2 kubenswrapper[4762]: [+]poststarthook/start-system-namespaces-controller ok Oct 14 13:14:03.860823 master-2 kubenswrapper[4762]: [+]poststarthook/start-cluster-authentication-info-controller ok Oct 14 13:14:03.860823 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Oct 14 13:14:03.860823 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Oct 14 13:14:03.860823 master-2 kubenswrapper[4762]: [+]poststarthook/start-legacy-token-tracking-controller ok Oct 14 13:14:03.860823 master-2 kubenswrapper[4762]: [+]poststarthook/start-service-ip-repair-controllers ok Oct 14 13:14:03.860823 master-2 kubenswrapper[4762]: [+]poststarthook/rbac/bootstrap-roles ok Oct 14 13:14:03.860823 master-2 kubenswrapper[4762]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Oct 14 13:14:03.860823 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-producer ok Oct 14 13:14:03.860823 master-2 kubenswrapper[4762]: [+]poststarthook/bootstrap-controller ok Oct 14 13:14:03.860823 master-2 kubenswrapper[4762]: 
[+]poststarthook/aggregator-reload-proxy-client-cert ok Oct 14 13:14:03.860823 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-aggregator-informers ok Oct 14 13:14:03.860823 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-status-local-available-controller ok Oct 14 13:14:03.860823 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-status-remote-available-controller ok Oct 14 13:14:03.860823 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-registration-controller ok Oct 14 13:14:03.860823 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-wait-for-first-sync ok Oct 14 13:14:03.860823 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-discovery-controller ok Oct 14 13:14:03.860823 master-2 kubenswrapper[4762]: [+]poststarthook/kube-apiserver-autoregistration ok Oct 14 13:14:03.860823 master-2 kubenswrapper[4762]: [+]autoregister-completion ok Oct 14 13:14:03.860823 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-openapi-controller ok Oct 14 13:14:03.860823 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-openapiv3-controller ok Oct 14 13:14:03.860823 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:14:03.860823 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:14:03.860823 master-2 kubenswrapper[4762]: I1014 13:14:03.860810 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-guard-master-2" podUID="a1d6199c-769e-4363-8439-75d433c50528" containerName="guard" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:14:03.897349 master-2 kubenswrapper[4762]: I1014 13:14:03.897247 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:14:03.897349 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:14:03.897349 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:14:03.897349 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:14:03.897349 master-2 kubenswrapper[4762]: I1014 13:14:03.897327 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:14:03.897816 master-2 kubenswrapper[4762]: I1014 13:14:03.897369 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5ddb89f76-887cs" Oct 14 13:14:03.897862 master-2 kubenswrapper[4762]: I1014 13:14:03.897845 4762 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="router" containerStatusID={"Type":"cri-o","ID":"0a0bc0bcb6877389fc825b824eda24c17af5655791500c8ef2590eb00f894909"} pod="openshift-ingress/router-default-5ddb89f76-887cs" containerMessage="Container router failed startup probe, will be restarted" Oct 14 13:14:03.897939 master-2 kubenswrapper[4762]: I1014 13:14:03.897880 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" containerID="cri-o://0a0bc0bcb6877389fc825b824eda24c17af5655791500c8ef2590eb00f894909" gracePeriod=3600 Oct 14 13:14:03.967631 master-2 kubenswrapper[4762]: I1014 13:14:03.967519 
4762 patch_prober.go:28] interesting pod/apiserver-96c4c446c-728v2 container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:14:03.967631 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:14:03.967631 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:14:03.967631 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:14:03.967631 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:14:03.967631 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:14:03.967631 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:14:03.967631 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:14:03.967631 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:14:03.967631 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartUserInformer ok Oct 14 13:14:03.967631 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartOAuthInformer ok Oct 14 13:14:03.967631 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Oct 14 13:14:03.967631 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:14:03.967631 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:14:03.968732 master-2 kubenswrapper[4762]: I1014 13:14:03.967646 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-96c4c446c-728v2" podUID="fdac5df3-de02-49f0-8b90-53464ca0b6dd" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:14:08.859884 master-2 kubenswrapper[4762]: I1014 13:14:08.859817 4762 patch_prober.go:28] interesting pod/kube-apiserver-guard-master-2 container/guard namespace/openshift-kube-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:14:08.859884 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:14:08.859884 master-2 kubenswrapper[4762]: [+]api-openshift-apiserver-available ok Oct 14 13:14:08.859884 master-2 kubenswrapper[4762]: [+]api-openshift-oauth-apiserver-available ok Oct 14 13:14:08.859884 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:14:08.859884 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-api-request-count-filter ok Oct 14 13:14:08.859884 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startkubeinformers ok Oct 14 13:14:08.859884 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Oct 14 13:14:08.859884 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Oct 14 13:14:08.859884 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:14:08.859884 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:14:08.859884 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:14:08.859884 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-consumer ok Oct 14 13:14:08.859884 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-filter ok Oct 14 13:14:08.859884 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:14:08.859884 master-2 kubenswrapper[4762]: 
[+]poststarthook/start-apiextensions-informers ok Oct 14 13:14:08.859884 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiextensions-controllers ok Oct 14 13:14:08.859884 master-2 kubenswrapper[4762]: [+]poststarthook/crd-informer-synced ok Oct 14 13:14:08.859884 master-2 kubenswrapper[4762]: [+]poststarthook/start-system-namespaces-controller ok Oct 14 13:14:08.859884 master-2 kubenswrapper[4762]: [+]poststarthook/start-cluster-authentication-info-controller ok Oct 14 13:14:08.859884 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Oct 14 13:14:08.859884 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Oct 14 13:14:08.859884 master-2 kubenswrapper[4762]: [+]poststarthook/start-legacy-token-tracking-controller ok Oct 14 13:14:08.859884 master-2 kubenswrapper[4762]: [+]poststarthook/start-service-ip-repair-controllers ok Oct 14 13:14:08.859884 master-2 kubenswrapper[4762]: [+]poststarthook/rbac/bootstrap-roles ok Oct 14 13:14:08.859884 master-2 kubenswrapper[4762]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Oct 14 13:14:08.859884 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-producer ok Oct 14 13:14:08.859884 master-2 kubenswrapper[4762]: [+]poststarthook/bootstrap-controller ok Oct 14 13:14:08.859884 master-2 kubenswrapper[4762]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Oct 14 13:14:08.859884 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-aggregator-informers ok Oct 14 13:14:08.859884 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-status-local-available-controller ok Oct 14 13:14:08.859884 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-status-remote-available-controller ok Oct 14 13:14:08.859884 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-registration-controller ok Oct 14 13:14:08.859884 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-wait-for-first-sync ok Oct 14 13:14:08.859884 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-discovery-controller ok Oct 14 13:14:08.859884 master-2 kubenswrapper[4762]: [+]poststarthook/kube-apiserver-autoregistration ok Oct 14 13:14:08.859884 master-2 kubenswrapper[4762]: [+]autoregister-completion ok Oct 14 13:14:08.859884 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-openapi-controller ok Oct 14 13:14:08.859884 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-openapiv3-controller ok Oct 14 13:14:08.859884 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:14:08.859884 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:14:08.861608 master-2 kubenswrapper[4762]: I1014 13:14:08.859934 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-guard-master-2" podUID="a1d6199c-769e-4363-8439-75d433c50528" containerName="guard" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:14:08.967183 master-2 kubenswrapper[4762]: I1014 13:14:08.967063 4762 patch_prober.go:28] interesting pod/apiserver-96c4c446c-728v2 container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:14:08.967183 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:14:08.967183 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:14:08.967183 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 
13:14:08.967183 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:14:08.967183 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:14:08.967183 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:14:08.967183 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:14:08.967183 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:14:08.967183 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartUserInformer ok Oct 14 13:14:08.967183 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartOAuthInformer ok Oct 14 13:14:08.967183 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Oct 14 13:14:08.967183 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:14:08.967183 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:14:08.968086 master-2 kubenswrapper[4762]: I1014 13:14:08.967206 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-96c4c446c-728v2" podUID="fdac5df3-de02-49f0-8b90-53464ca0b6dd" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:14:10.109285 master-2 kubenswrapper[4762]: I1014 13:14:10.109211 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_machine-config-server-zxkbj_063758c3-98fe-4ac4-b1c2-beef7ef6dcdc/machine-config-server/0.log" Oct 14 13:14:13.860399 master-2 kubenswrapper[4762]: I1014 13:14:13.860293 4762 patch_prober.go:28] interesting pod/kube-apiserver-guard-master-2 container/guard namespace/openshift-kube-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:14:13.860399 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:14:13.860399 master-2 kubenswrapper[4762]: [+]api-openshift-apiserver-available ok Oct 14 13:14:13.860399 master-2 kubenswrapper[4762]: [+]api-openshift-oauth-apiserver-available ok Oct 14 13:14:13.860399 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:14:13.860399 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-api-request-count-filter ok Oct 14 13:14:13.860399 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startkubeinformers ok Oct 14 13:14:13.860399 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Oct 14 13:14:13.860399 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Oct 14 13:14:13.860399 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:14:13.860399 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:14:13.860399 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:14:13.860399 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-consumer ok Oct 14 13:14:13.860399 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-filter ok Oct 14 13:14:13.860399 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:14:13.860399 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiextensions-informers ok Oct 14 13:14:13.860399 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiextensions-controllers ok Oct 14 13:14:13.860399 master-2 
kubenswrapper[4762]: [+]poststarthook/crd-informer-synced ok Oct 14 13:14:13.860399 master-2 kubenswrapper[4762]: [+]poststarthook/start-system-namespaces-controller ok Oct 14 13:14:13.860399 master-2 kubenswrapper[4762]: [+]poststarthook/start-cluster-authentication-info-controller ok Oct 14 13:14:13.860399 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Oct 14 13:14:13.860399 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Oct 14 13:14:13.860399 master-2 kubenswrapper[4762]: [+]poststarthook/start-legacy-token-tracking-controller ok Oct 14 13:14:13.860399 master-2 kubenswrapper[4762]: [+]poststarthook/start-service-ip-repair-controllers ok Oct 14 13:14:13.860399 master-2 kubenswrapper[4762]: [+]poststarthook/rbac/bootstrap-roles ok Oct 14 13:14:13.860399 master-2 kubenswrapper[4762]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Oct 14 13:14:13.860399 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-producer ok Oct 14 13:14:13.860399 master-2 kubenswrapper[4762]: [+]poststarthook/bootstrap-controller ok Oct 14 13:14:13.860399 master-2 kubenswrapper[4762]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Oct 14 13:14:13.860399 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-aggregator-informers ok Oct 14 13:14:13.860399 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-status-local-available-controller ok Oct 14 13:14:13.860399 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-status-remote-available-controller ok Oct 14 13:14:13.860399 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-registration-controller ok Oct 14 13:14:13.860399 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-wait-for-first-sync ok Oct 14 13:14:13.860399 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-discovery-controller ok Oct 14 13:14:13.860399 master-2 kubenswrapper[4762]: [+]poststarthook/kube-apiserver-autoregistration ok Oct 14 13:14:13.860399 master-2 kubenswrapper[4762]: [+]autoregister-completion ok Oct 14 13:14:13.860399 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-openapi-controller ok Oct 14 13:14:13.860399 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-openapiv3-controller ok Oct 14 13:14:13.860399 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:14:13.860399 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:14:13.862079 master-2 kubenswrapper[4762]: I1014 13:14:13.860416 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-guard-master-2" podUID="a1d6199c-769e-4363-8439-75d433c50528" containerName="guard" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:14:13.968132 master-2 kubenswrapper[4762]: I1014 13:14:13.968042 4762 patch_prober.go:28] interesting pod/apiserver-96c4c446c-728v2 container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:14:13.968132 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:14:13.968132 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:14:13.968132 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:14:13.968132 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:14:13.968132 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 
13:14:13.968132 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:14:13.968132 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:14:13.968132 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:14:13.968132 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartUserInformer ok Oct 14 13:14:13.968132 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartOAuthInformer ok Oct 14 13:14:13.968132 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Oct 14 13:14:13.968132 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:14:13.968132 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:14:13.968846 master-2 kubenswrapper[4762]: I1014 13:14:13.968132 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-96c4c446c-728v2" podUID="fdac5df3-de02-49f0-8b90-53464ca0b6dd" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:14:17.165202 master-2 kubenswrapper[4762]: I1014 13:14:17.165082 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-96c4c446c-728v2" Oct 14 13:14:17.214748 master-2 kubenswrapper[4762]: I1014 13:14:17.214582 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7b6784d654-l7lmp"] Oct 14 13:14:17.215241 master-2 kubenswrapper[4762]: E1014 13:14:17.215138 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdac5df3-de02-49f0-8b90-53464ca0b6dd" containerName="oauth-apiserver" Oct 14 13:14:17.215241 master-2 kubenswrapper[4762]: I1014 13:14:17.215211 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdac5df3-de02-49f0-8b90-53464ca0b6dd" containerName="oauth-apiserver" Oct 14 13:14:17.215398 master-2 kubenswrapper[4762]: E1014 13:14:17.215307 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdac5df3-de02-49f0-8b90-53464ca0b6dd" containerName="fix-audit-permissions" Oct 14 13:14:17.215398 master-2 kubenswrapper[4762]: I1014 13:14:17.215326 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdac5df3-de02-49f0-8b90-53464ca0b6dd" containerName="fix-audit-permissions" Oct 14 13:14:17.215720 master-2 kubenswrapper[4762]: I1014 13:14:17.215626 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdac5df3-de02-49f0-8b90-53464ca0b6dd" containerName="oauth-apiserver" Oct 14 13:14:17.217088 master-2 kubenswrapper[4762]: I1014 13:14:17.217045 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7b6784d654-l7lmp" Oct 14 13:14:17.220856 master-2 kubenswrapper[4762]: I1014 13:14:17.220803 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-8gpjk" Oct 14 13:14:17.229929 master-2 kubenswrapper[4762]: I1014 13:14:17.229869 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7b6784d654-l7lmp"] Oct 14 13:14:17.291320 master-2 kubenswrapper[4762]: I1014 13:14:17.291186 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fdac5df3-de02-49f0-8b90-53464ca0b6dd-trusted-ca-bundle\") pod \"fdac5df3-de02-49f0-8b90-53464ca0b6dd\" (UID: \"fdac5df3-de02-49f0-8b90-53464ca0b6dd\") " Oct 14 13:14:17.291755 master-2 kubenswrapper[4762]: I1014 13:14:17.291682 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/fdac5df3-de02-49f0-8b90-53464ca0b6dd-etcd-serving-ca\") pod \"fdac5df3-de02-49f0-8b90-53464ca0b6dd\" (UID: \"fdac5df3-de02-49f0-8b90-53464ca0b6dd\") " Oct 14 13:14:17.291841 master-2 kubenswrapper[4762]: I1014 13:14:17.291760 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fdac5df3-de02-49f0-8b90-53464ca0b6dd-audit-dir\") pod \"fdac5df3-de02-49f0-8b90-53464ca0b6dd\" (UID: \"fdac5df3-de02-49f0-8b90-53464ca0b6dd\") " Oct 14 13:14:17.291841 master-2 kubenswrapper[4762]: I1014 13:14:17.291781 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdac5df3-de02-49f0-8b90-53464ca0b6dd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "fdac5df3-de02-49f0-8b90-53464ca0b6dd" (UID: "fdac5df3-de02-49f0-8b90-53464ca0b6dd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:14:17.291841 master-2 kubenswrapper[4762]: I1014 13:14:17.291804 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fdac5df3-de02-49f0-8b90-53464ca0b6dd-serving-cert\") pod \"fdac5df3-de02-49f0-8b90-53464ca0b6dd\" (UID: \"fdac5df3-de02-49f0-8b90-53464ca0b6dd\") " Oct 14 13:14:17.292053 master-2 kubenswrapper[4762]: I1014 13:14:17.291922 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fdac5df3-de02-49f0-8b90-53464ca0b6dd-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "fdac5df3-de02-49f0-8b90-53464ca0b6dd" (UID: "fdac5df3-de02-49f0-8b90-53464ca0b6dd"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:14:17.292053 master-2 kubenswrapper[4762]: I1014 13:14:17.291935 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ws22v\" (UniqueName: \"kubernetes.io/projected/fdac5df3-de02-49f0-8b90-53464ca0b6dd-kube-api-access-ws22v\") pod \"fdac5df3-de02-49f0-8b90-53464ca0b6dd\" (UID: \"fdac5df3-de02-49f0-8b90-53464ca0b6dd\") " Oct 14 13:14:17.292053 master-2 kubenswrapper[4762]: I1014 13:14:17.292033 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/fdac5df3-de02-49f0-8b90-53464ca0b6dd-etcd-client\") pod \"fdac5df3-de02-49f0-8b90-53464ca0b6dd\" (UID: \"fdac5df3-de02-49f0-8b90-53464ca0b6dd\") " Oct 14 13:14:17.292299 master-2 kubenswrapper[4762]: I1014 13:14:17.292065 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/fdac5df3-de02-49f0-8b90-53464ca0b6dd-encryption-config\") pod \"fdac5df3-de02-49f0-8b90-53464ca0b6dd\" (UID: \"fdac5df3-de02-49f0-8b90-53464ca0b6dd\") " Oct 14 13:14:17.292299 master-2 kubenswrapper[4762]: I1014 13:14:17.292115 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fdac5df3-de02-49f0-8b90-53464ca0b6dd-audit-policies\") pod \"fdac5df3-de02-49f0-8b90-53464ca0b6dd\" (UID: \"fdac5df3-de02-49f0-8b90-53464ca0b6dd\") " Oct 14 13:14:17.292429 master-2 kubenswrapper[4762]: I1014 13:14:17.292346 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/31803cc5-bd42-4bb2-8872-79acd1f79d5b-audit-dir\") pod \"apiserver-7b6784d654-l7lmp\" (UID: \"31803cc5-bd42-4bb2-8872-79acd1f79d5b\") " pod="openshift-oauth-apiserver/apiserver-7b6784d654-l7lmp" Oct 14 13:14:17.292429 master-2 kubenswrapper[4762]: I1014 13:14:17.292392 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/31803cc5-bd42-4bb2-8872-79acd1f79d5b-etcd-client\") pod \"apiserver-7b6784d654-l7lmp\" (UID: \"31803cc5-bd42-4bb2-8872-79acd1f79d5b\") " pod="openshift-oauth-apiserver/apiserver-7b6784d654-l7lmp" Oct 14 13:14:17.292557 master-2 kubenswrapper[4762]: I1014 13:14:17.292465 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31803cc5-bd42-4bb2-8872-79acd1f79d5b-trusted-ca-bundle\") pod \"apiserver-7b6784d654-l7lmp\" (UID: \"31803cc5-bd42-4bb2-8872-79acd1f79d5b\") " pod="openshift-oauth-apiserver/apiserver-7b6784d654-l7lmp" Oct 14 13:14:17.292557 master-2 kubenswrapper[4762]: I1014 13:14:17.292545 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/31803cc5-bd42-4bb2-8872-79acd1f79d5b-encryption-config\") pod \"apiserver-7b6784d654-l7lmp\" (UID: \"31803cc5-bd42-4bb2-8872-79acd1f79d5b\") " pod="openshift-oauth-apiserver/apiserver-7b6784d654-l7lmp" Oct 14 13:14:17.292678 master-2 kubenswrapper[4762]: I1014 13:14:17.292577 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/31803cc5-bd42-4bb2-8872-79acd1f79d5b-serving-cert\") pod 
\"apiserver-7b6784d654-l7lmp\" (UID: \"31803cc5-bd42-4bb2-8872-79acd1f79d5b\") " pod="openshift-oauth-apiserver/apiserver-7b6784d654-l7lmp" Oct 14 13:14:17.292678 master-2 kubenswrapper[4762]: I1014 13:14:17.292608 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rvb5\" (UniqueName: \"kubernetes.io/projected/31803cc5-bd42-4bb2-8872-79acd1f79d5b-kube-api-access-2rvb5\") pod \"apiserver-7b6784d654-l7lmp\" (UID: \"31803cc5-bd42-4bb2-8872-79acd1f79d5b\") " pod="openshift-oauth-apiserver/apiserver-7b6784d654-l7lmp" Oct 14 13:14:17.292678 master-2 kubenswrapper[4762]: I1014 13:14:17.292651 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/31803cc5-bd42-4bb2-8872-79acd1f79d5b-audit-policies\") pod \"apiserver-7b6784d654-l7lmp\" (UID: \"31803cc5-bd42-4bb2-8872-79acd1f79d5b\") " pod="openshift-oauth-apiserver/apiserver-7b6784d654-l7lmp" Oct 14 13:14:17.292870 master-2 kubenswrapper[4762]: I1014 13:14:17.292835 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/31803cc5-bd42-4bb2-8872-79acd1f79d5b-etcd-serving-ca\") pod \"apiserver-7b6784d654-l7lmp\" (UID: \"31803cc5-bd42-4bb2-8872-79acd1f79d5b\") " pod="openshift-oauth-apiserver/apiserver-7b6784d654-l7lmp" Oct 14 13:14:17.292974 master-2 kubenswrapper[4762]: I1014 13:14:17.292937 4762 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fdac5df3-de02-49f0-8b90-53464ca0b6dd-audit-dir\") on node \"master-2\" DevicePath \"\"" Oct 14 13:14:17.293061 master-2 kubenswrapper[4762]: I1014 13:14:17.292975 4762 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fdac5df3-de02-49f0-8b90-53464ca0b6dd-trusted-ca-bundle\") on node \"master-2\" DevicePath \"\"" Oct 14 13:14:17.293061 master-2 kubenswrapper[4762]: I1014 13:14:17.293028 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdac5df3-de02-49f0-8b90-53464ca0b6dd-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "fdac5df3-de02-49f0-8b90-53464ca0b6dd" (UID: "fdac5df3-de02-49f0-8b90-53464ca0b6dd"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:14:17.293317 master-2 kubenswrapper[4762]: I1014 13:14:17.293179 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdac5df3-de02-49f0-8b90-53464ca0b6dd-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "fdac5df3-de02-49f0-8b90-53464ca0b6dd" (UID: "fdac5df3-de02-49f0-8b90-53464ca0b6dd"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:14:17.296041 master-2 kubenswrapper[4762]: I1014 13:14:17.295945 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdac5df3-de02-49f0-8b90-53464ca0b6dd-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "fdac5df3-de02-49f0-8b90-53464ca0b6dd" (UID: "fdac5df3-de02-49f0-8b90-53464ca0b6dd"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:14:17.296179 master-2 kubenswrapper[4762]: I1014 13:14:17.296121 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdac5df3-de02-49f0-8b90-53464ca0b6dd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "fdac5df3-de02-49f0-8b90-53464ca0b6dd" (UID: "fdac5df3-de02-49f0-8b90-53464ca0b6dd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:14:17.296450 master-2 kubenswrapper[4762]: I1014 13:14:17.296366 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdac5df3-de02-49f0-8b90-53464ca0b6dd-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "fdac5df3-de02-49f0-8b90-53464ca0b6dd" (UID: "fdac5df3-de02-49f0-8b90-53464ca0b6dd"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:14:17.297655 master-2 kubenswrapper[4762]: I1014 13:14:17.297595 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdac5df3-de02-49f0-8b90-53464ca0b6dd-kube-api-access-ws22v" (OuterVolumeSpecName: "kube-api-access-ws22v") pod "fdac5df3-de02-49f0-8b90-53464ca0b6dd" (UID: "fdac5df3-de02-49f0-8b90-53464ca0b6dd"). InnerVolumeSpecName "kube-api-access-ws22v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:14:17.393724 master-2 kubenswrapper[4762]: I1014 13:14:17.393627 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/31803cc5-bd42-4bb2-8872-79acd1f79d5b-serving-cert\") pod \"apiserver-7b6784d654-l7lmp\" (UID: \"31803cc5-bd42-4bb2-8872-79acd1f79d5b\") " pod="openshift-oauth-apiserver/apiserver-7b6784d654-l7lmp" Oct 14 13:14:17.394010 master-2 kubenswrapper[4762]: I1014 13:14:17.393755 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rvb5\" (UniqueName: \"kubernetes.io/projected/31803cc5-bd42-4bb2-8872-79acd1f79d5b-kube-api-access-2rvb5\") pod \"apiserver-7b6784d654-l7lmp\" (UID: \"31803cc5-bd42-4bb2-8872-79acd1f79d5b\") " pod="openshift-oauth-apiserver/apiserver-7b6784d654-l7lmp" Oct 14 13:14:17.394010 master-2 kubenswrapper[4762]: I1014 13:14:17.393813 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/31803cc5-bd42-4bb2-8872-79acd1f79d5b-audit-policies\") pod \"apiserver-7b6784d654-l7lmp\" (UID: \"31803cc5-bd42-4bb2-8872-79acd1f79d5b\") " pod="openshift-oauth-apiserver/apiserver-7b6784d654-l7lmp" Oct 14 13:14:17.394010 master-2 kubenswrapper[4762]: I1014 13:14:17.393895 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/31803cc5-bd42-4bb2-8872-79acd1f79d5b-etcd-serving-ca\") pod \"apiserver-7b6784d654-l7lmp\" (UID: \"31803cc5-bd42-4bb2-8872-79acd1f79d5b\") " pod="openshift-oauth-apiserver/apiserver-7b6784d654-l7lmp" Oct 14 13:14:17.394010 master-2 kubenswrapper[4762]: I1014 13:14:17.393967 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/31803cc5-bd42-4bb2-8872-79acd1f79d5b-audit-dir\") pod \"apiserver-7b6784d654-l7lmp\" (UID: \"31803cc5-bd42-4bb2-8872-79acd1f79d5b\") " pod="openshift-oauth-apiserver/apiserver-7b6784d654-l7lmp" Oct 14 13:14:17.394328 master-2 kubenswrapper[4762]: I1014 13:14:17.394030 4762 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/31803cc5-bd42-4bb2-8872-79acd1f79d5b-etcd-client\") pod \"apiserver-7b6784d654-l7lmp\" (UID: \"31803cc5-bd42-4bb2-8872-79acd1f79d5b\") " pod="openshift-oauth-apiserver/apiserver-7b6784d654-l7lmp" Oct 14 13:14:17.394328 master-2 kubenswrapper[4762]: I1014 13:14:17.394239 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/31803cc5-bd42-4bb2-8872-79acd1f79d5b-audit-dir\") pod \"apiserver-7b6784d654-l7lmp\" (UID: \"31803cc5-bd42-4bb2-8872-79acd1f79d5b\") " pod="openshift-oauth-apiserver/apiserver-7b6784d654-l7lmp" Oct 14 13:14:17.394853 master-2 kubenswrapper[4762]: I1014 13:14:17.394803 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31803cc5-bd42-4bb2-8872-79acd1f79d5b-trusted-ca-bundle\") pod \"apiserver-7b6784d654-l7lmp\" (UID: \"31803cc5-bd42-4bb2-8872-79acd1f79d5b\") " pod="openshift-oauth-apiserver/apiserver-7b6784d654-l7lmp" Oct 14 13:14:17.394935 master-2 kubenswrapper[4762]: I1014 13:14:17.394884 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/31803cc5-bd42-4bb2-8872-79acd1f79d5b-encryption-config\") pod \"apiserver-7b6784d654-l7lmp\" (UID: \"31803cc5-bd42-4bb2-8872-79acd1f79d5b\") " pod="openshift-oauth-apiserver/apiserver-7b6784d654-l7lmp" Oct 14 13:14:17.395003 master-2 kubenswrapper[4762]: I1014 13:14:17.394964 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ws22v\" (UniqueName: \"kubernetes.io/projected/fdac5df3-de02-49f0-8b90-53464ca0b6dd-kube-api-access-ws22v\") on node \"master-2\" DevicePath \"\"" Oct 14 13:14:17.395003 master-2 kubenswrapper[4762]: I1014 13:14:17.394989 4762 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/fdac5df3-de02-49f0-8b90-53464ca0b6dd-etcd-client\") on node \"master-2\" DevicePath \"\"" Oct 14 13:14:17.395185 master-2 kubenswrapper[4762]: I1014 13:14:17.395008 4762 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/fdac5df3-de02-49f0-8b90-53464ca0b6dd-encryption-config\") on node \"master-2\" DevicePath \"\"" Oct 14 13:14:17.395185 master-2 kubenswrapper[4762]: I1014 13:14:17.395066 4762 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fdac5df3-de02-49f0-8b90-53464ca0b6dd-audit-policies\") on node \"master-2\" DevicePath \"\"" Oct 14 13:14:17.395185 master-2 kubenswrapper[4762]: I1014 13:14:17.395085 4762 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/fdac5df3-de02-49f0-8b90-53464ca0b6dd-etcd-serving-ca\") on node \"master-2\" DevicePath \"\"" Oct 14 13:14:17.395185 master-2 kubenswrapper[4762]: I1014 13:14:17.395102 4762 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fdac5df3-de02-49f0-8b90-53464ca0b6dd-serving-cert\") on node \"master-2\" DevicePath \"\"" Oct 14 13:14:17.395580 master-2 kubenswrapper[4762]: I1014 13:14:17.395511 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/31803cc5-bd42-4bb2-8872-79acd1f79d5b-audit-policies\") pod 
\"apiserver-7b6784d654-l7lmp\" (UID: \"31803cc5-bd42-4bb2-8872-79acd1f79d5b\") " pod="openshift-oauth-apiserver/apiserver-7b6784d654-l7lmp" Oct 14 13:14:17.395724 master-2 kubenswrapper[4762]: I1014 13:14:17.395656 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31803cc5-bd42-4bb2-8872-79acd1f79d5b-trusted-ca-bundle\") pod \"apiserver-7b6784d654-l7lmp\" (UID: \"31803cc5-bd42-4bb2-8872-79acd1f79d5b\") " pod="openshift-oauth-apiserver/apiserver-7b6784d654-l7lmp" Oct 14 13:14:17.395799 master-2 kubenswrapper[4762]: I1014 13:14:17.395663 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/31803cc5-bd42-4bb2-8872-79acd1f79d5b-etcd-serving-ca\") pod \"apiserver-7b6784d654-l7lmp\" (UID: \"31803cc5-bd42-4bb2-8872-79acd1f79d5b\") " pod="openshift-oauth-apiserver/apiserver-7b6784d654-l7lmp" Oct 14 13:14:17.398142 master-2 kubenswrapper[4762]: I1014 13:14:17.398076 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/31803cc5-bd42-4bb2-8872-79acd1f79d5b-serving-cert\") pod \"apiserver-7b6784d654-l7lmp\" (UID: \"31803cc5-bd42-4bb2-8872-79acd1f79d5b\") " pod="openshift-oauth-apiserver/apiserver-7b6784d654-l7lmp" Oct 14 13:14:17.399133 master-2 kubenswrapper[4762]: I1014 13:14:17.398997 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/31803cc5-bd42-4bb2-8872-79acd1f79d5b-encryption-config\") pod \"apiserver-7b6784d654-l7lmp\" (UID: \"31803cc5-bd42-4bb2-8872-79acd1f79d5b\") " pod="openshift-oauth-apiserver/apiserver-7b6784d654-l7lmp" Oct 14 13:14:17.400776 master-2 kubenswrapper[4762]: I1014 13:14:17.400726 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/31803cc5-bd42-4bb2-8872-79acd1f79d5b-etcd-client\") pod \"apiserver-7b6784d654-l7lmp\" (UID: \"31803cc5-bd42-4bb2-8872-79acd1f79d5b\") " pod="openshift-oauth-apiserver/apiserver-7b6784d654-l7lmp" Oct 14 13:14:17.418901 master-2 kubenswrapper[4762]: I1014 13:14:17.418755 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rvb5\" (UniqueName: \"kubernetes.io/projected/31803cc5-bd42-4bb2-8872-79acd1f79d5b-kube-api-access-2rvb5\") pod \"apiserver-7b6784d654-l7lmp\" (UID: \"31803cc5-bd42-4bb2-8872-79acd1f79d5b\") " pod="openshift-oauth-apiserver/apiserver-7b6784d654-l7lmp" Oct 14 13:14:17.535585 master-2 kubenswrapper[4762]: I1014 13:14:17.535483 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7b6784d654-l7lmp" Oct 14 13:14:17.631279 master-2 kubenswrapper[4762]: I1014 13:14:17.631200 4762 generic.go:334] "Generic (PLEG): container finished" podID="fdac5df3-de02-49f0-8b90-53464ca0b6dd" containerID="2a29595ee8fa59a1d5c479b5f3ce38436d4d22c41f1bcc1270e6b5ac70185385" exitCode=0 Oct 14 13:14:17.631279 master-2 kubenswrapper[4762]: I1014 13:14:17.631261 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-96c4c446c-728v2" event={"ID":"fdac5df3-de02-49f0-8b90-53464ca0b6dd","Type":"ContainerDied","Data":"2a29595ee8fa59a1d5c479b5f3ce38436d4d22c41f1bcc1270e6b5ac70185385"} Oct 14 13:14:17.631593 master-2 kubenswrapper[4762]: I1014 13:14:17.631299 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-96c4c446c-728v2" event={"ID":"fdac5df3-de02-49f0-8b90-53464ca0b6dd","Type":"ContainerDied","Data":"0b85ff8b7ef111fd93a34eb70a4acc4cd875cffaccda2f7bde3cde5efda1c05e"} Oct 14 13:14:17.631593 master-2 kubenswrapper[4762]: I1014 13:14:17.631321 4762 scope.go:117] "RemoveContainer" containerID="2a29595ee8fa59a1d5c479b5f3ce38436d4d22c41f1bcc1270e6b5ac70185385" Oct 14 13:14:17.631593 master-2 kubenswrapper[4762]: I1014 13:14:17.631312 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-96c4c446c-728v2" Oct 14 13:14:17.654339 master-2 kubenswrapper[4762]: I1014 13:14:17.654292 4762 scope.go:117] "RemoveContainer" containerID="cf7f4e3c23f7b821d08fbac74a5681f0fc43971c789058587f9ad8f4d64abc5b" Oct 14 13:14:17.661561 master-2 kubenswrapper[4762]: I1014 13:14:17.661515 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-oauth-apiserver/apiserver-96c4c446c-728v2"] Oct 14 13:14:17.668810 master-2 kubenswrapper[4762]: I1014 13:14:17.668731 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-oauth-apiserver/apiserver-96c4c446c-728v2"] Oct 14 13:14:17.675944 master-2 kubenswrapper[4762]: I1014 13:14:17.675867 4762 scope.go:117] "RemoveContainer" containerID="2a29595ee8fa59a1d5c479b5f3ce38436d4d22c41f1bcc1270e6b5ac70185385" Oct 14 13:14:17.676448 master-2 kubenswrapper[4762]: E1014 13:14:17.676399 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a29595ee8fa59a1d5c479b5f3ce38436d4d22c41f1bcc1270e6b5ac70185385\": container with ID starting with 2a29595ee8fa59a1d5c479b5f3ce38436d4d22c41f1bcc1270e6b5ac70185385 not found: ID does not exist" containerID="2a29595ee8fa59a1d5c479b5f3ce38436d4d22c41f1bcc1270e6b5ac70185385" Oct 14 13:14:17.676550 master-2 kubenswrapper[4762]: I1014 13:14:17.676443 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a29595ee8fa59a1d5c479b5f3ce38436d4d22c41f1bcc1270e6b5ac70185385"} err="failed to get container status \"2a29595ee8fa59a1d5c479b5f3ce38436d4d22c41f1bcc1270e6b5ac70185385\": rpc error: code = NotFound desc = could not find container \"2a29595ee8fa59a1d5c479b5f3ce38436d4d22c41f1bcc1270e6b5ac70185385\": container with ID starting with 2a29595ee8fa59a1d5c479b5f3ce38436d4d22c41f1bcc1270e6b5ac70185385 not found: ID does not exist" Oct 14 13:14:17.676550 master-2 kubenswrapper[4762]: I1014 13:14:17.676464 4762 scope.go:117] "RemoveContainer" containerID="cf7f4e3c23f7b821d08fbac74a5681f0fc43971c789058587f9ad8f4d64abc5b" Oct 14 13:14:17.676828 master-2 kubenswrapper[4762]: E1014 13:14:17.676785 4762 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf7f4e3c23f7b821d08fbac74a5681f0fc43971c789058587f9ad8f4d64abc5b\": container with ID starting with cf7f4e3c23f7b821d08fbac74a5681f0fc43971c789058587f9ad8f4d64abc5b not found: ID does not exist" containerID="cf7f4e3c23f7b821d08fbac74a5681f0fc43971c789058587f9ad8f4d64abc5b" Oct 14 13:14:17.676828 master-2 kubenswrapper[4762]: I1014 13:14:17.676815 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf7f4e3c23f7b821d08fbac74a5681f0fc43971c789058587f9ad8f4d64abc5b"} err="failed to get container status \"cf7f4e3c23f7b821d08fbac74a5681f0fc43971c789058587f9ad8f4d64abc5b\": rpc error: code = NotFound desc = could not find container \"cf7f4e3c23f7b821d08fbac74a5681f0fc43971c789058587f9ad8f4d64abc5b\": container with ID starting with cf7f4e3c23f7b821d08fbac74a5681f0fc43971c789058587f9ad8f4d64abc5b not found: ID does not exist" Oct 14 13:14:17.974738 master-2 kubenswrapper[4762]: I1014 13:14:17.974696 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7b6784d654-l7lmp"] Oct 14 13:14:17.977916 master-2 kubenswrapper[4762]: W1014 13:14:17.977845 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31803cc5_bd42_4bb2_8872_79acd1f79d5b.slice/crio-806dc5a17ae69fc645fc222de2751bffc775ea4eb434a450b1b469270f16f0e8 WatchSource:0}: Error finding container 806dc5a17ae69fc645fc222de2751bffc775ea4eb434a450b1b469270f16f0e8: Status 404 returned error can't find the container with id 806dc5a17ae69fc645fc222de2751bffc775ea4eb434a450b1b469270f16f0e8 Oct 14 13:14:18.641699 master-2 kubenswrapper[4762]: I1014 13:14:18.641604 4762 generic.go:334] "Generic (PLEG): container finished" podID="31803cc5-bd42-4bb2-8872-79acd1f79d5b" containerID="17d44eb9784edc26e702ce1deec1a2332094dca3f442de74ff4d90b44b112b27" exitCode=0 Oct 14 13:14:18.642586 master-2 kubenswrapper[4762]: I1014 13:14:18.641684 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7b6784d654-l7lmp" event={"ID":"31803cc5-bd42-4bb2-8872-79acd1f79d5b","Type":"ContainerDied","Data":"17d44eb9784edc26e702ce1deec1a2332094dca3f442de74ff4d90b44b112b27"} Oct 14 13:14:18.642586 master-2 kubenswrapper[4762]: I1014 13:14:18.641769 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7b6784d654-l7lmp" event={"ID":"31803cc5-bd42-4bb2-8872-79acd1f79d5b","Type":"ContainerStarted","Data":"806dc5a17ae69fc645fc222de2751bffc775ea4eb434a450b1b469270f16f0e8"} Oct 14 13:14:18.857234 master-2 kubenswrapper[4762]: I1014 13:14:18.857185 4762 patch_prober.go:28] interesting pod/kube-apiserver-guard-master-2 container/guard namespace/openshift-kube-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:14:18.857234 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:14:18.857234 master-2 kubenswrapper[4762]: [+]api-openshift-apiserver-available ok Oct 14 13:14:18.857234 master-2 kubenswrapper[4762]: [+]api-openshift-oauth-apiserver-available ok Oct 14 13:14:18.857234 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:14:18.857234 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-api-request-count-filter ok Oct 14 13:14:18.857234 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startkubeinformers ok Oct 14 
13:14:18.857234 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Oct 14 13:14:18.857234 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Oct 14 13:14:18.857234 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:14:18.857234 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:14:18.857234 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:14:18.857234 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-consumer ok Oct 14 13:14:18.857234 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-filter ok Oct 14 13:14:18.857234 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:14:18.857234 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiextensions-informers ok Oct 14 13:14:18.857234 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiextensions-controllers ok Oct 14 13:14:18.857234 master-2 kubenswrapper[4762]: [+]poststarthook/crd-informer-synced ok Oct 14 13:14:18.857234 master-2 kubenswrapper[4762]: [+]poststarthook/start-system-namespaces-controller ok Oct 14 13:14:18.857234 master-2 kubenswrapper[4762]: [+]poststarthook/start-cluster-authentication-info-controller ok Oct 14 13:14:18.857234 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Oct 14 13:14:18.857234 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Oct 14 13:14:18.857234 master-2 kubenswrapper[4762]: [+]poststarthook/start-legacy-token-tracking-controller ok Oct 14 13:14:18.857234 master-2 kubenswrapper[4762]: [+]poststarthook/start-service-ip-repair-controllers ok Oct 14 13:14:18.857234 master-2 kubenswrapper[4762]: [+]poststarthook/rbac/bootstrap-roles ok Oct 14 13:14:18.857234 master-2 kubenswrapper[4762]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Oct 14 13:14:18.857234 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-producer ok Oct 14 13:14:18.857234 master-2 kubenswrapper[4762]: [+]poststarthook/bootstrap-controller ok Oct 14 13:14:18.857234 master-2 kubenswrapper[4762]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Oct 14 13:14:18.857234 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-aggregator-informers ok Oct 14 13:14:18.857234 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-status-local-available-controller ok Oct 14 13:14:18.857234 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-status-remote-available-controller ok Oct 14 13:14:18.857234 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-registration-controller ok Oct 14 13:14:18.857234 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-wait-for-first-sync ok Oct 14 13:14:18.857234 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-discovery-controller ok Oct 14 13:14:18.857234 master-2 kubenswrapper[4762]: [+]poststarthook/kube-apiserver-autoregistration ok Oct 14 13:14:18.857234 master-2 kubenswrapper[4762]: [+]autoregister-completion ok Oct 14 13:14:18.857234 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-openapi-controller ok Oct 14 13:14:18.857234 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-openapiv3-controller ok Oct 14 13:14:18.857234 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 
13:14:18.857234 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:14:18.858716 master-2 kubenswrapper[4762]: I1014 13:14:18.857264 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-guard-master-2" podUID="a1d6199c-769e-4363-8439-75d433c50528" containerName="guard" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:14:19.559286 master-2 kubenswrapper[4762]: I1014 13:14:19.559201 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdac5df3-de02-49f0-8b90-53464ca0b6dd" path="/var/lib/kubelet/pods/fdac5df3-de02-49f0-8b90-53464ca0b6dd/volumes" Oct 14 13:14:19.654014 master-2 kubenswrapper[4762]: I1014 13:14:19.653868 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7b6784d654-l7lmp" event={"ID":"31803cc5-bd42-4bb2-8872-79acd1f79d5b","Type":"ContainerStarted","Data":"86954708c4083ce02b3287f821780b8c962df87887fa2a2204ed39142954e4f0"} Oct 14 13:14:19.689896 master-2 kubenswrapper[4762]: I1014 13:14:19.689769 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7b6784d654-l7lmp" podStartSLOduration=54.689738697 podStartE2EDuration="54.689738697s" podCreationTimestamp="2025-10-14 13:13:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:14:19.685328072 +0000 UTC m=+488.929487291" watchObservedRunningTime="2025-10-14 13:14:19.689738697 +0000 UTC m=+488.933897866" Oct 14 13:14:22.536540 master-2 kubenswrapper[4762]: I1014 13:14:22.536416 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7b6784d654-l7lmp" Oct 14 13:14:22.538783 master-2 kubenswrapper[4762]: I1014 13:14:22.538733 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7b6784d654-l7lmp" Oct 14 13:14:22.546656 master-2 kubenswrapper[4762]: I1014 13:14:22.546607 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7b6784d654-l7lmp" Oct 14 13:14:22.676698 master-2 kubenswrapper[4762]: I1014 13:14:22.676654 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7b6784d654-l7lmp" Oct 14 13:14:23.860392 master-2 kubenswrapper[4762]: I1014 13:14:23.860321 4762 patch_prober.go:28] interesting pod/kube-apiserver-guard-master-2 container/guard namespace/openshift-kube-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:14:23.860392 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:14:23.860392 master-2 kubenswrapper[4762]: [+]api-openshift-apiserver-available ok Oct 14 13:14:23.860392 master-2 kubenswrapper[4762]: [+]api-openshift-oauth-apiserver-available ok Oct 14 13:14:23.860392 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:14:23.860392 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-api-request-count-filter ok Oct 14 13:14:23.860392 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startkubeinformers ok Oct 14 13:14:23.860392 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Oct 14 13:14:23.860392 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Oct 14 13:14:23.860392 master-2 kubenswrapper[4762]: 
[+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:14:23.860392 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:14:23.860392 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:14:23.860392 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-consumer ok Oct 14 13:14:23.860392 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-filter ok Oct 14 13:14:23.860392 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:14:23.860392 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiextensions-informers ok Oct 14 13:14:23.860392 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiextensions-controllers ok Oct 14 13:14:23.860392 master-2 kubenswrapper[4762]: [+]poststarthook/crd-informer-synced ok Oct 14 13:14:23.860392 master-2 kubenswrapper[4762]: [+]poststarthook/start-system-namespaces-controller ok Oct 14 13:14:23.860392 master-2 kubenswrapper[4762]: [+]poststarthook/start-cluster-authentication-info-controller ok Oct 14 13:14:23.860392 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Oct 14 13:14:23.860392 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Oct 14 13:14:23.860392 master-2 kubenswrapper[4762]: [+]poststarthook/start-legacy-token-tracking-controller ok Oct 14 13:14:23.860392 master-2 kubenswrapper[4762]: [+]poststarthook/start-service-ip-repair-controllers ok Oct 14 13:14:23.860392 master-2 kubenswrapper[4762]: [+]poststarthook/rbac/bootstrap-roles ok Oct 14 13:14:23.860392 master-2 kubenswrapper[4762]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Oct 14 13:14:23.860392 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-producer ok Oct 14 13:14:23.860392 master-2 kubenswrapper[4762]: [+]poststarthook/bootstrap-controller ok Oct 14 13:14:23.860392 master-2 kubenswrapper[4762]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Oct 14 13:14:23.860392 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-aggregator-informers ok Oct 14 13:14:23.860392 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-status-local-available-controller ok Oct 14 13:14:23.860392 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-status-remote-available-controller ok Oct 14 13:14:23.860392 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-registration-controller ok Oct 14 13:14:23.860392 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-wait-for-first-sync ok Oct 14 13:14:23.860392 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-discovery-controller ok Oct 14 13:14:23.860392 master-2 kubenswrapper[4762]: [+]poststarthook/kube-apiserver-autoregistration ok Oct 14 13:14:23.860392 master-2 kubenswrapper[4762]: [+]autoregister-completion ok Oct 14 13:14:23.860392 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-openapi-controller ok Oct 14 13:14:23.860392 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-openapiv3-controller ok Oct 14 13:14:23.860392 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:14:23.860392 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:14:23.860392 master-2 kubenswrapper[4762]: I1014 13:14:23.860431 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-guard-master-2" 
podUID="a1d6199c-769e-4363-8439-75d433c50528" containerName="guard" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:14:24.560169 master-2 kubenswrapper[4762]: I1014 13:14:24.560088 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-2" Oct 14 13:14:28.860622 master-2 kubenswrapper[4762]: I1014 13:14:28.860550 4762 patch_prober.go:28] interesting pod/kube-apiserver-guard-master-2 container/guard namespace/openshift-kube-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:14:28.860622 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:14:28.860622 master-2 kubenswrapper[4762]: [+]api-openshift-apiserver-available ok Oct 14 13:14:28.860622 master-2 kubenswrapper[4762]: [+]api-openshift-oauth-apiserver-available ok Oct 14 13:14:28.860622 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:14:28.860622 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-api-request-count-filter ok Oct 14 13:14:28.860622 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startkubeinformers ok Oct 14 13:14:28.860622 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Oct 14 13:14:28.860622 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Oct 14 13:14:28.860622 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:14:28.860622 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:14:28.860622 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:14:28.860622 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-consumer ok Oct 14 13:14:28.860622 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-filter ok Oct 14 13:14:28.860622 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:14:28.860622 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiextensions-informers ok Oct 14 13:14:28.860622 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiextensions-controllers ok Oct 14 13:14:28.860622 master-2 kubenswrapper[4762]: [+]poststarthook/crd-informer-synced ok Oct 14 13:14:28.860622 master-2 kubenswrapper[4762]: [+]poststarthook/start-system-namespaces-controller ok Oct 14 13:14:28.860622 master-2 kubenswrapper[4762]: [+]poststarthook/start-cluster-authentication-info-controller ok Oct 14 13:14:28.860622 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Oct 14 13:14:28.860622 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Oct 14 13:14:28.860622 master-2 kubenswrapper[4762]: [+]poststarthook/start-legacy-token-tracking-controller ok Oct 14 13:14:28.860622 master-2 kubenswrapper[4762]: [+]poststarthook/start-service-ip-repair-controllers ok Oct 14 13:14:28.860622 master-2 kubenswrapper[4762]: [+]poststarthook/rbac/bootstrap-roles ok Oct 14 13:14:28.860622 master-2 kubenswrapper[4762]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Oct 14 13:14:28.860622 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-producer ok Oct 14 13:14:28.860622 master-2 kubenswrapper[4762]: [+]poststarthook/bootstrap-controller ok Oct 14 13:14:28.860622 master-2 kubenswrapper[4762]: 
[+]poststarthook/aggregator-reload-proxy-client-cert ok Oct 14 13:14:28.860622 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-aggregator-informers ok Oct 14 13:14:28.860622 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-status-local-available-controller ok Oct 14 13:14:28.860622 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-status-remote-available-controller ok Oct 14 13:14:28.860622 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-registration-controller ok Oct 14 13:14:28.860622 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-wait-for-first-sync ok Oct 14 13:14:28.860622 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-discovery-controller ok Oct 14 13:14:28.860622 master-2 kubenswrapper[4762]: [+]poststarthook/kube-apiserver-autoregistration ok Oct 14 13:14:28.860622 master-2 kubenswrapper[4762]: [+]autoregister-completion ok Oct 14 13:14:28.860622 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-openapi-controller ok Oct 14 13:14:28.860622 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-openapiv3-controller ok Oct 14 13:14:28.860622 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:14:28.860622 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:14:28.863701 master-2 kubenswrapper[4762]: I1014 13:14:28.860632 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-guard-master-2" podUID="a1d6199c-769e-4363-8439-75d433c50528" containerName="guard" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:14:33.857629 master-2 kubenswrapper[4762]: I1014 13:14:33.857505 4762 patch_prober.go:28] interesting pod/kube-apiserver-guard-master-2 container/guard namespace/openshift-kube-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:14:33.857629 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:14:33.857629 master-2 kubenswrapper[4762]: [+]api-openshift-apiserver-available ok Oct 14 13:14:33.857629 master-2 kubenswrapper[4762]: [+]api-openshift-oauth-apiserver-available ok Oct 14 13:14:33.857629 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:14:33.857629 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-api-request-count-filter ok Oct 14 13:14:33.857629 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startkubeinformers ok Oct 14 13:14:33.857629 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Oct 14 13:14:33.857629 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Oct 14 13:14:33.857629 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:14:33.857629 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:14:33.857629 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:14:33.857629 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-consumer ok Oct 14 13:14:33.857629 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-filter ok Oct 14 13:14:33.857629 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:14:33.857629 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiextensions-informers ok Oct 14 13:14:33.857629 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiextensions-controllers ok Oct 14 
13:14:33.857629 master-2 kubenswrapper[4762]: [+]poststarthook/crd-informer-synced ok Oct 14 13:14:33.857629 master-2 kubenswrapper[4762]: [+]poststarthook/start-system-namespaces-controller ok Oct 14 13:14:33.857629 master-2 kubenswrapper[4762]: [+]poststarthook/start-cluster-authentication-info-controller ok Oct 14 13:14:33.857629 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Oct 14 13:14:33.857629 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Oct 14 13:14:33.857629 master-2 kubenswrapper[4762]: [+]poststarthook/start-legacy-token-tracking-controller ok Oct 14 13:14:33.857629 master-2 kubenswrapper[4762]: [+]poststarthook/start-service-ip-repair-controllers ok Oct 14 13:14:33.857629 master-2 kubenswrapper[4762]: [+]poststarthook/rbac/bootstrap-roles ok Oct 14 13:14:33.857629 master-2 kubenswrapper[4762]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Oct 14 13:14:33.857629 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-producer ok Oct 14 13:14:33.857629 master-2 kubenswrapper[4762]: [+]poststarthook/bootstrap-controller ok Oct 14 13:14:33.857629 master-2 kubenswrapper[4762]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Oct 14 13:14:33.857629 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-aggregator-informers ok Oct 14 13:14:33.857629 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-status-local-available-controller ok Oct 14 13:14:33.857629 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-status-remote-available-controller ok Oct 14 13:14:33.857629 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-registration-controller ok Oct 14 13:14:33.857629 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-wait-for-first-sync ok Oct 14 13:14:33.857629 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-discovery-controller ok Oct 14 13:14:33.857629 master-2 kubenswrapper[4762]: [+]poststarthook/kube-apiserver-autoregistration ok Oct 14 13:14:33.857629 master-2 kubenswrapper[4762]: [+]autoregister-completion ok Oct 14 13:14:33.857629 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-openapi-controller ok Oct 14 13:14:33.857629 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-openapiv3-controller ok Oct 14 13:14:33.857629 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:14:33.857629 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:14:33.857629 master-2 kubenswrapper[4762]: I1014 13:14:33.857622 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-guard-master-2" podUID="a1d6199c-769e-4363-8439-75d433c50528" containerName="guard" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:14:38.857754 master-2 kubenswrapper[4762]: I1014 13:14:38.857680 4762 patch_prober.go:28] interesting pod/kube-apiserver-guard-master-2 container/guard namespace/openshift-kube-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:14:38.857754 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:14:38.857754 master-2 kubenswrapper[4762]: [+]api-openshift-apiserver-available ok Oct 14 13:14:38.857754 master-2 kubenswrapper[4762]: [+]api-openshift-oauth-apiserver-available ok Oct 14 13:14:38.857754 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:14:38.857754 master-2 kubenswrapper[4762]: 
[+]poststarthook/openshift.io-api-request-count-filter ok Oct 14 13:14:38.857754 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startkubeinformers ok Oct 14 13:14:38.857754 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Oct 14 13:14:38.857754 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Oct 14 13:14:38.857754 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:14:38.857754 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:14:38.857754 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:14:38.857754 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-consumer ok Oct 14 13:14:38.857754 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-filter ok Oct 14 13:14:38.857754 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:14:38.857754 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiextensions-informers ok Oct 14 13:14:38.857754 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiextensions-controllers ok Oct 14 13:14:38.857754 master-2 kubenswrapper[4762]: [+]poststarthook/crd-informer-synced ok Oct 14 13:14:38.857754 master-2 kubenswrapper[4762]: [+]poststarthook/start-system-namespaces-controller ok Oct 14 13:14:38.857754 master-2 kubenswrapper[4762]: [+]poststarthook/start-cluster-authentication-info-controller ok Oct 14 13:14:38.857754 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Oct 14 13:14:38.857754 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Oct 14 13:14:38.857754 master-2 kubenswrapper[4762]: [+]poststarthook/start-legacy-token-tracking-controller ok Oct 14 13:14:38.857754 master-2 kubenswrapper[4762]: [+]poststarthook/start-service-ip-repair-controllers ok Oct 14 13:14:38.857754 master-2 kubenswrapper[4762]: [+]poststarthook/rbac/bootstrap-roles ok Oct 14 13:14:38.857754 master-2 kubenswrapper[4762]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Oct 14 13:14:38.857754 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-producer ok Oct 14 13:14:38.857754 master-2 kubenswrapper[4762]: [+]poststarthook/bootstrap-controller ok Oct 14 13:14:38.857754 master-2 kubenswrapper[4762]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Oct 14 13:14:38.857754 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-aggregator-informers ok Oct 14 13:14:38.857754 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-status-local-available-controller ok Oct 14 13:14:38.857754 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-status-remote-available-controller ok Oct 14 13:14:38.857754 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-registration-controller ok Oct 14 13:14:38.857754 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-wait-for-first-sync ok Oct 14 13:14:38.857754 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-discovery-controller ok Oct 14 13:14:38.857754 master-2 kubenswrapper[4762]: [+]poststarthook/kube-apiserver-autoregistration ok Oct 14 13:14:38.857754 master-2 kubenswrapper[4762]: [+]autoregister-completion ok Oct 14 13:14:38.857754 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-openapi-controller ok Oct 14 13:14:38.857754 master-2 
kubenswrapper[4762]: [+]poststarthook/apiservice-openapiv3-controller ok Oct 14 13:14:38.857754 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:14:38.857754 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:14:38.859373 master-2 kubenswrapper[4762]: I1014 13:14:38.857831 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-guard-master-2" podUID="a1d6199c-769e-4363-8439-75d433c50528" containerName="guard" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:14:40.113673 master-2 kubenswrapper[4762]: I1014 13:14:40.113601 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_machine-config-server-zxkbj_063758c3-98fe-4ac4-b1c2-beef7ef6dcdc/machine-config-server/0.log" Oct 14 13:14:43.861422 master-2 kubenswrapper[4762]: I1014 13:14:43.861257 4762 patch_prober.go:28] interesting pod/kube-apiserver-guard-master-2 container/guard namespace/openshift-kube-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:14:43.861422 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:14:43.861422 master-2 kubenswrapper[4762]: [+]api-openshift-apiserver-available ok Oct 14 13:14:43.861422 master-2 kubenswrapper[4762]: [+]api-openshift-oauth-apiserver-available ok Oct 14 13:14:43.861422 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:14:43.861422 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-api-request-count-filter ok Oct 14 13:14:43.861422 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startkubeinformers ok Oct 14 13:14:43.861422 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Oct 14 13:14:43.861422 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Oct 14 13:14:43.861422 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:14:43.861422 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:14:43.861422 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:14:43.861422 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-consumer ok Oct 14 13:14:43.861422 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-filter ok Oct 14 13:14:43.861422 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:14:43.861422 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiextensions-informers ok Oct 14 13:14:43.861422 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiextensions-controllers ok Oct 14 13:14:43.861422 master-2 kubenswrapper[4762]: [+]poststarthook/crd-informer-synced ok Oct 14 13:14:43.861422 master-2 kubenswrapper[4762]: [+]poststarthook/start-system-namespaces-controller ok Oct 14 13:14:43.861422 master-2 kubenswrapper[4762]: [+]poststarthook/start-cluster-authentication-info-controller ok Oct 14 13:14:43.861422 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Oct 14 13:14:43.861422 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Oct 14 13:14:43.861422 master-2 kubenswrapper[4762]: [+]poststarthook/start-legacy-token-tracking-controller ok Oct 14 13:14:43.861422 master-2 kubenswrapper[4762]: [+]poststarthook/start-service-ip-repair-controllers ok 
Oct 14 13:14:43.861422 master-2 kubenswrapper[4762]: [+]poststarthook/rbac/bootstrap-roles ok Oct 14 13:14:43.861422 master-2 kubenswrapper[4762]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Oct 14 13:14:43.861422 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-producer ok Oct 14 13:14:43.861422 master-2 kubenswrapper[4762]: [+]poststarthook/bootstrap-controller ok Oct 14 13:14:43.861422 master-2 kubenswrapper[4762]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Oct 14 13:14:43.861422 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-aggregator-informers ok Oct 14 13:14:43.861422 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-status-local-available-controller ok Oct 14 13:14:43.861422 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-status-remote-available-controller ok Oct 14 13:14:43.861422 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-registration-controller ok Oct 14 13:14:43.861422 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-wait-for-first-sync ok Oct 14 13:14:43.861422 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-discovery-controller ok Oct 14 13:14:43.861422 master-2 kubenswrapper[4762]: [+]poststarthook/kube-apiserver-autoregistration ok Oct 14 13:14:43.861422 master-2 kubenswrapper[4762]: [+]autoregister-completion ok Oct 14 13:14:43.861422 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-openapi-controller ok Oct 14 13:14:43.861422 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-openapiv3-controller ok Oct 14 13:14:43.861422 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:14:43.861422 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:14:43.861422 master-2 kubenswrapper[4762]: I1014 13:14:43.861337 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-guard-master-2" podUID="a1d6199c-769e-4363-8439-75d433c50528" containerName="guard" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:14:48.180728 master-2 kubenswrapper[4762]: I1014 13:14:48.180660 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/metrics-server-8475fbcb68-8dq9n"] Oct 14 13:14:48.181367 master-2 kubenswrapper[4762]: I1014 13:14:48.180998 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/metrics-server-8475fbcb68-8dq9n" podUID="949ffee6-8997-4b92-84c3-4aeb1121bbe1" containerName="metrics-server" containerID="cri-o://bbd14ec96da76e6b6b207839405f5858f9bdefe7cec9b0ffa533a5f314702f25" gracePeriod=170 Oct 14 13:14:48.857490 master-2 kubenswrapper[4762]: I1014 13:14:48.857423 4762 patch_prober.go:28] interesting pod/kube-apiserver-guard-master-2 container/guard namespace/openshift-kube-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:14:48.857490 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:14:48.857490 master-2 kubenswrapper[4762]: [+]api-openshift-apiserver-available ok Oct 14 13:14:48.857490 master-2 kubenswrapper[4762]: [+]api-openshift-oauth-apiserver-available ok Oct 14 13:14:48.857490 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:14:48.857490 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-api-request-count-filter ok Oct 14 13:14:48.857490 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startkubeinformers ok Oct 14 13:14:48.857490 master-2 kubenswrapper[4762]: 
[+]poststarthook/openshift.io-openshift-apiserver-reachable ok Oct 14 13:14:48.857490 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Oct 14 13:14:48.857490 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:14:48.857490 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:14:48.857490 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:14:48.857490 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-consumer ok Oct 14 13:14:48.857490 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-filter ok Oct 14 13:14:48.857490 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:14:48.857490 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiextensions-informers ok Oct 14 13:14:48.857490 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiextensions-controllers ok Oct 14 13:14:48.857490 master-2 kubenswrapper[4762]: [+]poststarthook/crd-informer-synced ok Oct 14 13:14:48.857490 master-2 kubenswrapper[4762]: [+]poststarthook/start-system-namespaces-controller ok Oct 14 13:14:48.857490 master-2 kubenswrapper[4762]: [+]poststarthook/start-cluster-authentication-info-controller ok Oct 14 13:14:48.857490 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Oct 14 13:14:48.857490 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Oct 14 13:14:48.857490 master-2 kubenswrapper[4762]: [+]poststarthook/start-legacy-token-tracking-controller ok Oct 14 13:14:48.857490 master-2 kubenswrapper[4762]: [+]poststarthook/start-service-ip-repair-controllers ok Oct 14 13:14:48.857490 master-2 kubenswrapper[4762]: [+]poststarthook/rbac/bootstrap-roles ok Oct 14 13:14:48.857490 master-2 kubenswrapper[4762]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Oct 14 13:14:48.857490 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-producer ok Oct 14 13:14:48.857490 master-2 kubenswrapper[4762]: [+]poststarthook/bootstrap-controller ok Oct 14 13:14:48.857490 master-2 kubenswrapper[4762]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Oct 14 13:14:48.857490 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-aggregator-informers ok Oct 14 13:14:48.857490 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-status-local-available-controller ok Oct 14 13:14:48.857490 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-status-remote-available-controller ok Oct 14 13:14:48.857490 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-registration-controller ok Oct 14 13:14:48.857490 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-wait-for-first-sync ok Oct 14 13:14:48.857490 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-discovery-controller ok Oct 14 13:14:48.857490 master-2 kubenswrapper[4762]: [+]poststarthook/kube-apiserver-autoregistration ok Oct 14 13:14:48.857490 master-2 kubenswrapper[4762]: [+]autoregister-completion ok Oct 14 13:14:48.857490 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-openapi-controller ok Oct 14 13:14:48.857490 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-openapiv3-controller ok Oct 14 13:14:48.857490 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:14:48.857490 master-2 kubenswrapper[4762]: readyz 
check failed Oct 14 13:14:48.859910 master-2 kubenswrapper[4762]: I1014 13:14:48.859306 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-guard-master-2" podUID="a1d6199c-769e-4363-8439-75d433c50528" containerName="guard" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:14:50.837436 master-2 kubenswrapper[4762]: I1014 13:14:50.837334 4762 generic.go:334] "Generic (PLEG): container finished" podID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerID="0a0bc0bcb6877389fc825b824eda24c17af5655791500c8ef2590eb00f894909" exitCode=0 Oct 14 13:14:50.837436 master-2 kubenswrapper[4762]: I1014 13:14:50.837389 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5ddb89f76-887cs" event={"ID":"f82e0c58-e2a3-491a-bf03-ad47b38c5833","Type":"ContainerDied","Data":"0a0bc0bcb6877389fc825b824eda24c17af5655791500c8ef2590eb00f894909"} Oct 14 13:14:50.837436 master-2 kubenswrapper[4762]: I1014 13:14:50.837427 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5ddb89f76-887cs" event={"ID":"f82e0c58-e2a3-491a-bf03-ad47b38c5833","Type":"ContainerStarted","Data":"057583bf3783414547befa224cdecf27ac7a84e0c0a4ef9a6cd7473f3af7d3db"} Oct 14 13:14:50.837436 master-2 kubenswrapper[4762]: I1014 13:14:50.837447 4762 scope.go:117] "RemoveContainer" containerID="55da1c19b96c1c89292ad340fab59b6898e10e1d95dce9f948d8c32e32bcd047" Oct 14 13:14:50.895453 master-2 kubenswrapper[4762]: I1014 13:14:50.895374 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5ddb89f76-887cs" Oct 14 13:14:50.898609 master-2 kubenswrapper[4762]: I1014 13:14:50.898544 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:14:50.898609 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:14:50.898609 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:14:50.898609 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:14:50.898933 master-2 kubenswrapper[4762]: I1014 13:14:50.898616 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:14:51.897344 master-2 kubenswrapper[4762]: I1014 13:14:51.897244 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:14:51.897344 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:14:51.897344 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:14:51.897344 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:14:51.898401 master-2 kubenswrapper[4762]: I1014 13:14:51.897414 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:14:52.897888 master-2 kubenswrapper[4762]: I1014 13:14:52.897805 4762 
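
The interleaved openshift-ingress records follow the same probing pattern: the kubelet's startup probe for the router container hits the router's health endpoint, the endpoint keeps reporting [-]backend-http and [-]has-synced until HAProxy is serving and the initial route state has been synced, and each resulting HTTP 500 is logged as a startup-probe failure (one per second in this stretch, from 13:14:50 onward, resuming directly below). A minimal sketch of that probe loop follows; the URL used here (port 1936, path /healthz/ready) is the usual ingress-router default but is an assumption, not something taken from this log:

    // router_probe.go - sketch of the kubelet's HTTP probe semantics applied to
    // the ingress router. Port 1936 and path /healthz/ready are assumed defaults;
    // check them against the actual router deployment.
    package main

    import (
        "fmt"
        "io"
        "net/http"
        "time"
    )

    func probeOnce(url string) bool {
        client := &http.Client{Timeout: time.Second}
        resp, err := client.Get(url)
        if err != nil {
            fmt.Println("probe error:", err)
            return false
        }
        defer resp.Body.Close()
        body, _ := io.ReadAll(resp.Body)
        fmt.Printf("status=%d\n%s", resp.StatusCode, body) // shows the [+]/[-] check lines
        // Same rule the kubelet applies: 2xx/3xx is success, anything else fails.
        return resp.StatusCode >= 200 && resp.StatusCode < 400
    }

    func main() {
        // Retried once per second, mirroring the one-failure-per-second cadence
        // visible in the surrounding log entries.
        for !probeOnce("http://localhost:1936/healthz/ready") {
            time.Sleep(time.Second)
        }
        fmt.Println("router startup probe succeeded")
    }

The router startup-probe records resume below.
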
patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:14:52.897888 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:14:52.897888 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:14:52.897888 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:14:52.898762 master-2 kubenswrapper[4762]: I1014 13:14:52.897905 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:14:53.853105 master-2 kubenswrapper[4762]: I1014 13:14:53.853046 4762 patch_prober.go:28] interesting pod/kube-apiserver-guard-master-2 container/guard namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.34.12:6443/readyz\": dial tcp 192.168.34.12:6443: connect: connection refused" start-of-body= Oct 14 13:14:53.853614 master-2 kubenswrapper[4762]: I1014 13:14:53.853585 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-guard-master-2" podUID="a1d6199c-769e-4363-8439-75d433c50528" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.12:6443/readyz\": dial tcp 192.168.34.12:6443: connect: connection refused" Oct 14 13:14:53.897215 master-2 kubenswrapper[4762]: I1014 13:14:53.896938 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:14:53.897215 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:14:53.897215 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:14:53.897215 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:14:53.897215 master-2 kubenswrapper[4762]: I1014 13:14:53.897039 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:14:54.187881 master-2 kubenswrapper[4762]: I1014 13:14:54.187830 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-2_79535145b65a4e1d50292e1c2670257a/kube-apiserver-cert-syncer/0.log" Oct 14 13:14:54.189315 master-2 kubenswrapper[4762]: I1014 13:14:54.189282 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-2" Oct 14 13:14:54.195071 master-2 kubenswrapper[4762]: I1014 13:14:54.195001 4762 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-2" oldPodUID="79535145b65a4e1d50292e1c2670257a" podUID="9041570beb5002e8da158e70e12f0c16" Oct 14 13:14:54.367707 master-2 kubenswrapper[4762]: I1014 13:14:54.367514 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/79535145b65a4e1d50292e1c2670257a-audit-dir\") pod \"79535145b65a4e1d50292e1c2670257a\" (UID: \"79535145b65a4e1d50292e1c2670257a\") " Oct 14 13:14:54.367707 master-2 kubenswrapper[4762]: I1014 13:14:54.367590 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/79535145b65a4e1d50292e1c2670257a-resource-dir\") pod \"79535145b65a4e1d50292e1c2670257a\" (UID: \"79535145b65a4e1d50292e1c2670257a\") " Oct 14 13:14:54.367707 master-2 kubenswrapper[4762]: I1014 13:14:54.367633 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/79535145b65a4e1d50292e1c2670257a-cert-dir\") pod \"79535145b65a4e1d50292e1c2670257a\" (UID: \"79535145b65a4e1d50292e1c2670257a\") " Oct 14 13:14:54.367707 master-2 kubenswrapper[4762]: I1014 13:14:54.367667 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79535145b65a4e1d50292e1c2670257a-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "79535145b65a4e1d50292e1c2670257a" (UID: "79535145b65a4e1d50292e1c2670257a"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:14:54.367707 master-2 kubenswrapper[4762]: I1014 13:14:54.367688 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79535145b65a4e1d50292e1c2670257a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "79535145b65a4e1d50292e1c2670257a" (UID: "79535145b65a4e1d50292e1c2670257a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:14:54.368298 master-2 kubenswrapper[4762]: I1014 13:14:54.367819 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79535145b65a4e1d50292e1c2670257a-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "79535145b65a4e1d50292e1c2670257a" (UID: "79535145b65a4e1d50292e1c2670257a"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:14:54.368298 master-2 kubenswrapper[4762]: I1014 13:14:54.367998 4762 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/79535145b65a4e1d50292e1c2670257a-cert-dir\") on node \"master-2\" DevicePath \"\"" Oct 14 13:14:54.368298 master-2 kubenswrapper[4762]: I1014 13:14:54.368013 4762 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/79535145b65a4e1d50292e1c2670257a-audit-dir\") on node \"master-2\" DevicePath \"\"" Oct 14 13:14:54.368298 master-2 kubenswrapper[4762]: I1014 13:14:54.368022 4762 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/79535145b65a4e1d50292e1c2670257a-resource-dir\") on node \"master-2\" DevicePath \"\"" Oct 14 13:14:54.865176 master-2 kubenswrapper[4762]: I1014 13:14:54.865119 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-2_79535145b65a4e1d50292e1c2670257a/kube-apiserver-cert-syncer/0.log" Oct 14 13:14:54.866058 master-2 kubenswrapper[4762]: I1014 13:14:54.866012 4762 generic.go:334] "Generic (PLEG): container finished" podID="79535145b65a4e1d50292e1c2670257a" containerID="a00e67168756607c42fb7608d7f9c240e721a608698655834914631238bee1ba" exitCode=0 Oct 14 13:14:54.866116 master-2 kubenswrapper[4762]: I1014 13:14:54.866084 4762 scope.go:117] "RemoveContainer" containerID="6b10e6673b93c4a192c56727c3cb7534636f9b2a41ce2b6024c0ba9c263d8bdc" Oct 14 13:14:54.866269 master-2 kubenswrapper[4762]: I1014 13:14:54.866224 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-2" Oct 14 13:14:54.872643 master-2 kubenswrapper[4762]: I1014 13:14:54.872592 4762 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-2" oldPodUID="79535145b65a4e1d50292e1c2670257a" podUID="9041570beb5002e8da158e70e12f0c16" Oct 14 13:14:54.883949 master-2 kubenswrapper[4762]: I1014 13:14:54.883678 4762 scope.go:117] "RemoveContainer" containerID="d9d2221fab10b9011cadcd36f834cf0835e5aaeb80376ffb7752afdf91447fd8" Oct 14 13:14:54.897816 master-2 kubenswrapper[4762]: I1014 13:14:54.897021 4762 scope.go:117] "RemoveContainer" containerID="4a1bfa0a3404825bcf8ebab4cdda7490f674619956b402166636fd113314e8c2" Oct 14 13:14:54.897816 master-2 kubenswrapper[4762]: I1014 13:14:54.897726 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:14:54.897816 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:14:54.897816 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:14:54.897816 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:14:54.897816 master-2 kubenswrapper[4762]: I1014 13:14:54.897775 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:14:54.898378 master-2 kubenswrapper[4762]: I1014 13:14:54.897925 4762 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" 
pod="openshift-kube-apiserver/kube-apiserver-master-2" oldPodUID="79535145b65a4e1d50292e1c2670257a" podUID="9041570beb5002e8da158e70e12f0c16" Oct 14 13:14:54.909813 master-2 kubenswrapper[4762]: I1014 13:14:54.909747 4762 scope.go:117] "RemoveContainer" containerID="5ce4974fc0875864b1435afc4e321ca895fa86762184409dc8aa8ab244db1a12" Oct 14 13:14:54.925298 master-2 kubenswrapper[4762]: I1014 13:14:54.923381 4762 scope.go:117] "RemoveContainer" containerID="a00e67168756607c42fb7608d7f9c240e721a608698655834914631238bee1ba" Oct 14 13:14:54.947198 master-2 kubenswrapper[4762]: I1014 13:14:54.947148 4762 scope.go:117] "RemoveContainer" containerID="03ee826fcc201dac1b360f7f1ba1f945c12a67d2f924513a88a5779ad17b7e69" Oct 14 13:14:54.968235 master-2 kubenswrapper[4762]: I1014 13:14:54.968139 4762 scope.go:117] "RemoveContainer" containerID="6b10e6673b93c4a192c56727c3cb7534636f9b2a41ce2b6024c0ba9c263d8bdc" Oct 14 13:14:54.968888 master-2 kubenswrapper[4762]: E1014 13:14:54.968813 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b10e6673b93c4a192c56727c3cb7534636f9b2a41ce2b6024c0ba9c263d8bdc\": container with ID starting with 6b10e6673b93c4a192c56727c3cb7534636f9b2a41ce2b6024c0ba9c263d8bdc not found: ID does not exist" containerID="6b10e6673b93c4a192c56727c3cb7534636f9b2a41ce2b6024c0ba9c263d8bdc" Oct 14 13:14:54.968959 master-2 kubenswrapper[4762]: I1014 13:14:54.968891 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b10e6673b93c4a192c56727c3cb7534636f9b2a41ce2b6024c0ba9c263d8bdc"} err="failed to get container status \"6b10e6673b93c4a192c56727c3cb7534636f9b2a41ce2b6024c0ba9c263d8bdc\": rpc error: code = NotFound desc = could not find container \"6b10e6673b93c4a192c56727c3cb7534636f9b2a41ce2b6024c0ba9c263d8bdc\": container with ID starting with 6b10e6673b93c4a192c56727c3cb7534636f9b2a41ce2b6024c0ba9c263d8bdc not found: ID does not exist" Oct 14 13:14:54.968959 master-2 kubenswrapper[4762]: I1014 13:14:54.968949 4762 scope.go:117] "RemoveContainer" containerID="d9d2221fab10b9011cadcd36f834cf0835e5aaeb80376ffb7752afdf91447fd8" Oct 14 13:14:54.969867 master-2 kubenswrapper[4762]: E1014 13:14:54.969795 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9d2221fab10b9011cadcd36f834cf0835e5aaeb80376ffb7752afdf91447fd8\": container with ID starting with d9d2221fab10b9011cadcd36f834cf0835e5aaeb80376ffb7752afdf91447fd8 not found: ID does not exist" containerID="d9d2221fab10b9011cadcd36f834cf0835e5aaeb80376ffb7752afdf91447fd8" Oct 14 13:14:54.969953 master-2 kubenswrapper[4762]: I1014 13:14:54.969868 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9d2221fab10b9011cadcd36f834cf0835e5aaeb80376ffb7752afdf91447fd8"} err="failed to get container status \"d9d2221fab10b9011cadcd36f834cf0835e5aaeb80376ffb7752afdf91447fd8\": rpc error: code = NotFound desc = could not find container \"d9d2221fab10b9011cadcd36f834cf0835e5aaeb80376ffb7752afdf91447fd8\": container with ID starting with d9d2221fab10b9011cadcd36f834cf0835e5aaeb80376ffb7752afdf91447fd8 not found: ID does not exist" Oct 14 13:14:54.969953 master-2 kubenswrapper[4762]: I1014 13:14:54.969911 4762 scope.go:117] "RemoveContainer" containerID="4a1bfa0a3404825bcf8ebab4cdda7490f674619956b402166636fd113314e8c2" Oct 14 13:14:54.970359 master-2 kubenswrapper[4762]: E1014 13:14:54.970324 4762 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a1bfa0a3404825bcf8ebab4cdda7490f674619956b402166636fd113314e8c2\": container with ID starting with 4a1bfa0a3404825bcf8ebab4cdda7490f674619956b402166636fd113314e8c2 not found: ID does not exist" containerID="4a1bfa0a3404825bcf8ebab4cdda7490f674619956b402166636fd113314e8c2" Oct 14 13:14:54.970425 master-2 kubenswrapper[4762]: I1014 13:14:54.970368 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a1bfa0a3404825bcf8ebab4cdda7490f674619956b402166636fd113314e8c2"} err="failed to get container status \"4a1bfa0a3404825bcf8ebab4cdda7490f674619956b402166636fd113314e8c2\": rpc error: code = NotFound desc = could not find container \"4a1bfa0a3404825bcf8ebab4cdda7490f674619956b402166636fd113314e8c2\": container with ID starting with 4a1bfa0a3404825bcf8ebab4cdda7490f674619956b402166636fd113314e8c2 not found: ID does not exist" Oct 14 13:14:54.970425 master-2 kubenswrapper[4762]: I1014 13:14:54.970385 4762 scope.go:117] "RemoveContainer" containerID="5ce4974fc0875864b1435afc4e321ca895fa86762184409dc8aa8ab244db1a12" Oct 14 13:14:54.970868 master-2 kubenswrapper[4762]: E1014 13:14:54.970814 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ce4974fc0875864b1435afc4e321ca895fa86762184409dc8aa8ab244db1a12\": container with ID starting with 5ce4974fc0875864b1435afc4e321ca895fa86762184409dc8aa8ab244db1a12 not found: ID does not exist" containerID="5ce4974fc0875864b1435afc4e321ca895fa86762184409dc8aa8ab244db1a12" Oct 14 13:14:54.970868 master-2 kubenswrapper[4762]: I1014 13:14:54.970860 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ce4974fc0875864b1435afc4e321ca895fa86762184409dc8aa8ab244db1a12"} err="failed to get container status \"5ce4974fc0875864b1435afc4e321ca895fa86762184409dc8aa8ab244db1a12\": rpc error: code = NotFound desc = could not find container \"5ce4974fc0875864b1435afc4e321ca895fa86762184409dc8aa8ab244db1a12\": container with ID starting with 5ce4974fc0875864b1435afc4e321ca895fa86762184409dc8aa8ab244db1a12 not found: ID does not exist" Oct 14 13:14:54.971129 master-2 kubenswrapper[4762]: I1014 13:14:54.970961 4762 scope.go:117] "RemoveContainer" containerID="a00e67168756607c42fb7608d7f9c240e721a608698655834914631238bee1ba" Oct 14 13:14:54.971608 master-2 kubenswrapper[4762]: E1014 13:14:54.971579 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a00e67168756607c42fb7608d7f9c240e721a608698655834914631238bee1ba\": container with ID starting with a00e67168756607c42fb7608d7f9c240e721a608698655834914631238bee1ba not found: ID does not exist" containerID="a00e67168756607c42fb7608d7f9c240e721a608698655834914631238bee1ba" Oct 14 13:14:54.971608 master-2 kubenswrapper[4762]: I1014 13:14:54.971609 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a00e67168756607c42fb7608d7f9c240e721a608698655834914631238bee1ba"} err="failed to get container status \"a00e67168756607c42fb7608d7f9c240e721a608698655834914631238bee1ba\": rpc error: code = NotFound desc = could not find container \"a00e67168756607c42fb7608d7f9c240e721a608698655834914631238bee1ba\": container with ID starting with a00e67168756607c42fb7608d7f9c240e721a608698655834914631238bee1ba not found: ID does not exist" Oct 14 
13:14:54.971850 master-2 kubenswrapper[4762]: I1014 13:14:54.971626 4762 scope.go:117] "RemoveContainer" containerID="03ee826fcc201dac1b360f7f1ba1f945c12a67d2f924513a88a5779ad17b7e69" Oct 14 13:14:54.972094 master-2 kubenswrapper[4762]: E1014 13:14:54.972044 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03ee826fcc201dac1b360f7f1ba1f945c12a67d2f924513a88a5779ad17b7e69\": container with ID starting with 03ee826fcc201dac1b360f7f1ba1f945c12a67d2f924513a88a5779ad17b7e69 not found: ID does not exist" containerID="03ee826fcc201dac1b360f7f1ba1f945c12a67d2f924513a88a5779ad17b7e69" Oct 14 13:14:54.972094 master-2 kubenswrapper[4762]: I1014 13:14:54.972067 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03ee826fcc201dac1b360f7f1ba1f945c12a67d2f924513a88a5779ad17b7e69"} err="failed to get container status \"03ee826fcc201dac1b360f7f1ba1f945c12a67d2f924513a88a5779ad17b7e69\": rpc error: code = NotFound desc = could not find container \"03ee826fcc201dac1b360f7f1ba1f945c12a67d2f924513a88a5779ad17b7e69\": container with ID starting with 03ee826fcc201dac1b360f7f1ba1f945c12a67d2f924513a88a5779ad17b7e69 not found: ID does not exist" Oct 14 13:14:55.560416 master-2 kubenswrapper[4762]: I1014 13:14:55.560304 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79535145b65a4e1d50292e1c2670257a" path="/var/lib/kubelet/pods/79535145b65a4e1d50292e1c2670257a/volumes" Oct 14 13:14:55.895408 master-2 kubenswrapper[4762]: I1014 13:14:55.895251 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5ddb89f76-887cs" Oct 14 13:14:55.897976 master-2 kubenswrapper[4762]: I1014 13:14:55.897932 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:14:55.897976 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:14:55.897976 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:14:55.897976 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:14:55.898258 master-2 kubenswrapper[4762]: I1014 13:14:55.897997 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:14:56.898131 master-2 kubenswrapper[4762]: I1014 13:14:56.898028 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:14:56.898131 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:14:56.898131 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:14:56.898131 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:14:56.898131 master-2 kubenswrapper[4762]: I1014 13:14:56.898122 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:14:57.896990 master-2 kubenswrapper[4762]: I1014 
13:14:57.896919 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:14:57.896990 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:14:57.896990 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:14:57.896990 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:14:57.897691 master-2 kubenswrapper[4762]: I1014 13:14:57.897638 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:14:58.548423 master-2 kubenswrapper[4762]: I1014 13:14:58.548322 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-2" Oct 14 13:14:58.566850 master-2 kubenswrapper[4762]: I1014 13:14:58.566789 4762 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-2" podUID="ea7edee6-939a-4b2c-ac8f-c4467348f1c7" Oct 14 13:14:58.566850 master-2 kubenswrapper[4762]: I1014 13:14:58.566844 4762 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-2" podUID="ea7edee6-939a-4b2c-ac8f-c4467348f1c7" Oct 14 13:14:58.583952 master-2 kubenswrapper[4762]: I1014 13:14:58.583835 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-2"] Oct 14 13:14:58.588504 master-2 kubenswrapper[4762]: I1014 13:14:58.588447 4762 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-master-2" Oct 14 13:14:58.594508 master-2 kubenswrapper[4762]: I1014 13:14:58.594444 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-2"] Oct 14 13:14:58.608030 master-2 kubenswrapper[4762]: I1014 13:14:58.607955 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-2" Oct 14 13:14:58.614207 master-2 kubenswrapper[4762]: I1014 13:14:58.614107 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-2"] Oct 14 13:14:58.634202 master-2 kubenswrapper[4762]: W1014 13:14:58.634094 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9041570beb5002e8da158e70e12f0c16.slice/crio-453ee5f01b8c9543b6f2ff7c2484d1ca1db3375e32b10a3fcf8a868bcea6ed64 WatchSource:0}: Error finding container 453ee5f01b8c9543b6f2ff7c2484d1ca1db3375e32b10a3fcf8a868bcea6ed64: Status 404 returned error can't find the container with id 453ee5f01b8c9543b6f2ff7c2484d1ca1db3375e32b10a3fcf8a868bcea6ed64 Oct 14 13:14:58.853359 master-2 kubenswrapper[4762]: I1014 13:14:58.853310 4762 patch_prober.go:28] interesting pod/kube-apiserver-guard-master-2 container/guard namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.34.12:6443/readyz\": dial tcp 192.168.34.12:6443: connect: connection refused" start-of-body= Oct 14 13:14:58.853507 master-2 kubenswrapper[4762]: I1014 13:14:58.853390 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-guard-master-2" podUID="a1d6199c-769e-4363-8439-75d433c50528" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.12:6443/readyz\": dial tcp 192.168.34.12:6443: connect: connection refused" Oct 14 13:14:58.892376 master-2 kubenswrapper[4762]: I1014 13:14:58.892316 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-2" event={"ID":"9041570beb5002e8da158e70e12f0c16","Type":"ContainerStarted","Data":"453ee5f01b8c9543b6f2ff7c2484d1ca1db3375e32b10a3fcf8a868bcea6ed64"} Oct 14 13:14:58.896853 master-2 kubenswrapper[4762]: I1014 13:14:58.896784 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:14:58.896853 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:14:58.896853 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:14:58.896853 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:14:58.896990 master-2 kubenswrapper[4762]: I1014 13:14:58.896882 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:14:59.897293 master-2 kubenswrapper[4762]: I1014 13:14:59.897228 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:14:59.897293 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:14:59.897293 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:14:59.897293 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:14:59.898425 master-2 kubenswrapper[4762]: I1014 13:14:59.897328 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" 
podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:14:59.901971 master-2 kubenswrapper[4762]: I1014 13:14:59.901910 4762 generic.go:334] "Generic (PLEG): container finished" podID="9041570beb5002e8da158e70e12f0c16" containerID="eed8af2f07f9d3b49a147d58436d080d78e0511d1542ac74b4feff6605710d6d" exitCode=0 Oct 14 13:14:59.902119 master-2 kubenswrapper[4762]: I1014 13:14:59.901967 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-2" event={"ID":"9041570beb5002e8da158e70e12f0c16","Type":"ContainerDied","Data":"eed8af2f07f9d3b49a147d58436d080d78e0511d1542ac74b4feff6605710d6d"} Oct 14 13:15:00.184546 master-2 kubenswrapper[4762]: I1014 13:15:00.184494 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29340795-t5kx5"] Oct 14 13:15:00.185147 master-2 kubenswrapper[4762]: I1014 13:15:00.185122 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29340795-t5kx5" Oct 14 13:15:00.187504 master-2 kubenswrapper[4762]: I1014 13:15:00.187432 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 14 13:15:00.187504 master-2 kubenswrapper[4762]: I1014 13:15:00.187435 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-t5gjh" Oct 14 13:15:00.188112 master-2 kubenswrapper[4762]: I1014 13:15:00.188083 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Oct 14 13:15:00.195210 master-2 kubenswrapper[4762]: I1014 13:15:00.195133 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29340795-t5kx5"] Oct 14 13:15:00.343476 master-2 kubenswrapper[4762]: I1014 13:15:00.341005 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/29941561-0dd2-4fbc-a503-f42b2527e405-secret-volume\") pod \"collect-profiles-29340795-t5kx5\" (UID: \"29941561-0dd2-4fbc-a503-f42b2527e405\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29340795-t5kx5" Oct 14 13:15:00.343476 master-2 kubenswrapper[4762]: I1014 13:15:00.341063 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fwbf\" (UniqueName: \"kubernetes.io/projected/29941561-0dd2-4fbc-a503-f42b2527e405-kube-api-access-8fwbf\") pod \"collect-profiles-29340795-t5kx5\" (UID: \"29941561-0dd2-4fbc-a503-f42b2527e405\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29340795-t5kx5" Oct 14 13:15:00.343476 master-2 kubenswrapper[4762]: I1014 13:15:00.341709 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/29941561-0dd2-4fbc-a503-f42b2527e405-config-volume\") pod \"collect-profiles-29340795-t5kx5\" (UID: \"29941561-0dd2-4fbc-a503-f42b2527e405\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29340795-t5kx5" Oct 14 13:15:00.442771 master-2 kubenswrapper[4762]: I1014 13:15:00.442721 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fwbf\" (UniqueName: 
\"kubernetes.io/projected/29941561-0dd2-4fbc-a503-f42b2527e405-kube-api-access-8fwbf\") pod \"collect-profiles-29340795-t5kx5\" (UID: \"29941561-0dd2-4fbc-a503-f42b2527e405\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29340795-t5kx5" Oct 14 13:15:00.442771 master-2 kubenswrapper[4762]: I1014 13:15:00.442792 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/29941561-0dd2-4fbc-a503-f42b2527e405-config-volume\") pod \"collect-profiles-29340795-t5kx5\" (UID: \"29941561-0dd2-4fbc-a503-f42b2527e405\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29340795-t5kx5" Oct 14 13:15:00.442771 master-2 kubenswrapper[4762]: I1014 13:15:00.442836 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/29941561-0dd2-4fbc-a503-f42b2527e405-secret-volume\") pod \"collect-profiles-29340795-t5kx5\" (UID: \"29941561-0dd2-4fbc-a503-f42b2527e405\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29340795-t5kx5" Oct 14 13:15:00.444091 master-2 kubenswrapper[4762]: I1014 13:15:00.444048 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/29941561-0dd2-4fbc-a503-f42b2527e405-config-volume\") pod \"collect-profiles-29340795-t5kx5\" (UID: \"29941561-0dd2-4fbc-a503-f42b2527e405\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29340795-t5kx5" Oct 14 13:15:00.446220 master-2 kubenswrapper[4762]: I1014 13:15:00.446184 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/29941561-0dd2-4fbc-a503-f42b2527e405-secret-volume\") pod \"collect-profiles-29340795-t5kx5\" (UID: \"29941561-0dd2-4fbc-a503-f42b2527e405\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29340795-t5kx5" Oct 14 13:15:00.472621 master-2 kubenswrapper[4762]: I1014 13:15:00.472573 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fwbf\" (UniqueName: \"kubernetes.io/projected/29941561-0dd2-4fbc-a503-f42b2527e405-kube-api-access-8fwbf\") pod \"collect-profiles-29340795-t5kx5\" (UID: \"29941561-0dd2-4fbc-a503-f42b2527e405\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29340795-t5kx5" Oct 14 13:15:00.510520 master-2 kubenswrapper[4762]: I1014 13:15:00.510445 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29340795-t5kx5" Oct 14 13:15:00.896584 master-2 kubenswrapper[4762]: I1014 13:15:00.896514 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:15:00.896584 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:15:00.896584 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:15:00.896584 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:15:00.896906 master-2 kubenswrapper[4762]: I1014 13:15:00.896593 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:15:00.933232 master-2 kubenswrapper[4762]: I1014 13:15:00.932931 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29340795-t5kx5"] Oct 14 13:15:00.945425 master-2 kubenswrapper[4762]: I1014 13:15:00.944580 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-2" event={"ID":"9041570beb5002e8da158e70e12f0c16","Type":"ContainerStarted","Data":"39786ac6e5f768ebcc690a6e3de90249c4787a641f1d1315d8f81f3458a6927d"} Oct 14 13:15:00.945425 master-2 kubenswrapper[4762]: I1014 13:15:00.944627 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-2" event={"ID":"9041570beb5002e8da158e70e12f0c16","Type":"ContainerStarted","Data":"582963d0d550051a59e2bcba881a40e767b0e4ef68aed930b180274688a29410"} Oct 14 13:15:00.945425 master-2 kubenswrapper[4762]: I1014 13:15:00.944653 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-2" event={"ID":"9041570beb5002e8da158e70e12f0c16","Type":"ContainerStarted","Data":"ce2586126d7884ea6910fe1b15e6aa80b4bc5e5c292cc36d1df8a8573ef2b88a"} Oct 14 13:15:00.945425 master-2 kubenswrapper[4762]: I1014 13:15:00.944661 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-2" event={"ID":"9041570beb5002e8da158e70e12f0c16","Type":"ContainerStarted","Data":"69e03e306b4fc0599756da1bcaf608658ab22550cdb495df2099fa070d1f1e5a"} Oct 14 13:15:01.896943 master-2 kubenswrapper[4762]: I1014 13:15:01.896848 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:15:01.896943 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:15:01.896943 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:15:01.896943 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:15:01.897422 master-2 kubenswrapper[4762]: I1014 13:15:01.896941 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:15:01.952180 master-2 kubenswrapper[4762]: I1014 13:15:01.952102 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29340795-t5kx5" event={"ID":"29941561-0dd2-4fbc-a503-f42b2527e405","Type":"ContainerStarted","Data":"fe41a9518ad517a60f5ed8aa178d9fa7552cfd0101e2f6a2321d58ab03ac48c4"} Oct 14 13:15:01.952738 master-2 kubenswrapper[4762]: I1014 13:15:01.952198 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29340795-t5kx5" event={"ID":"29941561-0dd2-4fbc-a503-f42b2527e405","Type":"ContainerStarted","Data":"71af2907d42d35ac1c963d531c5db887a65cf7f3ebd219dc8ba22db251e35a4a"} Oct 14 13:15:01.957278 master-2 kubenswrapper[4762]: I1014 13:15:01.957217 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-2" event={"ID":"9041570beb5002e8da158e70e12f0c16","Type":"ContainerStarted","Data":"ed0652c68175fc794e2ffa2234dbeed42e01255f6cb9d2ddbb282664b4f7a5ee"} Oct 14 13:15:01.957495 master-2 kubenswrapper[4762]: I1014 13:15:01.957435 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-2" Oct 14 13:15:01.976849 master-2 kubenswrapper[4762]: I1014 13:15:01.976754 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29340795-t5kx5" podStartSLOduration=1.976732124 podStartE2EDuration="1.976732124s" podCreationTimestamp="2025-10-14 13:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:15:01.975215873 +0000 UTC m=+531.219375032" watchObservedRunningTime="2025-10-14 13:15:01.976732124 +0000 UTC m=+531.220891283" Oct 14 13:15:01.997789 master-2 kubenswrapper[4762]: I1014 13:15:01.997692 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-master-2" podStartSLOduration=3.997659754 podStartE2EDuration="3.997659754s" podCreationTimestamp="2025-10-14 13:14:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:15:01.993968302 +0000 UTC m=+531.238127541" watchObservedRunningTime="2025-10-14 13:15:01.997659754 +0000 UTC m=+531.241818953" Oct 14 13:15:02.898997 master-2 kubenswrapper[4762]: I1014 13:15:02.898929 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:15:02.898997 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:15:02.898997 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:15:02.898997 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:15:02.899341 master-2 kubenswrapper[4762]: I1014 13:15:02.899015 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:15:03.609112 master-2 kubenswrapper[4762]: I1014 13:15:03.609047 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-2" Oct 14 13:15:03.609112 master-2 kubenswrapper[4762]: I1014 13:15:03.609112 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-kube-apiserver/kube-apiserver-master-2" Oct 14 13:15:03.616225 master-2 kubenswrapper[4762]: I1014 13:15:03.616176 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-master-2" Oct 14 13:15:03.857786 master-2 kubenswrapper[4762]: I1014 13:15:03.857715 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-guard-master-2" Oct 14 13:15:03.897629 master-2 kubenswrapper[4762]: I1014 13:15:03.897562 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:15:03.897629 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:15:03.897629 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:15:03.897629 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:15:03.897972 master-2 kubenswrapper[4762]: I1014 13:15:03.897642 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:15:03.969490 master-2 kubenswrapper[4762]: I1014 13:15:03.969419 4762 generic.go:334] "Generic (PLEG): container finished" podID="29941561-0dd2-4fbc-a503-f42b2527e405" containerID="fe41a9518ad517a60f5ed8aa178d9fa7552cfd0101e2f6a2321d58ab03ac48c4" exitCode=0 Oct 14 13:15:03.969731 master-2 kubenswrapper[4762]: I1014 13:15:03.969519 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29340795-t5kx5" event={"ID":"29941561-0dd2-4fbc-a503-f42b2527e405","Type":"ContainerDied","Data":"fe41a9518ad517a60f5ed8aa178d9fa7552cfd0101e2f6a2321d58ab03ac48c4"} Oct 14 13:15:03.973965 master-2 kubenswrapper[4762]: I1014 13:15:03.973913 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-2" Oct 14 13:15:04.897286 master-2 kubenswrapper[4762]: I1014 13:15:04.897186 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:15:04.897286 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:15:04.897286 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:15:04.897286 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:15:04.897286 master-2 kubenswrapper[4762]: I1014 13:15:04.897307 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:15:05.287611 master-2 kubenswrapper[4762]: I1014 13:15:05.287548 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29340795-t5kx5" Oct 14 13:15:05.405794 master-2 kubenswrapper[4762]: I1014 13:15:05.405702 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/29941561-0dd2-4fbc-a503-f42b2527e405-secret-volume\") pod \"29941561-0dd2-4fbc-a503-f42b2527e405\" (UID: \"29941561-0dd2-4fbc-a503-f42b2527e405\") " Oct 14 13:15:05.405794 master-2 kubenswrapper[4762]: I1014 13:15:05.405796 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fwbf\" (UniqueName: \"kubernetes.io/projected/29941561-0dd2-4fbc-a503-f42b2527e405-kube-api-access-8fwbf\") pod \"29941561-0dd2-4fbc-a503-f42b2527e405\" (UID: \"29941561-0dd2-4fbc-a503-f42b2527e405\") " Oct 14 13:15:05.406144 master-2 kubenswrapper[4762]: I1014 13:15:05.405916 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/29941561-0dd2-4fbc-a503-f42b2527e405-config-volume\") pod \"29941561-0dd2-4fbc-a503-f42b2527e405\" (UID: \"29941561-0dd2-4fbc-a503-f42b2527e405\") " Oct 14 13:15:05.406974 master-2 kubenswrapper[4762]: I1014 13:15:05.406875 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29941561-0dd2-4fbc-a503-f42b2527e405-config-volume" (OuterVolumeSpecName: "config-volume") pod "29941561-0dd2-4fbc-a503-f42b2527e405" (UID: "29941561-0dd2-4fbc-a503-f42b2527e405"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:15:05.410418 master-2 kubenswrapper[4762]: I1014 13:15:05.410298 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29941561-0dd2-4fbc-a503-f42b2527e405-kube-api-access-8fwbf" (OuterVolumeSpecName: "kube-api-access-8fwbf") pod "29941561-0dd2-4fbc-a503-f42b2527e405" (UID: "29941561-0dd2-4fbc-a503-f42b2527e405"). InnerVolumeSpecName "kube-api-access-8fwbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:15:05.411074 master-2 kubenswrapper[4762]: I1014 13:15:05.411003 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29941561-0dd2-4fbc-a503-f42b2527e405-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "29941561-0dd2-4fbc-a503-f42b2527e405" (UID: "29941561-0dd2-4fbc-a503-f42b2527e405"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:15:05.508183 master-2 kubenswrapper[4762]: I1014 13:15:05.507972 4762 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/29941561-0dd2-4fbc-a503-f42b2527e405-secret-volume\") on node \"master-2\" DevicePath \"\"" Oct 14 13:15:05.508183 master-2 kubenswrapper[4762]: I1014 13:15:05.508038 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fwbf\" (UniqueName: \"kubernetes.io/projected/29941561-0dd2-4fbc-a503-f42b2527e405-kube-api-access-8fwbf\") on node \"master-2\" DevicePath \"\"" Oct 14 13:15:05.508183 master-2 kubenswrapper[4762]: I1014 13:15:05.508054 4762 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/29941561-0dd2-4fbc-a503-f42b2527e405-config-volume\") on node \"master-2\" DevicePath \"\"" Oct 14 13:15:05.675907 master-2 kubenswrapper[4762]: I1014 13:15:05.675840 4762 patch_prober.go:28] interesting pod/metrics-server-8475fbcb68-8dq9n container/metrics-server namespace/openshift-monitoring: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:15:05.675907 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:15:05.675907 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:15:05.675907 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:15:05.675907 master-2 kubenswrapper[4762]: [+]metric-storage-ready ok Oct 14 13:15:05.675907 master-2 kubenswrapper[4762]: [+]metric-informer-sync ok Oct 14 13:15:05.675907 master-2 kubenswrapper[4762]: [+]metadata-informer-sync ok Oct 14 13:15:05.675907 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:15:05.675907 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:15:05.676478 master-2 kubenswrapper[4762]: I1014 13:15:05.675946 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/metrics-server-8475fbcb68-8dq9n" podUID="949ffee6-8997-4b92-84c3-4aeb1121bbe1" containerName="metrics-server" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:15:05.897520 master-2 kubenswrapper[4762]: I1014 13:15:05.897362 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:15:05.897520 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:15:05.897520 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:15:05.897520 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:15:05.898460 master-2 kubenswrapper[4762]: I1014 13:15:05.898284 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:15:05.986043 master-2 kubenswrapper[4762]: I1014 13:15:05.985923 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29340795-t5kx5" Oct 14 13:15:05.986043 master-2 kubenswrapper[4762]: I1014 13:15:05.985902 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29340795-t5kx5" event={"ID":"29941561-0dd2-4fbc-a503-f42b2527e405","Type":"ContainerDied","Data":"71af2907d42d35ac1c963d531c5db887a65cf7f3ebd219dc8ba22db251e35a4a"} Oct 14 13:15:05.986043 master-2 kubenswrapper[4762]: I1014 13:15:05.986027 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71af2907d42d35ac1c963d531c5db887a65cf7f3ebd219dc8ba22db251e35a4a" Oct 14 13:15:06.896943 master-2 kubenswrapper[4762]: I1014 13:15:06.896855 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:15:06.896943 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:15:06.896943 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:15:06.896943 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:15:06.896943 master-2 kubenswrapper[4762]: I1014 13:15:06.896927 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:15:07.897545 master-2 kubenswrapper[4762]: I1014 13:15:07.897429 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:15:07.897545 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:15:07.897545 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:15:07.897545 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:15:07.898518 master-2 kubenswrapper[4762]: I1014 13:15:07.897560 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:15:08.897761 master-2 kubenswrapper[4762]: I1014 13:15:08.897690 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:15:08.897761 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:15:08.897761 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:15:08.897761 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:15:08.898746 master-2 kubenswrapper[4762]: I1014 13:15:08.897786 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:15:09.898131 master-2 kubenswrapper[4762]: I1014 13:15:09.898023 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe 
status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:15:09.898131 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:15:09.898131 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:15:09.898131 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:15:09.898131 master-2 kubenswrapper[4762]: I1014 13:15:09.898095 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:15:10.111526 master-2 kubenswrapper[4762]: I1014 13:15:10.111447 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_machine-config-server-zxkbj_063758c3-98fe-4ac4-b1c2-beef7ef6dcdc/machine-config-server/0.log" Oct 14 13:15:10.897046 master-2 kubenswrapper[4762]: I1014 13:15:10.896941 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:15:10.897046 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:15:10.897046 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:15:10.897046 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:15:10.897046 master-2 kubenswrapper[4762]: I1014 13:15:10.897038 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:15:11.897451 master-2 kubenswrapper[4762]: I1014 13:15:11.897358 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:15:11.897451 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:15:11.897451 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:15:11.897451 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:15:11.897451 master-2 kubenswrapper[4762]: I1014 13:15:11.897435 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:15:12.897975 master-2 kubenswrapper[4762]: I1014 13:15:12.897869 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:15:12.897975 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:15:12.897975 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:15:12.897975 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:15:12.898886 master-2 kubenswrapper[4762]: I1014 13:15:12.897998 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:15:13.897635 master-2 kubenswrapper[4762]: I1014 13:15:13.897578 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:15:13.897635 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:15:13.897635 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:15:13.897635 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:15:13.897919 master-2 kubenswrapper[4762]: I1014 13:15:13.897640 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:15:14.896867 master-2 kubenswrapper[4762]: I1014 13:15:14.896777 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:15:14.896867 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:15:14.896867 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:15:14.896867 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:15:14.896867 master-2 kubenswrapper[4762]: I1014 13:15:14.896869 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:15:15.896468 master-2 kubenswrapper[4762]: I1014 13:15:15.896407 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:15:15.896468 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:15:15.896468 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:15:15.896468 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:15:15.896832 master-2 kubenswrapper[4762]: I1014 13:15:15.896470 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:15:16.897450 master-2 kubenswrapper[4762]: I1014 13:15:16.897349 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:15:16.897450 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:15:16.897450 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:15:16.897450 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:15:16.898669 master-2 kubenswrapper[4762]: I1014 13:15:16.897458 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:15:17.897641 master-2 kubenswrapper[4762]: I1014 13:15:17.897523 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:15:17.897641 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:15:17.897641 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:15:17.897641 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:15:17.898602 master-2 kubenswrapper[4762]: I1014 13:15:17.897637 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:15:18.614760 master-2 kubenswrapper[4762]: I1014 13:15:18.614694 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-2" Oct 14 13:15:18.897979 master-2 kubenswrapper[4762]: I1014 13:15:18.897773 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:15:18.897979 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:15:18.897979 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:15:18.897979 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:15:18.897979 master-2 kubenswrapper[4762]: I1014 13:15:18.897879 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:15:19.898379 master-2 kubenswrapper[4762]: I1014 13:15:19.898303 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:15:19.898379 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:15:19.898379 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:15:19.898379 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:15:19.899355 master-2 kubenswrapper[4762]: I1014 13:15:19.898392 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:15:20.897671 master-2 kubenswrapper[4762]: I1014 13:15:20.897573 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:15:20.897671 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:15:20.897671 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:15:20.897671 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:15:20.898080 master-2 kubenswrapper[4762]: 
I1014 13:15:20.897673 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:15:21.897879 master-2 kubenswrapper[4762]: I1014 13:15:21.897760 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:15:21.897879 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:15:21.897879 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:15:21.897879 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:15:21.897879 master-2 kubenswrapper[4762]: I1014 13:15:21.897848 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:15:22.897359 master-2 kubenswrapper[4762]: I1014 13:15:22.897248 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:15:22.897359 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:15:22.897359 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:15:22.897359 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:15:22.897954 master-2 kubenswrapper[4762]: I1014 13:15:22.897363 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:15:23.897888 master-2 kubenswrapper[4762]: I1014 13:15:23.897801 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:15:23.897888 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:15:23.897888 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:15:23.897888 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:15:23.898617 master-2 kubenswrapper[4762]: I1014 13:15:23.897881 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:15:24.897956 master-2 kubenswrapper[4762]: I1014 13:15:24.897828 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:15:24.897956 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:15:24.897956 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:15:24.897956 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:15:24.898906 master-2 kubenswrapper[4762]: 
I1014 13:15:24.897963 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:15:25.634769 master-2 kubenswrapper[4762]: I1014 13:15:25.634651 4762 patch_prober.go:28] interesting pod/metrics-server-8475fbcb68-8dq9n container/metrics-server namespace/openshift-monitoring: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:15:25.634769 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:15:25.634769 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:15:25.634769 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:15:25.634769 master-2 kubenswrapper[4762]: [+]metric-storage-ready ok Oct 14 13:15:25.634769 master-2 kubenswrapper[4762]: [+]metric-informer-sync ok Oct 14 13:15:25.634769 master-2 kubenswrapper[4762]: [+]metadata-informer-sync ok Oct 14 13:15:25.634769 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:15:25.634769 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:15:25.634769 master-2 kubenswrapper[4762]: I1014 13:15:25.634742 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/metrics-server-8475fbcb68-8dq9n" podUID="949ffee6-8997-4b92-84c3-4aeb1121bbe1" containerName="metrics-server" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:15:25.897140 master-2 kubenswrapper[4762]: I1014 13:15:25.897010 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:15:25.897140 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:15:25.897140 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:15:25.897140 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:15:25.897140 master-2 kubenswrapper[4762]: I1014 13:15:25.897109 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:15:26.897774 master-2 kubenswrapper[4762]: I1014 13:15:26.897637 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:15:26.897774 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:15:26.897774 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:15:26.897774 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:15:26.898913 master-2 kubenswrapper[4762]: I1014 13:15:26.897782 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:15:27.897185 master-2 kubenswrapper[4762]: I1014 13:15:27.897093 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: 
Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:15:27.897185 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:15:27.897185 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:15:27.897185 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:15:27.897586 master-2 kubenswrapper[4762]: I1014 13:15:27.897194 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:15:28.897571 master-2 kubenswrapper[4762]: I1014 13:15:28.897496 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:15:28.897571 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:15:28.897571 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:15:28.897571 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:15:28.898481 master-2 kubenswrapper[4762]: I1014 13:15:28.897609 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:15:29.897191 master-2 kubenswrapper[4762]: I1014 13:15:29.897119 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:15:29.897191 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:15:29.897191 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:15:29.897191 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:15:29.897533 master-2 kubenswrapper[4762]: I1014 13:15:29.897225 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:15:30.899646 master-2 kubenswrapper[4762]: I1014 13:15:30.899560 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:15:30.899646 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:15:30.899646 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:15:30.899646 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:15:30.899646 master-2 kubenswrapper[4762]: I1014 13:15:30.899649 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:15:31.897829 master-2 kubenswrapper[4762]: I1014 13:15:31.897790 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: 
Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:15:31.897829 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:15:31.897829 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:15:31.897829 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:15:31.898193 master-2 kubenswrapper[4762]: I1014 13:15:31.898147 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:15:32.897247 master-2 kubenswrapper[4762]: I1014 13:15:32.897192 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:15:32.897247 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:15:32.897247 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:15:32.897247 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:15:32.897800 master-2 kubenswrapper[4762]: I1014 13:15:32.897269 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:15:32.904369 master-2 kubenswrapper[4762]: I1014 13:15:32.904321 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/installer-5-master-2"] Oct 14 13:15:32.904601 master-2 kubenswrapper[4762]: E1014 13:15:32.904569 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29941561-0dd2-4fbc-a503-f42b2527e405" containerName="collect-profiles" Oct 14 13:15:32.904601 master-2 kubenswrapper[4762]: I1014 13:15:32.904598 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="29941561-0dd2-4fbc-a503-f42b2527e405" containerName="collect-profiles" Oct 14 13:15:32.904757 master-2 kubenswrapper[4762]: I1014 13:15:32.904734 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="29941561-0dd2-4fbc-a503-f42b2527e405" containerName="collect-profiles" Oct 14 13:15:32.905221 master-2 kubenswrapper[4762]: I1014 13:15:32.905194 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-5-master-2" Oct 14 13:15:32.909212 master-2 kubenswrapper[4762]: I1014 13:15:32.909119 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd"/"installer-sa-dockercfg-xbs2c" Oct 14 13:15:32.918402 master-2 kubenswrapper[4762]: I1014 13:15:32.918319 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-5-master-2"] Oct 14 13:15:33.053927 master-2 kubenswrapper[4762]: I1014 13:15:33.053836 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/08b313e4-ea57-4f9c-ad72-1f640ef21c52-kube-api-access\") pod \"installer-5-master-2\" (UID: \"08b313e4-ea57-4f9c-ad72-1f640ef21c52\") " pod="openshift-etcd/installer-5-master-2" Oct 14 13:15:33.054269 master-2 kubenswrapper[4762]: I1014 13:15:33.054084 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/08b313e4-ea57-4f9c-ad72-1f640ef21c52-var-lock\") pod \"installer-5-master-2\" (UID: \"08b313e4-ea57-4f9c-ad72-1f640ef21c52\") " pod="openshift-etcd/installer-5-master-2" Oct 14 13:15:33.054269 master-2 kubenswrapper[4762]: I1014 13:15:33.054125 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/08b313e4-ea57-4f9c-ad72-1f640ef21c52-kubelet-dir\") pod \"installer-5-master-2\" (UID: \"08b313e4-ea57-4f9c-ad72-1f640ef21c52\") " pod="openshift-etcd/installer-5-master-2" Oct 14 13:15:33.155461 master-2 kubenswrapper[4762]: I1014 13:15:33.155202 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/08b313e4-ea57-4f9c-ad72-1f640ef21c52-var-lock\") pod \"installer-5-master-2\" (UID: \"08b313e4-ea57-4f9c-ad72-1f640ef21c52\") " pod="openshift-etcd/installer-5-master-2" Oct 14 13:15:33.155461 master-2 kubenswrapper[4762]: I1014 13:15:33.155276 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/08b313e4-ea57-4f9c-ad72-1f640ef21c52-kubelet-dir\") pod \"installer-5-master-2\" (UID: \"08b313e4-ea57-4f9c-ad72-1f640ef21c52\") " pod="openshift-etcd/installer-5-master-2" Oct 14 13:15:33.155461 master-2 kubenswrapper[4762]: I1014 13:15:33.155337 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/08b313e4-ea57-4f9c-ad72-1f640ef21c52-kube-api-access\") pod \"installer-5-master-2\" (UID: \"08b313e4-ea57-4f9c-ad72-1f640ef21c52\") " pod="openshift-etcd/installer-5-master-2" Oct 14 13:15:33.155461 master-2 kubenswrapper[4762]: I1014 13:15:33.155346 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/08b313e4-ea57-4f9c-ad72-1f640ef21c52-var-lock\") pod \"installer-5-master-2\" (UID: \"08b313e4-ea57-4f9c-ad72-1f640ef21c52\") " pod="openshift-etcd/installer-5-master-2" Oct 14 13:15:33.155947 master-2 kubenswrapper[4762]: I1014 13:15:33.155503 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/08b313e4-ea57-4f9c-ad72-1f640ef21c52-kubelet-dir\") pod \"installer-5-master-2\" (UID: \"08b313e4-ea57-4f9c-ad72-1f640ef21c52\") " pod="openshift-etcd/installer-5-master-2" Oct 14 
13:15:33.184784 master-2 kubenswrapper[4762]: I1014 13:15:33.184709 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/08b313e4-ea57-4f9c-ad72-1f640ef21c52-kube-api-access\") pod \"installer-5-master-2\" (UID: \"08b313e4-ea57-4f9c-ad72-1f640ef21c52\") " pod="openshift-etcd/installer-5-master-2" Oct 14 13:15:33.221515 master-2 kubenswrapper[4762]: I1014 13:15:33.221389 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-5-master-2" Oct 14 13:15:33.694602 master-2 kubenswrapper[4762]: I1014 13:15:33.694514 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-5-master-2"] Oct 14 13:15:33.896391 master-2 kubenswrapper[4762]: I1014 13:15:33.896276 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:15:33.896391 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:15:33.896391 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:15:33.896391 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:15:33.896391 master-2 kubenswrapper[4762]: I1014 13:15:33.896383 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:15:34.161207 master-2 kubenswrapper[4762]: I1014 13:15:34.161118 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-5-master-2" event={"ID":"08b313e4-ea57-4f9c-ad72-1f640ef21c52","Type":"ContainerStarted","Data":"d7d08b38f7e0af5214dbb27e0ea5e13ef41ff6f6d36bc1e8272e35547c0ac516"} Oct 14 13:15:34.161207 master-2 kubenswrapper[4762]: I1014 13:15:34.161188 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-5-master-2" event={"ID":"08b313e4-ea57-4f9c-ad72-1f640ef21c52","Type":"ContainerStarted","Data":"0f8d69fafa1a065a947cd683b9b7fd0885809880410ea1e61811e6730c8bbe87"} Oct 14 13:15:34.181379 master-2 kubenswrapper[4762]: I1014 13:15:34.181260 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/installer-5-master-2" podStartSLOduration=2.181241836 podStartE2EDuration="2.181241836s" podCreationTimestamp="2025-10-14 13:15:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:15:34.178481745 +0000 UTC m=+563.422640934" watchObservedRunningTime="2025-10-14 13:15:34.181241836 +0000 UTC m=+563.425400995" Oct 14 13:15:34.897247 master-2 kubenswrapper[4762]: I1014 13:15:34.897144 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:15:34.897247 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:15:34.897247 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:15:34.897247 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:15:34.897687 master-2 kubenswrapper[4762]: I1014 13:15:34.897277 4762 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:15:35.897716 master-2 kubenswrapper[4762]: I1014 13:15:35.897603 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:15:35.897716 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:15:35.897716 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:15:35.897716 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:15:35.898725 master-2 kubenswrapper[4762]: I1014 13:15:35.897713 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:15:36.898627 master-2 kubenswrapper[4762]: I1014 13:15:36.898542 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:15:36.898627 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:15:36.898627 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:15:36.898627 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:15:36.899303 master-2 kubenswrapper[4762]: I1014 13:15:36.898665 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:15:37.897970 master-2 kubenswrapper[4762]: I1014 13:15:37.897895 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:15:37.897970 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:15:37.897970 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:15:37.897970 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:15:37.898315 master-2 kubenswrapper[4762]: I1014 13:15:37.897991 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:15:38.896681 master-2 kubenswrapper[4762]: I1014 13:15:38.896610 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:15:38.896681 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:15:38.896681 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:15:38.896681 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:15:38.896681 master-2 kubenswrapper[4762]: I1014 13:15:38.896671 4762 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:15:39.897319 master-2 kubenswrapper[4762]: I1014 13:15:39.897240 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:15:39.897319 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:15:39.897319 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:15:39.897319 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:15:39.898058 master-2 kubenswrapper[4762]: I1014 13:15:39.897339 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:15:40.120680 master-2 kubenswrapper[4762]: I1014 13:15:40.120623 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_machine-config-server-zxkbj_063758c3-98fe-4ac4-b1c2-beef7ef6dcdc/machine-config-server/0.log" Oct 14 13:15:40.897236 master-2 kubenswrapper[4762]: I1014 13:15:40.897166 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:15:40.897236 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:15:40.897236 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:15:40.897236 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:15:40.897846 master-2 kubenswrapper[4762]: I1014 13:15:40.897246 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:15:41.897930 master-2 kubenswrapper[4762]: I1014 13:15:41.897841 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:15:41.897930 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:15:41.897930 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:15:41.897930 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:15:41.899006 master-2 kubenswrapper[4762]: I1014 13:15:41.897938 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:15:42.897625 master-2 kubenswrapper[4762]: I1014 13:15:42.897515 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:15:42.897625 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 
13:15:42.897625 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:15:42.897625 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:15:42.898818 master-2 kubenswrapper[4762]: I1014 13:15:42.897671 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:15:43.897264 master-2 kubenswrapper[4762]: I1014 13:15:43.897093 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:15:43.897264 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:15:43.897264 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:15:43.897264 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:15:43.897264 master-2 kubenswrapper[4762]: I1014 13:15:43.897226 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:15:44.897255 master-2 kubenswrapper[4762]: I1014 13:15:44.897151 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:15:44.897255 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:15:44.897255 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:15:44.897255 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:15:44.897814 master-2 kubenswrapper[4762]: I1014 13:15:44.897294 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:15:45.634769 master-2 kubenswrapper[4762]: I1014 13:15:45.634679 4762 patch_prober.go:28] interesting pod/metrics-server-8475fbcb68-8dq9n container/metrics-server namespace/openshift-monitoring: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:15:45.634769 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:15:45.634769 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:15:45.634769 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:15:45.634769 master-2 kubenswrapper[4762]: [+]metric-storage-ready ok Oct 14 13:15:45.634769 master-2 kubenswrapper[4762]: [+]metric-informer-sync ok Oct 14 13:15:45.634769 master-2 kubenswrapper[4762]: [+]metadata-informer-sync ok Oct 14 13:15:45.634769 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:15:45.634769 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:15:45.635911 master-2 kubenswrapper[4762]: I1014 13:15:45.634783 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/metrics-server-8475fbcb68-8dq9n" podUID="949ffee6-8997-4b92-84c3-4aeb1121bbe1" containerName="metrics-server" probeResult="failure" output="HTTP probe failed with statuscode: 
500" Oct 14 13:15:45.897481 master-2 kubenswrapper[4762]: I1014 13:15:45.897336 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:15:45.897481 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:15:45.897481 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:15:45.897481 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:15:45.897481 master-2 kubenswrapper[4762]: I1014 13:15:45.897430 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:15:46.897463 master-2 kubenswrapper[4762]: I1014 13:15:46.897375 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:15:46.897463 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:15:46.897463 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:15:46.897463 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:15:46.898414 master-2 kubenswrapper[4762]: I1014 13:15:46.897482 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:15:47.897845 master-2 kubenswrapper[4762]: I1014 13:15:47.897768 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:15:47.897845 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:15:47.897845 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:15:47.897845 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:15:47.898796 master-2 kubenswrapper[4762]: I1014 13:15:47.897867 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:15:48.897526 master-2 kubenswrapper[4762]: I1014 13:15:48.897450 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:15:48.897526 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:15:48.897526 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:15:48.897526 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:15:48.898550 master-2 kubenswrapper[4762]: I1014 13:15:48.897539 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 
500" Oct 14 13:15:49.898443 master-2 kubenswrapper[4762]: I1014 13:15:49.898354 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:15:49.898443 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:15:49.898443 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:15:49.898443 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:15:49.899585 master-2 kubenswrapper[4762]: I1014 13:15:49.898460 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:15:50.897233 master-2 kubenswrapper[4762]: I1014 13:15:50.897142 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:15:50.897233 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:15:50.897233 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:15:50.897233 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:15:50.897866 master-2 kubenswrapper[4762]: I1014 13:15:50.897244 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:15:51.898040 master-2 kubenswrapper[4762]: I1014 13:15:51.897922 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:15:51.898040 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:15:51.898040 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:15:51.898040 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:15:51.898040 master-2 kubenswrapper[4762]: I1014 13:15:51.898016 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:15:52.897661 master-2 kubenswrapper[4762]: I1014 13:15:52.897574 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:15:52.897661 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:15:52.897661 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:15:52.897661 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:15:52.898112 master-2 kubenswrapper[4762]: I1014 13:15:52.897679 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 
500" Oct 14 13:15:53.897492 master-2 kubenswrapper[4762]: I1014 13:15:53.897438 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:15:53.897492 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:15:53.897492 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:15:53.897492 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:15:53.898025 master-2 kubenswrapper[4762]: I1014 13:15:53.897982 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:15:54.898091 master-2 kubenswrapper[4762]: I1014 13:15:54.898006 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:15:54.898091 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:15:54.898091 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:15:54.898091 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:15:54.899285 master-2 kubenswrapper[4762]: I1014 13:15:54.898099 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:15:55.897071 master-2 kubenswrapper[4762]: I1014 13:15:55.897007 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:15:55.897071 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:15:55.897071 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:15:55.897071 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:15:55.897433 master-2 kubenswrapper[4762]: I1014 13:15:55.897078 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:15:56.898044 master-2 kubenswrapper[4762]: I1014 13:15:56.897970 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:15:56.898044 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:15:56.898044 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:15:56.898044 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:15:56.899313 master-2 kubenswrapper[4762]: I1014 13:15:56.899086 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 
500" Oct 14 13:15:57.901147 master-2 kubenswrapper[4762]: I1014 13:15:57.901062 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:15:57.901147 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:15:57.901147 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:15:57.901147 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:15:57.901844 master-2 kubenswrapper[4762]: I1014 13:15:57.901187 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:15:58.897425 master-2 kubenswrapper[4762]: I1014 13:15:58.897337 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:15:58.897425 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:15:58.897425 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:15:58.897425 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:15:58.897917 master-2 kubenswrapper[4762]: I1014 13:15:58.897490 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:15:59.897548 master-2 kubenswrapper[4762]: I1014 13:15:59.897420 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:15:59.897548 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:15:59.897548 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:15:59.897548 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:15:59.897548 master-2 kubenswrapper[4762]: I1014 13:15:59.897528 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:16:00.897545 master-2 kubenswrapper[4762]: I1014 13:16:00.897449 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:16:00.897545 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:16:00.897545 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:16:00.897545 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:16:00.897545 master-2 kubenswrapper[4762]: I1014 13:16:00.897554 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 
500" Oct 14 13:16:01.897768 master-2 kubenswrapper[4762]: I1014 13:16:01.897657 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:16:01.897768 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:16:01.897768 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:16:01.897768 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:16:01.898731 master-2 kubenswrapper[4762]: I1014 13:16:01.897775 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:16:02.897863 master-2 kubenswrapper[4762]: I1014 13:16:02.897780 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:16:02.897863 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:16:02.897863 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:16:02.897863 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:16:02.899141 master-2 kubenswrapper[4762]: I1014 13:16:02.897886 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:16:03.897533 master-2 kubenswrapper[4762]: I1014 13:16:03.897451 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:16:03.897533 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:16:03.897533 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:16:03.897533 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:16:03.897895 master-2 kubenswrapper[4762]: I1014 13:16:03.897556 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:16:04.897814 master-2 kubenswrapper[4762]: I1014 13:16:04.897673 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:16:04.897814 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:16:04.897814 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:16:04.897814 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:16:04.897814 master-2 kubenswrapper[4762]: I1014 13:16:04.897791 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 
500" Oct 14 13:16:05.635801 master-2 kubenswrapper[4762]: I1014 13:16:05.635687 4762 patch_prober.go:28] interesting pod/metrics-server-8475fbcb68-8dq9n container/metrics-server namespace/openshift-monitoring: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:16:05.635801 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:16:05.635801 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:16:05.635801 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:16:05.635801 master-2 kubenswrapper[4762]: [+]metric-storage-ready ok Oct 14 13:16:05.635801 master-2 kubenswrapper[4762]: [+]metric-informer-sync ok Oct 14 13:16:05.635801 master-2 kubenswrapper[4762]: [+]metadata-informer-sync ok Oct 14 13:16:05.635801 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:16:05.635801 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:16:05.635801 master-2 kubenswrapper[4762]: I1014 13:16:05.635785 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/metrics-server-8475fbcb68-8dq9n" podUID="949ffee6-8997-4b92-84c3-4aeb1121bbe1" containerName="metrics-server" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:16:05.773932 master-2 kubenswrapper[4762]: E1014 13:16:05.773877 4762 file.go:109] "Unable to process watch event" err="can't process config file \"/etc/kubernetes/manifests/etcd-pod.yaml\": /etc/kubernetes/manifests/etcd-pod.yaml: couldn't parse as pod(Object 'Kind' is missing in 'null'), please check config file" Oct 14 13:16:05.774578 master-2 kubenswrapper[4762]: I1014 13:16:05.774518 4762 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-etcd/etcd-master-2"] Oct 14 13:16:05.775418 master-2 kubenswrapper[4762]: I1014 13:16:05.775369 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-2" podUID="c492168afa20f49cb6e3534e1871011b" containerName="etcdctl" containerID="cri-o://b2027546f6685cc4b4558ae050637f0978f42f16a38ca2c9891560a82362bb4e" gracePeriod=30 Oct 14 13:16:05.775609 master-2 kubenswrapper[4762]: I1014 13:16:05.775371 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-2" podUID="c492168afa20f49cb6e3534e1871011b" containerName="etcd" containerID="cri-o://e17495a82cef8a0676e553bdfafdc70409c48a143d0e5a81d56563bbbf2398de" gracePeriod=30 Oct 14 13:16:05.775737 master-2 kubenswrapper[4762]: I1014 13:16:05.775427 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-2" podUID="c492168afa20f49cb6e3534e1871011b" containerName="etcd-readyz" containerID="cri-o://044edff9cb383dac0f2c295e942c0b56543f1ba69662c248a25039887f1d82e6" gracePeriod=30 Oct 14 13:16:05.775790 master-2 kubenswrapper[4762]: I1014 13:16:05.775445 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-2" podUID="c492168afa20f49cb6e3534e1871011b" containerName="etcd-rev" containerID="cri-o://0a151f0bd5c6b606b6c89b6fbad470522251712b8d760adb739567543b3f9546" gracePeriod=30 Oct 14 13:16:05.775790 master-2 kubenswrapper[4762]: I1014 13:16:05.775458 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-2" podUID="c492168afa20f49cb6e3534e1871011b" containerName="etcd-metrics" 
containerID="cri-o://930613d268dd89eef6d12b249532ff59c399cd2934bfb107eedf634a2b26d951" gracePeriod=30 Oct 14 13:16:05.778306 master-2 kubenswrapper[4762]: I1014 13:16:05.778207 4762 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-master-2"] Oct 14 13:16:05.778628 master-2 kubenswrapper[4762]: E1014 13:16:05.778585 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c492168afa20f49cb6e3534e1871011b" containerName="etcd-resources-copy" Oct 14 13:16:05.778628 master-2 kubenswrapper[4762]: I1014 13:16:05.778610 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="c492168afa20f49cb6e3534e1871011b" containerName="etcd-resources-copy" Oct 14 13:16:05.778628 master-2 kubenswrapper[4762]: E1014 13:16:05.778622 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c492168afa20f49cb6e3534e1871011b" containerName="etcd-ensure-env-vars" Oct 14 13:16:05.778820 master-2 kubenswrapper[4762]: I1014 13:16:05.778630 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="c492168afa20f49cb6e3534e1871011b" containerName="etcd-ensure-env-vars" Oct 14 13:16:05.778820 master-2 kubenswrapper[4762]: E1014 13:16:05.778669 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c492168afa20f49cb6e3534e1871011b" containerName="etcdctl" Oct 14 13:16:05.778820 master-2 kubenswrapper[4762]: I1014 13:16:05.778677 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="c492168afa20f49cb6e3534e1871011b" containerName="etcdctl" Oct 14 13:16:05.778820 master-2 kubenswrapper[4762]: E1014 13:16:05.778684 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c492168afa20f49cb6e3534e1871011b" containerName="setup" Oct 14 13:16:05.778820 master-2 kubenswrapper[4762]: I1014 13:16:05.778691 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="c492168afa20f49cb6e3534e1871011b" containerName="setup" Oct 14 13:16:05.778820 master-2 kubenswrapper[4762]: E1014 13:16:05.778699 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c492168afa20f49cb6e3534e1871011b" containerName="etcd-metrics" Oct 14 13:16:05.778820 master-2 kubenswrapper[4762]: I1014 13:16:05.778705 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="c492168afa20f49cb6e3534e1871011b" containerName="etcd-metrics" Oct 14 13:16:05.778820 master-2 kubenswrapper[4762]: E1014 13:16:05.778729 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c492168afa20f49cb6e3534e1871011b" containerName="etcd" Oct 14 13:16:05.778820 master-2 kubenswrapper[4762]: I1014 13:16:05.778739 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="c492168afa20f49cb6e3534e1871011b" containerName="etcd" Oct 14 13:16:05.778820 master-2 kubenswrapper[4762]: E1014 13:16:05.778750 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c492168afa20f49cb6e3534e1871011b" containerName="etcd" Oct 14 13:16:05.778820 master-2 kubenswrapper[4762]: I1014 13:16:05.778755 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="c492168afa20f49cb6e3534e1871011b" containerName="etcd" Oct 14 13:16:05.778820 master-2 kubenswrapper[4762]: E1014 13:16:05.778765 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c492168afa20f49cb6e3534e1871011b" containerName="etcd" Oct 14 13:16:05.778820 master-2 kubenswrapper[4762]: I1014 13:16:05.778771 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="c492168afa20f49cb6e3534e1871011b" containerName="etcd" Oct 14 13:16:05.778820 master-2 kubenswrapper[4762]: E1014 13:16:05.778778 
4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c492168afa20f49cb6e3534e1871011b" containerName="etcd-readyz" Oct 14 13:16:05.778820 master-2 kubenswrapper[4762]: I1014 13:16:05.778784 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="c492168afa20f49cb6e3534e1871011b" containerName="etcd-readyz" Oct 14 13:16:05.778820 master-2 kubenswrapper[4762]: E1014 13:16:05.778809 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c492168afa20f49cb6e3534e1871011b" containerName="etcd-rev" Oct 14 13:16:05.778820 master-2 kubenswrapper[4762]: I1014 13:16:05.778816 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="c492168afa20f49cb6e3534e1871011b" containerName="etcd-rev" Oct 14 13:16:05.780684 master-2 kubenswrapper[4762]: I1014 13:16:05.778927 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="c492168afa20f49cb6e3534e1871011b" containerName="etcd" Oct 14 13:16:05.780684 master-2 kubenswrapper[4762]: I1014 13:16:05.778939 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="c492168afa20f49cb6e3534e1871011b" containerName="etcdctl" Oct 14 13:16:05.780684 master-2 kubenswrapper[4762]: I1014 13:16:05.778963 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="c492168afa20f49cb6e3534e1871011b" containerName="etcd-readyz" Oct 14 13:16:05.780684 master-2 kubenswrapper[4762]: I1014 13:16:05.778973 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="c492168afa20f49cb6e3534e1871011b" containerName="etcd-rev" Oct 14 13:16:05.780684 master-2 kubenswrapper[4762]: I1014 13:16:05.778983 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="c492168afa20f49cb6e3534e1871011b" containerName="etcd-metrics" Oct 14 13:16:05.780684 master-2 kubenswrapper[4762]: I1014 13:16:05.778992 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="c492168afa20f49cb6e3534e1871011b" containerName="etcd" Oct 14 13:16:05.780684 master-2 kubenswrapper[4762]: I1014 13:16:05.779218 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="c492168afa20f49cb6e3534e1871011b" containerName="etcd" Oct 14 13:16:05.898143 master-2 kubenswrapper[4762]: I1014 13:16:05.898058 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2c4a583adfee975da84510940117e71a-log-dir\") pod \"etcd-master-2\" (UID: \"2c4a583adfee975da84510940117e71a\") " pod="openshift-etcd/etcd-master-2" Oct 14 13:16:05.898330 master-2 kubenswrapper[4762]: I1014 13:16:05.898236 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:16:05.898330 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:16:05.898330 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:16:05.898330 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:16:05.899025 master-2 kubenswrapper[4762]: I1014 13:16:05.898251 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2c4a583adfee975da84510940117e71a-usr-local-bin\") pod \"etcd-master-2\" (UID: \"2c4a583adfee975da84510940117e71a\") " pod="openshift-etcd/etcd-master-2" Oct 14 13:16:05.899025 master-2 kubenswrapper[4762]: I1014 13:16:05.898346 4762 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:16:05.899025 master-2 kubenswrapper[4762]: I1014 13:16:05.898408 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2c4a583adfee975da84510940117e71a-data-dir\") pod \"etcd-master-2\" (UID: \"2c4a583adfee975da84510940117e71a\") " pod="openshift-etcd/etcd-master-2" Oct 14 13:16:05.899749 master-2 kubenswrapper[4762]: I1014 13:16:05.899435 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2c4a583adfee975da84510940117e71a-cert-dir\") pod \"etcd-master-2\" (UID: \"2c4a583adfee975da84510940117e71a\") " pod="openshift-etcd/etcd-master-2" Oct 14 13:16:05.899749 master-2 kubenswrapper[4762]: I1014 13:16:05.899642 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2c4a583adfee975da84510940117e71a-resource-dir\") pod \"etcd-master-2\" (UID: \"2c4a583adfee975da84510940117e71a\") " pod="openshift-etcd/etcd-master-2" Oct 14 13:16:05.899749 master-2 kubenswrapper[4762]: I1014 13:16:05.899703 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2c4a583adfee975da84510940117e71a-static-pod-dir\") pod \"etcd-master-2\" (UID: \"2c4a583adfee975da84510940117e71a\") " pod="openshift-etcd/etcd-master-2" Oct 14 13:16:06.001412 master-2 kubenswrapper[4762]: I1014 13:16:06.001334 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2c4a583adfee975da84510940117e71a-log-dir\") pod \"etcd-master-2\" (UID: \"2c4a583adfee975da84510940117e71a\") " pod="openshift-etcd/etcd-master-2" Oct 14 13:16:06.001587 master-2 kubenswrapper[4762]: I1014 13:16:06.001420 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2c4a583adfee975da84510940117e71a-usr-local-bin\") pod \"etcd-master-2\" (UID: \"2c4a583adfee975da84510940117e71a\") " pod="openshift-etcd/etcd-master-2" Oct 14 13:16:06.001587 master-2 kubenswrapper[4762]: I1014 13:16:06.001469 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2c4a583adfee975da84510940117e71a-data-dir\") pod \"etcd-master-2\" (UID: \"2c4a583adfee975da84510940117e71a\") " pod="openshift-etcd/etcd-master-2" Oct 14 13:16:06.001587 master-2 kubenswrapper[4762]: I1014 13:16:06.001539 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2c4a583adfee975da84510940117e71a-cert-dir\") pod \"etcd-master-2\" (UID: \"2c4a583adfee975da84510940117e71a\") " pod="openshift-etcd/etcd-master-2" Oct 14 13:16:06.001715 master-2 kubenswrapper[4762]: I1014 13:16:06.001587 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2c4a583adfee975da84510940117e71a-resource-dir\") pod \"etcd-master-2\" (UID: \"2c4a583adfee975da84510940117e71a\") " 
pod="openshift-etcd/etcd-master-2" Oct 14 13:16:06.001715 master-2 kubenswrapper[4762]: I1014 13:16:06.001624 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2c4a583adfee975da84510940117e71a-static-pod-dir\") pod \"etcd-master-2\" (UID: \"2c4a583adfee975da84510940117e71a\") " pod="openshift-etcd/etcd-master-2" Oct 14 13:16:06.001792 master-2 kubenswrapper[4762]: I1014 13:16:06.001749 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2c4a583adfee975da84510940117e71a-static-pod-dir\") pod \"etcd-master-2\" (UID: \"2c4a583adfee975da84510940117e71a\") " pod="openshift-etcd/etcd-master-2" Oct 14 13:16:06.001835 master-2 kubenswrapper[4762]: I1014 13:16:06.001817 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2c4a583adfee975da84510940117e71a-log-dir\") pod \"etcd-master-2\" (UID: \"2c4a583adfee975da84510940117e71a\") " pod="openshift-etcd/etcd-master-2" Oct 14 13:16:06.001881 master-2 kubenswrapper[4762]: I1014 13:16:06.001860 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2c4a583adfee975da84510940117e71a-usr-local-bin\") pod \"etcd-master-2\" (UID: \"2c4a583adfee975da84510940117e71a\") " pod="openshift-etcd/etcd-master-2" Oct 14 13:16:06.001924 master-2 kubenswrapper[4762]: I1014 13:16:06.001900 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2c4a583adfee975da84510940117e71a-data-dir\") pod \"etcd-master-2\" (UID: \"2c4a583adfee975da84510940117e71a\") " pod="openshift-etcd/etcd-master-2" Oct 14 13:16:06.001968 master-2 kubenswrapper[4762]: I1014 13:16:06.001940 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2c4a583adfee975da84510940117e71a-cert-dir\") pod \"etcd-master-2\" (UID: \"2c4a583adfee975da84510940117e71a\") " pod="openshift-etcd/etcd-master-2" Oct 14 13:16:06.002010 master-2 kubenswrapper[4762]: I1014 13:16:06.001981 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2c4a583adfee975da84510940117e71a-resource-dir\") pod \"etcd-master-2\" (UID: \"2c4a583adfee975da84510940117e71a\") " pod="openshift-etcd/etcd-master-2" Oct 14 13:16:06.382874 master-2 kubenswrapper[4762]: I1014 13:16:06.382826 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-2_c492168afa20f49cb6e3534e1871011b/etcd/1.log" Oct 14 13:16:06.384386 master-2 kubenswrapper[4762]: I1014 13:16:06.384204 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-2_c492168afa20f49cb6e3534e1871011b/etcd-rev/0.log" Oct 14 13:16:06.386010 master-2 kubenswrapper[4762]: I1014 13:16:06.385959 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-2_c492168afa20f49cb6e3534e1871011b/etcd-metrics/0.log" Oct 14 13:16:06.388396 master-2 kubenswrapper[4762]: I1014 13:16:06.388343 4762 generic.go:334] "Generic (PLEG): container finished" podID="c492168afa20f49cb6e3534e1871011b" containerID="0a151f0bd5c6b606b6c89b6fbad470522251712b8d760adb739567543b3f9546" exitCode=2 Oct 14 13:16:06.388396 master-2 kubenswrapper[4762]: I1014 13:16:06.388384 4762 generic.go:334] "Generic (PLEG): 
container finished" podID="c492168afa20f49cb6e3534e1871011b" containerID="044edff9cb383dac0f2c295e942c0b56543f1ba69662c248a25039887f1d82e6" exitCode=0 Oct 14 13:16:06.388591 master-2 kubenswrapper[4762]: I1014 13:16:06.388406 4762 generic.go:334] "Generic (PLEG): container finished" podID="c492168afa20f49cb6e3534e1871011b" containerID="930613d268dd89eef6d12b249532ff59c399cd2934bfb107eedf634a2b26d951" exitCode=2 Oct 14 13:16:06.391225 master-2 kubenswrapper[4762]: I1014 13:16:06.391119 4762 generic.go:334] "Generic (PLEG): container finished" podID="08b313e4-ea57-4f9c-ad72-1f640ef21c52" containerID="d7d08b38f7e0af5214dbb27e0ea5e13ef41ff6f6d36bc1e8272e35547c0ac516" exitCode=0 Oct 14 13:16:06.391225 master-2 kubenswrapper[4762]: I1014 13:16:06.391193 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-5-master-2" event={"ID":"08b313e4-ea57-4f9c-ad72-1f640ef21c52","Type":"ContainerDied","Data":"d7d08b38f7e0af5214dbb27e0ea5e13ef41ff6f6d36bc1e8272e35547c0ac516"} Oct 14 13:16:06.403621 master-2 kubenswrapper[4762]: I1014 13:16:06.403526 4762 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-etcd/etcd-master-2" oldPodUID="c492168afa20f49cb6e3534e1871011b" podUID="2c4a583adfee975da84510940117e71a" Oct 14 13:16:06.897479 master-2 kubenswrapper[4762]: I1014 13:16:06.897388 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:16:06.897479 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:16:06.897479 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:16:06.897479 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:16:06.897835 master-2 kubenswrapper[4762]: I1014 13:16:06.897488 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:16:07.770245 master-2 kubenswrapper[4762]: I1014 13:16:07.770184 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-5-master-2" Oct 14 13:16:07.897282 master-2 kubenswrapper[4762]: I1014 13:16:07.897209 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:16:07.897282 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:16:07.897282 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:16:07.897282 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:16:07.897827 master-2 kubenswrapper[4762]: I1014 13:16:07.897315 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:16:07.928967 master-2 kubenswrapper[4762]: I1014 13:16:07.928853 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/08b313e4-ea57-4f9c-ad72-1f640ef21c52-kubelet-dir\") pod \"08b313e4-ea57-4f9c-ad72-1f640ef21c52\" (UID: \"08b313e4-ea57-4f9c-ad72-1f640ef21c52\") " Oct 14 13:16:07.928967 master-2 kubenswrapper[4762]: I1014 13:16:07.928936 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/08b313e4-ea57-4f9c-ad72-1f640ef21c52-kube-api-access\") pod \"08b313e4-ea57-4f9c-ad72-1f640ef21c52\" (UID: \"08b313e4-ea57-4f9c-ad72-1f640ef21c52\") " Oct 14 13:16:07.929405 master-2 kubenswrapper[4762]: I1014 13:16:07.928975 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08b313e4-ea57-4f9c-ad72-1f640ef21c52-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "08b313e4-ea57-4f9c-ad72-1f640ef21c52" (UID: "08b313e4-ea57-4f9c-ad72-1f640ef21c52"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:16:07.929405 master-2 kubenswrapper[4762]: I1014 13:16:07.929040 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/08b313e4-ea57-4f9c-ad72-1f640ef21c52-var-lock\") pod \"08b313e4-ea57-4f9c-ad72-1f640ef21c52\" (UID: \"08b313e4-ea57-4f9c-ad72-1f640ef21c52\") " Oct 14 13:16:07.929405 master-2 kubenswrapper[4762]: I1014 13:16:07.929324 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08b313e4-ea57-4f9c-ad72-1f640ef21c52-var-lock" (OuterVolumeSpecName: "var-lock") pod "08b313e4-ea57-4f9c-ad72-1f640ef21c52" (UID: "08b313e4-ea57-4f9c-ad72-1f640ef21c52"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:16:07.929405 master-2 kubenswrapper[4762]: I1014 13:16:07.929384 4762 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/08b313e4-ea57-4f9c-ad72-1f640ef21c52-kubelet-dir\") on node \"master-2\" DevicePath \"\"" Oct 14 13:16:07.932975 master-2 kubenswrapper[4762]: I1014 13:16:07.932926 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08b313e4-ea57-4f9c-ad72-1f640ef21c52-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "08b313e4-ea57-4f9c-ad72-1f640ef21c52" (UID: "08b313e4-ea57-4f9c-ad72-1f640ef21c52"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:16:08.030542 master-2 kubenswrapper[4762]: I1014 13:16:08.030347 4762 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/08b313e4-ea57-4f9c-ad72-1f640ef21c52-var-lock\") on node \"master-2\" DevicePath \"\"" Oct 14 13:16:08.030542 master-2 kubenswrapper[4762]: I1014 13:16:08.030413 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/08b313e4-ea57-4f9c-ad72-1f640ef21c52-kube-api-access\") on node \"master-2\" DevicePath \"\"" Oct 14 13:16:08.404247 master-2 kubenswrapper[4762]: I1014 13:16:08.403970 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-5-master-2" event={"ID":"08b313e4-ea57-4f9c-ad72-1f640ef21c52","Type":"ContainerDied","Data":"0f8d69fafa1a065a947cd683b9b7fd0885809880410ea1e61811e6730c8bbe87"} Oct 14 13:16:08.404247 master-2 kubenswrapper[4762]: I1014 13:16:08.404022 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f8d69fafa1a065a947cd683b9b7fd0885809880410ea1e61811e6730c8bbe87" Oct 14 13:16:08.404984 master-2 kubenswrapper[4762]: I1014 13:16:08.404951 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-5-master-2" Oct 14 13:16:08.897945 master-2 kubenswrapper[4762]: I1014 13:16:08.897862 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:16:08.897945 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:16:08.897945 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:16:08.897945 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:16:08.898921 master-2 kubenswrapper[4762]: I1014 13:16:08.897949 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:16:09.897797 master-2 kubenswrapper[4762]: I1014 13:16:09.897699 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:16:09.897797 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:16:09.897797 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:16:09.897797 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:16:09.898396 master-2 kubenswrapper[4762]: I1014 13:16:09.897842 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:16:10.118007 master-2 kubenswrapper[4762]: I1014 13:16:10.117947 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_machine-config-server-zxkbj_063758c3-98fe-4ac4-b1c2-beef7ef6dcdc/machine-config-server/0.log" Oct 14 13:16:10.335212 master-2 kubenswrapper[4762]: I1014 13:16:10.335103 4762 patch_prober.go:28] interesting pod/etcd-guard-master-2 container/guard namespace/openshift-etcd: Readiness probe status=failure output="Get \"https://192.168.34.12:9980/readyz\": dial tcp 192.168.34.12:9980: connect: connection refused" start-of-body= Oct 14 13:16:10.335498 master-2 kubenswrapper[4762]: I1014 13:16:10.335253 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-etcd/etcd-guard-master-2" podUID="1b6a1dbe-f753-4c92-8b36-47517010f2f3" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.12:9980/readyz\": dial tcp 192.168.34.12:9980: connect: connection refused" Oct 14 13:16:10.897762 master-2 kubenswrapper[4762]: I1014 13:16:10.897678 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:16:10.897762 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:16:10.897762 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:16:10.897762 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:16:10.898794 master-2 kubenswrapper[4762]: I1014 13:16:10.897780 4762 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:16:11.897917 master-2 kubenswrapper[4762]: I1014 13:16:11.897738 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:16:11.897917 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:16:11.897917 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:16:11.897917 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:16:11.897917 master-2 kubenswrapper[4762]: I1014 13:16:11.897852 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:16:12.897508 master-2 kubenswrapper[4762]: I1014 13:16:12.897407 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:16:12.897508 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:16:12.897508 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:16:12.897508 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:16:12.900001 master-2 kubenswrapper[4762]: I1014 13:16:12.897528 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:16:13.898238 master-2 kubenswrapper[4762]: I1014 13:16:13.898127 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:16:13.898238 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:16:13.898238 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:16:13.898238 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:16:13.898883 master-2 kubenswrapper[4762]: I1014 13:16:13.898246 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:16:14.897865 master-2 kubenswrapper[4762]: I1014 13:16:14.897785 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:16:14.897865 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:16:14.897865 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:16:14.897865 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:16:14.899034 master-2 kubenswrapper[4762]: I1014 13:16:14.897905 4762 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:16:15.335346 master-2 kubenswrapper[4762]: I1014 13:16:15.335280 4762 patch_prober.go:28] interesting pod/etcd-guard-master-2 container/guard namespace/openshift-etcd: Readiness probe status=failure output="Get \"https://192.168.34.12:9980/readyz\": dial tcp 192.168.34.12:9980: connect: connection refused" start-of-body= Oct 14 13:16:15.335652 master-2 kubenswrapper[4762]: I1014 13:16:15.335364 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-etcd/etcd-guard-master-2" podUID="1b6a1dbe-f753-4c92-8b36-47517010f2f3" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.12:9980/readyz\": dial tcp 192.168.34.12:9980: connect: connection refused" Oct 14 13:16:15.897980 master-2 kubenswrapper[4762]: I1014 13:16:15.897886 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:16:15.897980 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:16:15.897980 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:16:15.897980 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:16:15.898959 master-2 kubenswrapper[4762]: I1014 13:16:15.897980 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:16:16.897769 master-2 kubenswrapper[4762]: I1014 13:16:16.897667 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:16:16.897769 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:16:16.897769 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:16:16.897769 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:16:16.898550 master-2 kubenswrapper[4762]: I1014 13:16:16.897792 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:16:17.897864 master-2 kubenswrapper[4762]: I1014 13:16:17.897751 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:16:17.897864 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:16:17.897864 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:16:17.897864 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:16:17.898697 master-2 kubenswrapper[4762]: I1014 13:16:17.897871 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with 
statuscode: 500" Oct 14 13:16:18.897667 master-2 kubenswrapper[4762]: I1014 13:16:18.897581 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:16:18.897667 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:16:18.897667 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:16:18.897667 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:16:18.898047 master-2 kubenswrapper[4762]: I1014 13:16:18.897688 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:16:19.897761 master-2 kubenswrapper[4762]: I1014 13:16:19.897608 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:16:19.897761 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:16:19.897761 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:16:19.897761 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:16:19.899034 master-2 kubenswrapper[4762]: I1014 13:16:19.897798 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:16:20.335478 master-2 kubenswrapper[4762]: I1014 13:16:20.335350 4762 patch_prober.go:28] interesting pod/etcd-guard-master-2 container/guard namespace/openshift-etcd: Readiness probe status=failure output="Get \"https://192.168.34.12:9980/readyz\": dial tcp 192.168.34.12:9980: connect: connection refused" start-of-body= Oct 14 13:16:20.335478 master-2 kubenswrapper[4762]: I1014 13:16:20.335484 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-etcd/etcd-guard-master-2" podUID="1b6a1dbe-f753-4c92-8b36-47517010f2f3" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.12:9980/readyz\": dial tcp 192.168.34.12:9980: connect: connection refused" Oct 14 13:16:20.335894 master-2 kubenswrapper[4762]: I1014 13:16:20.335594 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-guard-master-2" Oct 14 13:16:20.337337 master-2 kubenswrapper[4762]: I1014 13:16:20.337297 4762 patch_prober.go:28] interesting pod/etcd-guard-master-2 container/guard namespace/openshift-etcd: Readiness probe status=failure output="Get \"https://192.168.34.12:9980/readyz\": dial tcp 192.168.34.12:9980: connect: connection refused" start-of-body= Oct 14 13:16:20.337387 master-2 kubenswrapper[4762]: I1014 13:16:20.337344 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-etcd/etcd-guard-master-2" podUID="1b6a1dbe-f753-4c92-8b36-47517010f2f3" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.12:9980/readyz\": dial tcp 192.168.34.12:9980: connect: connection refused" Oct 14 13:16:20.897449 master-2 kubenswrapper[4762]: I1014 13:16:20.897367 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router 
namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:16:20.897449 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:16:20.897449 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:16:20.897449 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:16:20.897757 master-2 kubenswrapper[4762]: I1014 13:16:20.897465 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:16:21.897744 master-2 kubenswrapper[4762]: I1014 13:16:21.897664 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:16:21.897744 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:16:21.897744 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:16:21.897744 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:16:21.898778 master-2 kubenswrapper[4762]: I1014 13:16:21.898445 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:16:22.898558 master-2 kubenswrapper[4762]: I1014 13:16:22.898446 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:16:22.898558 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:16:22.898558 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:16:22.898558 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:16:22.898558 master-2 kubenswrapper[4762]: I1014 13:16:22.898536 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:16:23.896988 master-2 kubenswrapper[4762]: I1014 13:16:23.896918 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:16:23.896988 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:16:23.896988 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:16:23.896988 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:16:23.897397 master-2 kubenswrapper[4762]: I1014 13:16:23.896993 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:16:24.898093 master-2 kubenswrapper[4762]: I1014 13:16:24.897989 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router 
namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:16:24.898093 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:16:24.898093 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:16:24.898093 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:16:24.899053 master-2 kubenswrapper[4762]: I1014 13:16:24.898097 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:16:25.335666 master-2 kubenswrapper[4762]: I1014 13:16:25.335587 4762 patch_prober.go:28] interesting pod/etcd-guard-master-2 container/guard namespace/openshift-etcd: Readiness probe status=failure output="Get \"https://192.168.34.12:9980/readyz\": dial tcp 192.168.34.12:9980: connect: connection refused" start-of-body= Oct 14 13:16:25.336209 master-2 kubenswrapper[4762]: I1014 13:16:25.335681 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-etcd/etcd-guard-master-2" podUID="1b6a1dbe-f753-4c92-8b36-47517010f2f3" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.12:9980/readyz\": dial tcp 192.168.34.12:9980: connect: connection refused" Oct 14 13:16:25.633562 master-2 kubenswrapper[4762]: I1014 13:16:25.633399 4762 patch_prober.go:28] interesting pod/metrics-server-8475fbcb68-8dq9n container/metrics-server namespace/openshift-monitoring: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:16:25.633562 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:16:25.633562 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:16:25.633562 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:16:25.633562 master-2 kubenswrapper[4762]: [+]metric-storage-ready ok Oct 14 13:16:25.633562 master-2 kubenswrapper[4762]: [+]metric-informer-sync ok Oct 14 13:16:25.633562 master-2 kubenswrapper[4762]: [+]metadata-informer-sync ok Oct 14 13:16:25.633562 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:16:25.633562 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:16:25.633562 master-2 kubenswrapper[4762]: I1014 13:16:25.633484 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/metrics-server-8475fbcb68-8dq9n" podUID="949ffee6-8997-4b92-84c3-4aeb1121bbe1" containerName="metrics-server" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:16:25.896935 master-2 kubenswrapper[4762]: I1014 13:16:25.896760 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:16:25.896935 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:16:25.896935 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:16:25.896935 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:16:25.896935 master-2 kubenswrapper[4762]: I1014 13:16:25.896822 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:16:26.897067 master-2 kubenswrapper[4762]: I1014 13:16:26.896984 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:16:26.897067 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:16:26.897067 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:16:26.897067 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:16:26.897822 master-2 kubenswrapper[4762]: I1014 13:16:26.897076 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:16:27.896439 master-2 kubenswrapper[4762]: I1014 13:16:27.896342 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:16:27.896439 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:16:27.896439 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:16:27.896439 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:16:27.897824 master-2 kubenswrapper[4762]: I1014 13:16:27.896459 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:16:28.898548 master-2 kubenswrapper[4762]: I1014 13:16:28.898452 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:16:28.898548 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:16:28.898548 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:16:28.898548 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:16:28.899919 master-2 kubenswrapper[4762]: I1014 13:16:28.898553 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:16:29.898696 master-2 kubenswrapper[4762]: I1014 13:16:29.898596 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:16:29.898696 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:16:29.898696 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:16:29.898696 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:16:29.898696 master-2 kubenswrapper[4762]: I1014 13:16:29.898693 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:16:30.335853 master-2 kubenswrapper[4762]: I1014 13:16:30.335784 4762 patch_prober.go:28] interesting pod/etcd-guard-master-2 container/guard namespace/openshift-etcd: Readiness probe status=failure output="Get \"https://192.168.34.12:9980/readyz\": dial tcp 192.168.34.12:9980: connect: connection refused" start-of-body= Oct 14 13:16:30.336341 master-2 kubenswrapper[4762]: I1014 13:16:30.336291 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-etcd/etcd-guard-master-2" podUID="1b6a1dbe-f753-4c92-8b36-47517010f2f3" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.12:9980/readyz\": dial tcp 192.168.34.12:9980: connect: connection refused" Oct 14 13:16:30.898340 master-2 kubenswrapper[4762]: I1014 13:16:30.898259 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:16:30.898340 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:16:30.898340 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:16:30.898340 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:16:30.899545 master-2 kubenswrapper[4762]: I1014 13:16:30.898359 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:16:31.898073 master-2 kubenswrapper[4762]: I1014 13:16:31.897995 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:16:31.898073 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:16:31.898073 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:16:31.898073 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:16:31.898516 master-2 kubenswrapper[4762]: I1014 13:16:31.898094 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:16:32.897367 master-2 kubenswrapper[4762]: I1014 13:16:32.897242 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:16:32.897367 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:16:32.897367 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:16:32.897367 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:16:32.897367 master-2 kubenswrapper[4762]: I1014 13:16:32.897340 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:16:33.897224 master-2 kubenswrapper[4762]: I1014 13:16:33.897170 4762 
patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:16:33.897224 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:16:33.897224 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:16:33.897224 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:16:33.897822 master-2 kubenswrapper[4762]: I1014 13:16:33.897254 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:16:34.897803 master-2 kubenswrapper[4762]: I1014 13:16:34.897696 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:16:34.897803 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:16:34.897803 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:16:34.897803 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:16:34.898837 master-2 kubenswrapper[4762]: I1014 13:16:34.897805 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:16:35.335217 master-2 kubenswrapper[4762]: I1014 13:16:35.335103 4762 patch_prober.go:28] interesting pod/etcd-guard-master-2 container/guard namespace/openshift-etcd: Readiness probe status=failure output="Get \"https://192.168.34.12:9980/readyz\": dial tcp 192.168.34.12:9980: connect: connection refused" start-of-body= Oct 14 13:16:35.335523 master-2 kubenswrapper[4762]: I1014 13:16:35.335243 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-etcd/etcd-guard-master-2" podUID="1b6a1dbe-f753-4c92-8b36-47517010f2f3" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.12:9980/readyz\": dial tcp 192.168.34.12:9980: connect: connection refused" Oct 14 13:16:35.897720 master-2 kubenswrapper[4762]: I1014 13:16:35.897638 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:16:35.897720 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:16:35.897720 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:16:35.897720 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:16:35.897720 master-2 kubenswrapper[4762]: I1014 13:16:35.897718 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:16:36.344115 master-2 kubenswrapper[4762]: I1014 13:16:36.344048 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-2_c492168afa20f49cb6e3534e1871011b/etcd/2.log" Oct 14 13:16:36.345072 master-2 kubenswrapper[4762]: 
I1014 13:16:36.345022 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-2_c492168afa20f49cb6e3534e1871011b/etcd/1.log" Oct 14 13:16:36.346225 master-2 kubenswrapper[4762]: I1014 13:16:36.346139 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-2_c492168afa20f49cb6e3534e1871011b/etcd-rev/0.log" Oct 14 13:16:36.347728 master-2 kubenswrapper[4762]: I1014 13:16:36.347685 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-2_c492168afa20f49cb6e3534e1871011b/etcd-metrics/0.log" Oct 14 13:16:36.348532 master-2 kubenswrapper[4762]: I1014 13:16:36.348497 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-2_c492168afa20f49cb6e3534e1871011b/etcdctl/0.log" Oct 14 13:16:36.350332 master-2 kubenswrapper[4762]: I1014 13:16:36.350289 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-2" Oct 14 13:16:36.357465 master-2 kubenswrapper[4762]: I1014 13:16:36.357401 4762 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-etcd/etcd-master-2" oldPodUID="c492168afa20f49cb6e3534e1871011b" podUID="2c4a583adfee975da84510940117e71a" Oct 14 13:16:36.537687 master-2 kubenswrapper[4762]: I1014 13:16:36.537582 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/c492168afa20f49cb6e3534e1871011b-resource-dir\") pod \"c492168afa20f49cb6e3534e1871011b\" (UID: \"c492168afa20f49cb6e3534e1871011b\") " Oct 14 13:16:36.537687 master-2 kubenswrapper[4762]: I1014 13:16:36.537657 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/c492168afa20f49cb6e3534e1871011b-data-dir\") pod \"c492168afa20f49cb6e3534e1871011b\" (UID: \"c492168afa20f49cb6e3534e1871011b\") " Oct 14 13:16:36.538114 master-2 kubenswrapper[4762]: I1014 13:16:36.537722 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/c492168afa20f49cb6e3534e1871011b-log-dir\") pod \"c492168afa20f49cb6e3534e1871011b\" (UID: \"c492168afa20f49cb6e3534e1871011b\") " Oct 14 13:16:36.538114 master-2 kubenswrapper[4762]: I1014 13:16:36.537760 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/c492168afa20f49cb6e3534e1871011b-static-pod-dir\") pod \"c492168afa20f49cb6e3534e1871011b\" (UID: \"c492168afa20f49cb6e3534e1871011b\") " Oct 14 13:16:36.538114 master-2 kubenswrapper[4762]: I1014 13:16:36.537751 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c492168afa20f49cb6e3534e1871011b-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "c492168afa20f49cb6e3534e1871011b" (UID: "c492168afa20f49cb6e3534e1871011b"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:16:36.538114 master-2 kubenswrapper[4762]: I1014 13:16:36.537772 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c492168afa20f49cb6e3534e1871011b-data-dir" (OuterVolumeSpecName: "data-dir") pod "c492168afa20f49cb6e3534e1871011b" (UID: "c492168afa20f49cb6e3534e1871011b"). InnerVolumeSpecName "data-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:16:36.538114 master-2 kubenswrapper[4762]: I1014 13:16:36.537798 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/c492168afa20f49cb6e3534e1871011b-cert-dir\") pod \"c492168afa20f49cb6e3534e1871011b\" (UID: \"c492168afa20f49cb6e3534e1871011b\") " Oct 14 13:16:36.538114 master-2 kubenswrapper[4762]: I1014 13:16:36.537814 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/c492168afa20f49cb6e3534e1871011b-usr-local-bin\") pod \"c492168afa20f49cb6e3534e1871011b\" (UID: \"c492168afa20f49cb6e3534e1871011b\") " Oct 14 13:16:36.538114 master-2 kubenswrapper[4762]: I1014 13:16:36.537907 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c492168afa20f49cb6e3534e1871011b-static-pod-dir" (OuterVolumeSpecName: "static-pod-dir") pod "c492168afa20f49cb6e3534e1871011b" (UID: "c492168afa20f49cb6e3534e1871011b"). InnerVolumeSpecName "static-pod-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:16:36.538114 master-2 kubenswrapper[4762]: I1014 13:16:36.537906 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c492168afa20f49cb6e3534e1871011b-log-dir" (OuterVolumeSpecName: "log-dir") pod "c492168afa20f49cb6e3534e1871011b" (UID: "c492168afa20f49cb6e3534e1871011b"). InnerVolumeSpecName "log-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:16:36.538114 master-2 kubenswrapper[4762]: I1014 13:16:36.537931 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c492168afa20f49cb6e3534e1871011b-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "c492168afa20f49cb6e3534e1871011b" (UID: "c492168afa20f49cb6e3534e1871011b"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:16:36.538114 master-2 kubenswrapper[4762]: I1014 13:16:36.537993 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c492168afa20f49cb6e3534e1871011b-usr-local-bin" (OuterVolumeSpecName: "usr-local-bin") pod "c492168afa20f49cb6e3534e1871011b" (UID: "c492168afa20f49cb6e3534e1871011b"). InnerVolumeSpecName "usr-local-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:16:36.538892 master-2 kubenswrapper[4762]: I1014 13:16:36.538405 4762 reconciler_common.go:293] "Volume detached for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/c492168afa20f49cb6e3534e1871011b-log-dir\") on node \"master-2\" DevicePath \"\"" Oct 14 13:16:36.538892 master-2 kubenswrapper[4762]: I1014 13:16:36.538440 4762 reconciler_common.go:293] "Volume detached for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/c492168afa20f49cb6e3534e1871011b-static-pod-dir\") on node \"master-2\" DevicePath \"\"" Oct 14 13:16:36.538892 master-2 kubenswrapper[4762]: I1014 13:16:36.538466 4762 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/c492168afa20f49cb6e3534e1871011b-cert-dir\") on node \"master-2\" DevicePath \"\"" Oct 14 13:16:36.538892 master-2 kubenswrapper[4762]: I1014 13:16:36.538533 4762 reconciler_common.go:293] "Volume detached for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/c492168afa20f49cb6e3534e1871011b-usr-local-bin\") on node \"master-2\" DevicePath \"\"" Oct 14 13:16:36.538892 master-2 kubenswrapper[4762]: I1014 13:16:36.538558 4762 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/c492168afa20f49cb6e3534e1871011b-resource-dir\") on node \"master-2\" DevicePath \"\"" Oct 14 13:16:36.538892 master-2 kubenswrapper[4762]: I1014 13:16:36.538580 4762 reconciler_common.go:293] "Volume detached for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/c492168afa20f49cb6e3534e1871011b-data-dir\") on node \"master-2\" DevicePath \"\"" Oct 14 13:16:36.594624 master-2 kubenswrapper[4762]: I1014 13:16:36.594466 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-2_c492168afa20f49cb6e3534e1871011b/etcd/2.log" Oct 14 13:16:36.595237 master-2 kubenswrapper[4762]: I1014 13:16:36.595201 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-2_c492168afa20f49cb6e3534e1871011b/etcd/1.log" Oct 14 13:16:36.595919 master-2 kubenswrapper[4762]: I1014 13:16:36.595874 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-2_c492168afa20f49cb6e3534e1871011b/etcd-rev/0.log" Oct 14 13:16:36.597242 master-2 kubenswrapper[4762]: I1014 13:16:36.597198 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-2_c492168afa20f49cb6e3534e1871011b/etcd-metrics/0.log" Oct 14 13:16:36.597974 master-2 kubenswrapper[4762]: I1014 13:16:36.597915 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-2_c492168afa20f49cb6e3534e1871011b/etcdctl/0.log" Oct 14 13:16:36.599455 master-2 kubenswrapper[4762]: I1014 13:16:36.599384 4762 generic.go:334] "Generic (PLEG): container finished" podID="c492168afa20f49cb6e3534e1871011b" containerID="e17495a82cef8a0676e553bdfafdc70409c48a143d0e5a81d56563bbbf2398de" exitCode=137 Oct 14 13:16:36.599455 master-2 kubenswrapper[4762]: I1014 13:16:36.599448 4762 generic.go:334] "Generic (PLEG): container finished" podID="c492168afa20f49cb6e3534e1871011b" containerID="b2027546f6685cc4b4558ae050637f0978f42f16a38ca2c9891560a82362bb4e" exitCode=137 Oct 14 13:16:36.599624 master-2 kubenswrapper[4762]: I1014 13:16:36.599479 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-2" Oct 14 13:16:36.599624 master-2 kubenswrapper[4762]: I1014 13:16:36.599489 4762 scope.go:117] "RemoveContainer" containerID="e17495a82cef8a0676e553bdfafdc70409c48a143d0e5a81d56563bbbf2398de" Oct 14 13:16:36.605942 master-2 kubenswrapper[4762]: I1014 13:16:36.605898 4762 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-etcd/etcd-master-2" oldPodUID="c492168afa20f49cb6e3534e1871011b" podUID="2c4a583adfee975da84510940117e71a" Oct 14 13:16:36.616132 master-2 kubenswrapper[4762]: I1014 13:16:36.616089 4762 scope.go:117] "RemoveContainer" containerID="aa231197e999c93a027edb37f0103cfe346767f572cbabd1136b388c9b713410" Oct 14 13:16:36.620065 master-2 kubenswrapper[4762]: I1014 13:16:36.620003 4762 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-etcd/etcd-master-2" oldPodUID="c492168afa20f49cb6e3534e1871011b" podUID="2c4a583adfee975da84510940117e71a" Oct 14 13:16:36.653497 master-2 kubenswrapper[4762]: I1014 13:16:36.653453 4762 scope.go:117] "RemoveContainer" containerID="0a151f0bd5c6b606b6c89b6fbad470522251712b8d760adb739567543b3f9546" Oct 14 13:16:36.667729 master-2 kubenswrapper[4762]: I1014 13:16:36.667706 4762 scope.go:117] "RemoveContainer" containerID="044edff9cb383dac0f2c295e942c0b56543f1ba69662c248a25039887f1d82e6" Oct 14 13:16:36.689246 master-2 kubenswrapper[4762]: I1014 13:16:36.689207 4762 scope.go:117] "RemoveContainer" containerID="930613d268dd89eef6d12b249532ff59c399cd2934bfb107eedf634a2b26d951" Oct 14 13:16:36.708006 master-2 kubenswrapper[4762]: I1014 13:16:36.707951 4762 scope.go:117] "RemoveContainer" containerID="b2027546f6685cc4b4558ae050637f0978f42f16a38ca2c9891560a82362bb4e" Oct 14 13:16:36.722544 master-2 kubenswrapper[4762]: I1014 13:16:36.722497 4762 scope.go:117] "RemoveContainer" containerID="1d4ec18cbc894594194c960ffa7eb17f08b8db4618a40edcbdb36c81a2f3ce33" Oct 14 13:16:36.746252 master-2 kubenswrapper[4762]: I1014 13:16:36.746184 4762 scope.go:117] "RemoveContainer" containerID="ed2a09b05055bd0247fa03bd690d3c3a00d03179e7fd8aaba739e3ec5e7a8c0d" Oct 14 13:16:36.768848 master-2 kubenswrapper[4762]: I1014 13:16:36.768779 4762 scope.go:117] "RemoveContainer" containerID="6a5de749a5209ba1ebd95f61636355d60f043a2e0f9a0db808ed245f031ddde8" Oct 14 13:16:36.797103 master-2 kubenswrapper[4762]: I1014 13:16:36.796871 4762 scope.go:117] "RemoveContainer" containerID="e17495a82cef8a0676e553bdfafdc70409c48a143d0e5a81d56563bbbf2398de" Oct 14 13:16:36.797678 master-2 kubenswrapper[4762]: E1014 13:16:36.797622 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e17495a82cef8a0676e553bdfafdc70409c48a143d0e5a81d56563bbbf2398de\": container with ID starting with e17495a82cef8a0676e553bdfafdc70409c48a143d0e5a81d56563bbbf2398de not found: ID does not exist" containerID="e17495a82cef8a0676e553bdfafdc70409c48a143d0e5a81d56563bbbf2398de" Oct 14 13:16:36.797750 master-2 kubenswrapper[4762]: I1014 13:16:36.797679 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e17495a82cef8a0676e553bdfafdc70409c48a143d0e5a81d56563bbbf2398de"} err="failed to get container status \"e17495a82cef8a0676e553bdfafdc70409c48a143d0e5a81d56563bbbf2398de\": rpc error: code = NotFound desc = could not find container \"e17495a82cef8a0676e553bdfafdc70409c48a143d0e5a81d56563bbbf2398de\": container with ID starting with 
e17495a82cef8a0676e553bdfafdc70409c48a143d0e5a81d56563bbbf2398de not found: ID does not exist" Oct 14 13:16:36.797750 master-2 kubenswrapper[4762]: I1014 13:16:36.797714 4762 scope.go:117] "RemoveContainer" containerID="aa231197e999c93a027edb37f0103cfe346767f572cbabd1136b388c9b713410" Oct 14 13:16:36.798449 master-2 kubenswrapper[4762]: E1014 13:16:36.798386 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa231197e999c93a027edb37f0103cfe346767f572cbabd1136b388c9b713410\": container with ID starting with aa231197e999c93a027edb37f0103cfe346767f572cbabd1136b388c9b713410 not found: ID does not exist" containerID="aa231197e999c93a027edb37f0103cfe346767f572cbabd1136b388c9b713410" Oct 14 13:16:36.798532 master-2 kubenswrapper[4762]: I1014 13:16:36.798463 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa231197e999c93a027edb37f0103cfe346767f572cbabd1136b388c9b713410"} err="failed to get container status \"aa231197e999c93a027edb37f0103cfe346767f572cbabd1136b388c9b713410\": rpc error: code = NotFound desc = could not find container \"aa231197e999c93a027edb37f0103cfe346767f572cbabd1136b388c9b713410\": container with ID starting with aa231197e999c93a027edb37f0103cfe346767f572cbabd1136b388c9b713410 not found: ID does not exist" Oct 14 13:16:36.798532 master-2 kubenswrapper[4762]: I1014 13:16:36.798510 4762 scope.go:117] "RemoveContainer" containerID="0a151f0bd5c6b606b6c89b6fbad470522251712b8d760adb739567543b3f9546" Oct 14 13:16:36.799027 master-2 kubenswrapper[4762]: E1014 13:16:36.798974 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a151f0bd5c6b606b6c89b6fbad470522251712b8d760adb739567543b3f9546\": container with ID starting with 0a151f0bd5c6b606b6c89b6fbad470522251712b8d760adb739567543b3f9546 not found: ID does not exist" containerID="0a151f0bd5c6b606b6c89b6fbad470522251712b8d760adb739567543b3f9546" Oct 14 13:16:36.799104 master-2 kubenswrapper[4762]: I1014 13:16:36.799015 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a151f0bd5c6b606b6c89b6fbad470522251712b8d760adb739567543b3f9546"} err="failed to get container status \"0a151f0bd5c6b606b6c89b6fbad470522251712b8d760adb739567543b3f9546\": rpc error: code = NotFound desc = could not find container \"0a151f0bd5c6b606b6c89b6fbad470522251712b8d760adb739567543b3f9546\": container with ID starting with 0a151f0bd5c6b606b6c89b6fbad470522251712b8d760adb739567543b3f9546 not found: ID does not exist" Oct 14 13:16:36.799104 master-2 kubenswrapper[4762]: I1014 13:16:36.799046 4762 scope.go:117] "RemoveContainer" containerID="044edff9cb383dac0f2c295e942c0b56543f1ba69662c248a25039887f1d82e6" Oct 14 13:16:36.799730 master-2 kubenswrapper[4762]: E1014 13:16:36.799652 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"044edff9cb383dac0f2c295e942c0b56543f1ba69662c248a25039887f1d82e6\": container with ID starting with 044edff9cb383dac0f2c295e942c0b56543f1ba69662c248a25039887f1d82e6 not found: ID does not exist" containerID="044edff9cb383dac0f2c295e942c0b56543f1ba69662c248a25039887f1d82e6" Oct 14 13:16:36.799730 master-2 kubenswrapper[4762]: I1014 13:16:36.799700 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"044edff9cb383dac0f2c295e942c0b56543f1ba69662c248a25039887f1d82e6"} 
err="failed to get container status \"044edff9cb383dac0f2c295e942c0b56543f1ba69662c248a25039887f1d82e6\": rpc error: code = NotFound desc = could not find container \"044edff9cb383dac0f2c295e942c0b56543f1ba69662c248a25039887f1d82e6\": container with ID starting with 044edff9cb383dac0f2c295e942c0b56543f1ba69662c248a25039887f1d82e6 not found: ID does not exist" Oct 14 13:16:36.799730 master-2 kubenswrapper[4762]: I1014 13:16:36.799731 4762 scope.go:117] "RemoveContainer" containerID="930613d268dd89eef6d12b249532ff59c399cd2934bfb107eedf634a2b26d951" Oct 14 13:16:36.800255 master-2 kubenswrapper[4762]: E1014 13:16:36.800194 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"930613d268dd89eef6d12b249532ff59c399cd2934bfb107eedf634a2b26d951\": container with ID starting with 930613d268dd89eef6d12b249532ff59c399cd2934bfb107eedf634a2b26d951 not found: ID does not exist" containerID="930613d268dd89eef6d12b249532ff59c399cd2934bfb107eedf634a2b26d951" Oct 14 13:16:36.800329 master-2 kubenswrapper[4762]: I1014 13:16:36.800264 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"930613d268dd89eef6d12b249532ff59c399cd2934bfb107eedf634a2b26d951"} err="failed to get container status \"930613d268dd89eef6d12b249532ff59c399cd2934bfb107eedf634a2b26d951\": rpc error: code = NotFound desc = could not find container \"930613d268dd89eef6d12b249532ff59c399cd2934bfb107eedf634a2b26d951\": container with ID starting with 930613d268dd89eef6d12b249532ff59c399cd2934bfb107eedf634a2b26d951 not found: ID does not exist" Oct 14 13:16:36.800329 master-2 kubenswrapper[4762]: I1014 13:16:36.800309 4762 scope.go:117] "RemoveContainer" containerID="b2027546f6685cc4b4558ae050637f0978f42f16a38ca2c9891560a82362bb4e" Oct 14 13:16:36.800693 master-2 kubenswrapper[4762]: E1014 13:16:36.800659 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2027546f6685cc4b4558ae050637f0978f42f16a38ca2c9891560a82362bb4e\": container with ID starting with b2027546f6685cc4b4558ae050637f0978f42f16a38ca2c9891560a82362bb4e not found: ID does not exist" containerID="b2027546f6685cc4b4558ae050637f0978f42f16a38ca2c9891560a82362bb4e" Oct 14 13:16:36.800756 master-2 kubenswrapper[4762]: I1014 13:16:36.800689 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2027546f6685cc4b4558ae050637f0978f42f16a38ca2c9891560a82362bb4e"} err="failed to get container status \"b2027546f6685cc4b4558ae050637f0978f42f16a38ca2c9891560a82362bb4e\": rpc error: code = NotFound desc = could not find container \"b2027546f6685cc4b4558ae050637f0978f42f16a38ca2c9891560a82362bb4e\": container with ID starting with b2027546f6685cc4b4558ae050637f0978f42f16a38ca2c9891560a82362bb4e not found: ID does not exist" Oct 14 13:16:36.800756 master-2 kubenswrapper[4762]: I1014 13:16:36.800715 4762 scope.go:117] "RemoveContainer" containerID="1d4ec18cbc894594194c960ffa7eb17f08b8db4618a40edcbdb36c81a2f3ce33" Oct 14 13:16:36.801120 master-2 kubenswrapper[4762]: E1014 13:16:36.801085 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d4ec18cbc894594194c960ffa7eb17f08b8db4618a40edcbdb36c81a2f3ce33\": container with ID starting with 1d4ec18cbc894594194c960ffa7eb17f08b8db4618a40edcbdb36c81a2f3ce33 not found: ID does not exist" 
containerID="1d4ec18cbc894594194c960ffa7eb17f08b8db4618a40edcbdb36c81a2f3ce33" Oct 14 13:16:36.801193 master-2 kubenswrapper[4762]: I1014 13:16:36.801118 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d4ec18cbc894594194c960ffa7eb17f08b8db4618a40edcbdb36c81a2f3ce33"} err="failed to get container status \"1d4ec18cbc894594194c960ffa7eb17f08b8db4618a40edcbdb36c81a2f3ce33\": rpc error: code = NotFound desc = could not find container \"1d4ec18cbc894594194c960ffa7eb17f08b8db4618a40edcbdb36c81a2f3ce33\": container with ID starting with 1d4ec18cbc894594194c960ffa7eb17f08b8db4618a40edcbdb36c81a2f3ce33 not found: ID does not exist" Oct 14 13:16:36.801193 master-2 kubenswrapper[4762]: I1014 13:16:36.801136 4762 scope.go:117] "RemoveContainer" containerID="ed2a09b05055bd0247fa03bd690d3c3a00d03179e7fd8aaba739e3ec5e7a8c0d" Oct 14 13:16:36.801550 master-2 kubenswrapper[4762]: E1014 13:16:36.801506 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed2a09b05055bd0247fa03bd690d3c3a00d03179e7fd8aaba739e3ec5e7a8c0d\": container with ID starting with ed2a09b05055bd0247fa03bd690d3c3a00d03179e7fd8aaba739e3ec5e7a8c0d not found: ID does not exist" containerID="ed2a09b05055bd0247fa03bd690d3c3a00d03179e7fd8aaba739e3ec5e7a8c0d" Oct 14 13:16:36.801592 master-2 kubenswrapper[4762]: I1014 13:16:36.801551 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed2a09b05055bd0247fa03bd690d3c3a00d03179e7fd8aaba739e3ec5e7a8c0d"} err="failed to get container status \"ed2a09b05055bd0247fa03bd690d3c3a00d03179e7fd8aaba739e3ec5e7a8c0d\": rpc error: code = NotFound desc = could not find container \"ed2a09b05055bd0247fa03bd690d3c3a00d03179e7fd8aaba739e3ec5e7a8c0d\": container with ID starting with ed2a09b05055bd0247fa03bd690d3c3a00d03179e7fd8aaba739e3ec5e7a8c0d not found: ID does not exist" Oct 14 13:16:36.801632 master-2 kubenswrapper[4762]: I1014 13:16:36.801591 4762 scope.go:117] "RemoveContainer" containerID="6a5de749a5209ba1ebd95f61636355d60f043a2e0f9a0db808ed245f031ddde8" Oct 14 13:16:36.802129 master-2 kubenswrapper[4762]: E1014 13:16:36.802087 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a5de749a5209ba1ebd95f61636355d60f043a2e0f9a0db808ed245f031ddde8\": container with ID starting with 6a5de749a5209ba1ebd95f61636355d60f043a2e0f9a0db808ed245f031ddde8 not found: ID does not exist" containerID="6a5de749a5209ba1ebd95f61636355d60f043a2e0f9a0db808ed245f031ddde8" Oct 14 13:16:36.802183 master-2 kubenswrapper[4762]: I1014 13:16:36.802130 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a5de749a5209ba1ebd95f61636355d60f043a2e0f9a0db808ed245f031ddde8"} err="failed to get container status \"6a5de749a5209ba1ebd95f61636355d60f043a2e0f9a0db808ed245f031ddde8\": rpc error: code = NotFound desc = could not find container \"6a5de749a5209ba1ebd95f61636355d60f043a2e0f9a0db808ed245f031ddde8\": container with ID starting with 6a5de749a5209ba1ebd95f61636355d60f043a2e0f9a0db808ed245f031ddde8 not found: ID does not exist" Oct 14 13:16:36.802225 master-2 kubenswrapper[4762]: I1014 13:16:36.802194 4762 scope.go:117] "RemoveContainer" containerID="e17495a82cef8a0676e553bdfafdc70409c48a143d0e5a81d56563bbbf2398de" Oct 14 13:16:36.802719 master-2 kubenswrapper[4762]: I1014 13:16:36.802656 4762 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"e17495a82cef8a0676e553bdfafdc70409c48a143d0e5a81d56563bbbf2398de"} err="failed to get container status \"e17495a82cef8a0676e553bdfafdc70409c48a143d0e5a81d56563bbbf2398de\": rpc error: code = NotFound desc = could not find container \"e17495a82cef8a0676e553bdfafdc70409c48a143d0e5a81d56563bbbf2398de\": container with ID starting with e17495a82cef8a0676e553bdfafdc70409c48a143d0e5a81d56563bbbf2398de not found: ID does not exist" Oct 14 13:16:36.802772 master-2 kubenswrapper[4762]: I1014 13:16:36.802720 4762 scope.go:117] "RemoveContainer" containerID="aa231197e999c93a027edb37f0103cfe346767f572cbabd1136b388c9b713410" Oct 14 13:16:36.803433 master-2 kubenswrapper[4762]: I1014 13:16:36.803397 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa231197e999c93a027edb37f0103cfe346767f572cbabd1136b388c9b713410"} err="failed to get container status \"aa231197e999c93a027edb37f0103cfe346767f572cbabd1136b388c9b713410\": rpc error: code = NotFound desc = could not find container \"aa231197e999c93a027edb37f0103cfe346767f572cbabd1136b388c9b713410\": container with ID starting with aa231197e999c93a027edb37f0103cfe346767f572cbabd1136b388c9b713410 not found: ID does not exist" Oct 14 13:16:36.803433 master-2 kubenswrapper[4762]: I1014 13:16:36.803428 4762 scope.go:117] "RemoveContainer" containerID="0a151f0bd5c6b606b6c89b6fbad470522251712b8d760adb739567543b3f9546" Oct 14 13:16:36.803815 master-2 kubenswrapper[4762]: I1014 13:16:36.803773 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a151f0bd5c6b606b6c89b6fbad470522251712b8d760adb739567543b3f9546"} err="failed to get container status \"0a151f0bd5c6b606b6c89b6fbad470522251712b8d760adb739567543b3f9546\": rpc error: code = NotFound desc = could not find container \"0a151f0bd5c6b606b6c89b6fbad470522251712b8d760adb739567543b3f9546\": container with ID starting with 0a151f0bd5c6b606b6c89b6fbad470522251712b8d760adb739567543b3f9546 not found: ID does not exist" Oct 14 13:16:36.803815 master-2 kubenswrapper[4762]: I1014 13:16:36.803806 4762 scope.go:117] "RemoveContainer" containerID="044edff9cb383dac0f2c295e942c0b56543f1ba69662c248a25039887f1d82e6" Oct 14 13:16:36.804182 master-2 kubenswrapper[4762]: I1014 13:16:36.804134 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"044edff9cb383dac0f2c295e942c0b56543f1ba69662c248a25039887f1d82e6"} err="failed to get container status \"044edff9cb383dac0f2c295e942c0b56543f1ba69662c248a25039887f1d82e6\": rpc error: code = NotFound desc = could not find container \"044edff9cb383dac0f2c295e942c0b56543f1ba69662c248a25039887f1d82e6\": container with ID starting with 044edff9cb383dac0f2c295e942c0b56543f1ba69662c248a25039887f1d82e6 not found: ID does not exist" Oct 14 13:16:36.804228 master-2 kubenswrapper[4762]: I1014 13:16:36.804182 4762 scope.go:117] "RemoveContainer" containerID="930613d268dd89eef6d12b249532ff59c399cd2934bfb107eedf634a2b26d951" Oct 14 13:16:36.804590 master-2 kubenswrapper[4762]: I1014 13:16:36.804542 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"930613d268dd89eef6d12b249532ff59c399cd2934bfb107eedf634a2b26d951"} err="failed to get container status \"930613d268dd89eef6d12b249532ff59c399cd2934bfb107eedf634a2b26d951\": rpc error: code = NotFound desc = could not find container \"930613d268dd89eef6d12b249532ff59c399cd2934bfb107eedf634a2b26d951\": container with ID 
starting with 930613d268dd89eef6d12b249532ff59c399cd2934bfb107eedf634a2b26d951 not found: ID does not exist" Oct 14 13:16:36.804625 master-2 kubenswrapper[4762]: I1014 13:16:36.804588 4762 scope.go:117] "RemoveContainer" containerID="b2027546f6685cc4b4558ae050637f0978f42f16a38ca2c9891560a82362bb4e" Oct 14 13:16:36.804926 master-2 kubenswrapper[4762]: I1014 13:16:36.804895 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2027546f6685cc4b4558ae050637f0978f42f16a38ca2c9891560a82362bb4e"} err="failed to get container status \"b2027546f6685cc4b4558ae050637f0978f42f16a38ca2c9891560a82362bb4e\": rpc error: code = NotFound desc = could not find container \"b2027546f6685cc4b4558ae050637f0978f42f16a38ca2c9891560a82362bb4e\": container with ID starting with b2027546f6685cc4b4558ae050637f0978f42f16a38ca2c9891560a82362bb4e not found: ID does not exist" Oct 14 13:16:36.804963 master-2 kubenswrapper[4762]: I1014 13:16:36.804924 4762 scope.go:117] "RemoveContainer" containerID="1d4ec18cbc894594194c960ffa7eb17f08b8db4618a40edcbdb36c81a2f3ce33" Oct 14 13:16:36.805298 master-2 kubenswrapper[4762]: I1014 13:16:36.805257 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d4ec18cbc894594194c960ffa7eb17f08b8db4618a40edcbdb36c81a2f3ce33"} err="failed to get container status \"1d4ec18cbc894594194c960ffa7eb17f08b8db4618a40edcbdb36c81a2f3ce33\": rpc error: code = NotFound desc = could not find container \"1d4ec18cbc894594194c960ffa7eb17f08b8db4618a40edcbdb36c81a2f3ce33\": container with ID starting with 1d4ec18cbc894594194c960ffa7eb17f08b8db4618a40edcbdb36c81a2f3ce33 not found: ID does not exist" Oct 14 13:16:36.805346 master-2 kubenswrapper[4762]: I1014 13:16:36.805295 4762 scope.go:117] "RemoveContainer" containerID="ed2a09b05055bd0247fa03bd690d3c3a00d03179e7fd8aaba739e3ec5e7a8c0d" Oct 14 13:16:36.805627 master-2 kubenswrapper[4762]: I1014 13:16:36.805594 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed2a09b05055bd0247fa03bd690d3c3a00d03179e7fd8aaba739e3ec5e7a8c0d"} err="failed to get container status \"ed2a09b05055bd0247fa03bd690d3c3a00d03179e7fd8aaba739e3ec5e7a8c0d\": rpc error: code = NotFound desc = could not find container \"ed2a09b05055bd0247fa03bd690d3c3a00d03179e7fd8aaba739e3ec5e7a8c0d\": container with ID starting with ed2a09b05055bd0247fa03bd690d3c3a00d03179e7fd8aaba739e3ec5e7a8c0d not found: ID does not exist" Oct 14 13:16:36.805627 master-2 kubenswrapper[4762]: I1014 13:16:36.805622 4762 scope.go:117] "RemoveContainer" containerID="6a5de749a5209ba1ebd95f61636355d60f043a2e0f9a0db808ed245f031ddde8" Oct 14 13:16:36.805926 master-2 kubenswrapper[4762]: I1014 13:16:36.805887 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a5de749a5209ba1ebd95f61636355d60f043a2e0f9a0db808ed245f031ddde8"} err="failed to get container status \"6a5de749a5209ba1ebd95f61636355d60f043a2e0f9a0db808ed245f031ddde8\": rpc error: code = NotFound desc = could not find container \"6a5de749a5209ba1ebd95f61636355d60f043a2e0f9a0db808ed245f031ddde8\": container with ID starting with 6a5de749a5209ba1ebd95f61636355d60f043a2e0f9a0db808ed245f031ddde8 not found: ID does not exist" Oct 14 13:16:36.897375 master-2 kubenswrapper[4762]: I1014 13:16:36.897195 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with 
statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:16:36.897375 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:16:36.897375 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:16:36.897375 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:16:36.897375 master-2 kubenswrapper[4762]: I1014 13:16:36.897268 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:16:37.555795 master-2 kubenswrapper[4762]: I1014 13:16:37.555713 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c492168afa20f49cb6e3534e1871011b" path="/var/lib/kubelet/pods/c492168afa20f49cb6e3534e1871011b/volumes" Oct 14 13:16:37.897607 master-2 kubenswrapper[4762]: I1014 13:16:37.897402 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:16:37.897607 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:16:37.897607 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:16:37.897607 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:16:37.897607 master-2 kubenswrapper[4762]: I1014 13:16:37.897505 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:16:38.898556 master-2 kubenswrapper[4762]: I1014 13:16:38.898458 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:16:38.898556 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:16:38.898556 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:16:38.898556 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:16:38.898556 master-2 kubenswrapper[4762]: I1014 13:16:38.898562 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:16:39.897239 master-2 kubenswrapper[4762]: I1014 13:16:39.897129 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:16:39.897239 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:16:39.897239 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:16:39.897239 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:16:39.897681 master-2 kubenswrapper[4762]: I1014 13:16:39.897249 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" 
Oct 14 13:16:40.112095 master-2 kubenswrapper[4762]: I1014 13:16:40.112028 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_machine-config-server-zxkbj_063758c3-98fe-4ac4-b1c2-beef7ef6dcdc/machine-config-server/0.log" Oct 14 13:16:40.335747 master-2 kubenswrapper[4762]: I1014 13:16:40.335614 4762 patch_prober.go:28] interesting pod/etcd-guard-master-2 container/guard namespace/openshift-etcd: Readiness probe status=failure output="Get \"https://192.168.34.12:9980/readyz\": dial tcp 192.168.34.12:9980: connect: connection refused" start-of-body= Oct 14 13:16:40.335747 master-2 kubenswrapper[4762]: I1014 13:16:40.335713 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-etcd/etcd-guard-master-2" podUID="1b6a1dbe-f753-4c92-8b36-47517010f2f3" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.12:9980/readyz\": dial tcp 192.168.34.12:9980: connect: connection refused" Oct 14 13:16:40.896935 master-2 kubenswrapper[4762]: I1014 13:16:40.896860 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:16:40.896935 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:16:40.896935 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:16:40.896935 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:16:40.896935 master-2 kubenswrapper[4762]: I1014 13:16:40.896938 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:16:41.896983 master-2 kubenswrapper[4762]: I1014 13:16:41.896875 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:16:41.896983 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:16:41.896983 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:16:41.896983 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:16:41.896983 master-2 kubenswrapper[4762]: I1014 13:16:41.896960 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:16:42.897717 master-2 kubenswrapper[4762]: I1014 13:16:42.897574 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:16:42.897717 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:16:42.897717 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:16:42.897717 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:16:42.898878 master-2 kubenswrapper[4762]: I1014 13:16:42.897740 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:16:43.547909 master-2 kubenswrapper[4762]: I1014 13:16:43.547801 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-2" Oct 14 13:16:43.562699 master-2 kubenswrapper[4762]: I1014 13:16:43.562646 4762 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-2" podUID="51ca3744-366b-4d71-82f1-f9aaab5c76ed" Oct 14 13:16:43.562699 master-2 kubenswrapper[4762]: I1014 13:16:43.562691 4762 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-2" podUID="51ca3744-366b-4d71-82f1-f9aaab5c76ed" Oct 14 13:16:43.588122 master-2 kubenswrapper[4762]: I1014 13:16:43.588019 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-etcd/etcd-master-2"] Oct 14 13:16:43.628277 master-2 kubenswrapper[4762]: I1014 13:16:43.627713 4762 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-etcd/etcd-master-2" Oct 14 13:16:43.631977 master-2 kubenswrapper[4762]: I1014 13:16:43.631875 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-etcd/etcd-master-2"] Oct 14 13:16:43.656682 master-2 kubenswrapper[4762]: I1014 13:16:43.656592 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-2" Oct 14 13:16:43.658434 master-2 kubenswrapper[4762]: I1014 13:16:43.658353 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-master-2"] Oct 14 13:16:43.675405 master-2 kubenswrapper[4762]: W1014 13:16:43.675310 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c4a583adfee975da84510940117e71a.slice/crio-ec125f118f63df32c914906ef0ee2142ce8816ec6225aab9314bd2be4d7dd1eb WatchSource:0}: Error finding container ec125f118f63df32c914906ef0ee2142ce8816ec6225aab9314bd2be4d7dd1eb: Status 404 returned error can't find the container with id ec125f118f63df32c914906ef0ee2142ce8816ec6225aab9314bd2be4d7dd1eb Oct 14 13:16:43.899128 master-2 kubenswrapper[4762]: I1014 13:16:43.899000 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:16:43.899128 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:16:43.899128 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:16:43.899128 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:16:43.899703 master-2 kubenswrapper[4762]: I1014 13:16:43.899126 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:16:44.652248 master-2 kubenswrapper[4762]: I1014 13:16:44.652128 4762 generic.go:334] "Generic (PLEG): container finished" podID="2c4a583adfee975da84510940117e71a" containerID="26c9c1090f4933547e0a56852a2e81f38612c658f6488ebbef87141243935ad6" exitCode=0 Oct 14 13:16:44.652248 master-2 kubenswrapper[4762]: I1014 13:16:44.652233 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-2" event={"ID":"2c4a583adfee975da84510940117e71a","Type":"ContainerDied","Data":"26c9c1090f4933547e0a56852a2e81f38612c658f6488ebbef87141243935ad6"} 
Oct 14 13:16:44.652631 master-2 kubenswrapper[4762]: I1014 13:16:44.652280 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-2" event={"ID":"2c4a583adfee975da84510940117e71a","Type":"ContainerStarted","Data":"ec125f118f63df32c914906ef0ee2142ce8816ec6225aab9314bd2be4d7dd1eb"} Oct 14 13:16:44.896318 master-2 kubenswrapper[4762]: I1014 13:16:44.896270 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:16:44.896318 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:16:44.896318 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:16:44.896318 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:16:44.896539 master-2 kubenswrapper[4762]: I1014 13:16:44.896360 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:16:45.335493 master-2 kubenswrapper[4762]: I1014 13:16:45.335318 4762 patch_prober.go:28] interesting pod/etcd-guard-master-2 container/guard namespace/openshift-etcd: Readiness probe status=failure output="Get \"https://192.168.34.12:9980/readyz\": dial tcp 192.168.34.12:9980: connect: connection refused" start-of-body= Oct 14 13:16:45.335493 master-2 kubenswrapper[4762]: I1014 13:16:45.335397 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-etcd/etcd-guard-master-2" podUID="1b6a1dbe-f753-4c92-8b36-47517010f2f3" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.12:9980/readyz\": dial tcp 192.168.34.12:9980: connect: connection refused" Oct 14 13:16:45.633280 master-2 kubenswrapper[4762]: I1014 13:16:45.633121 4762 patch_prober.go:28] interesting pod/metrics-server-8475fbcb68-8dq9n container/metrics-server namespace/openshift-monitoring: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:16:45.633280 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:16:45.633280 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:16:45.633280 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:16:45.633280 master-2 kubenswrapper[4762]: [+]metric-storage-ready ok Oct 14 13:16:45.633280 master-2 kubenswrapper[4762]: [+]metric-informer-sync ok Oct 14 13:16:45.633280 master-2 kubenswrapper[4762]: [+]metadata-informer-sync ok Oct 14 13:16:45.633280 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:16:45.633280 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:16:45.633280 master-2 kubenswrapper[4762]: I1014 13:16:45.633264 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/metrics-server-8475fbcb68-8dq9n" podUID="949ffee6-8997-4b92-84c3-4aeb1121bbe1" containerName="metrics-server" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:16:45.633787 master-2 kubenswrapper[4762]: I1014 13:16:45.633360 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-8475fbcb68-8dq9n" Oct 14 13:16:45.663230 master-2 kubenswrapper[4762]: I1014 13:16:45.663138 4762 generic.go:334] "Generic (PLEG): container 
finished" podID="2c4a583adfee975da84510940117e71a" containerID="0c5867abb4fd355dbcd792fe60324abb72b9b84b35d2143b43d94cff5e9f798a" exitCode=0 Oct 14 13:16:45.663230 master-2 kubenswrapper[4762]: I1014 13:16:45.663205 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-2" event={"ID":"2c4a583adfee975da84510940117e71a","Type":"ContainerDied","Data":"0c5867abb4fd355dbcd792fe60324abb72b9b84b35d2143b43d94cff5e9f798a"} Oct 14 13:16:45.897786 master-2 kubenswrapper[4762]: I1014 13:16:45.897629 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:16:45.897786 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:16:45.897786 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:16:45.897786 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:16:45.897786 master-2 kubenswrapper[4762]: I1014 13:16:45.897710 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:16:46.675795 master-2 kubenswrapper[4762]: I1014 13:16:46.675722 4762 generic.go:334] "Generic (PLEG): container finished" podID="2c4a583adfee975da84510940117e71a" containerID="ebea60d23e3f11da81fb82ddf488e702d9df05ce3b6ae87609e07e9419bc7b0d" exitCode=0 Oct 14 13:16:46.676742 master-2 kubenswrapper[4762]: I1014 13:16:46.675799 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-2" event={"ID":"2c4a583adfee975da84510940117e71a","Type":"ContainerDied","Data":"ebea60d23e3f11da81fb82ddf488e702d9df05ce3b6ae87609e07e9419bc7b0d"} Oct 14 13:16:46.897231 master-2 kubenswrapper[4762]: I1014 13:16:46.897124 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:16:46.897231 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:16:46.897231 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:16:46.897231 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:16:46.897582 master-2 kubenswrapper[4762]: I1014 13:16:46.897243 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:16:47.690411 master-2 kubenswrapper[4762]: I1014 13:16:47.690312 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-2" event={"ID":"2c4a583adfee975da84510940117e71a","Type":"ContainerStarted","Data":"d26868986c4341ffa9a0a23ef35041cefee7b967d7d9ee87bf0aeed31b68e690"} Oct 14 13:16:47.690411 master-2 kubenswrapper[4762]: I1014 13:16:47.690379 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-2" event={"ID":"2c4a583adfee975da84510940117e71a","Type":"ContainerStarted","Data":"1b73e44d6a874336664c315d070c89fff83d384d584c26da8d5a7492d7b1ae59"} Oct 14 13:16:47.690411 master-2 kubenswrapper[4762]: I1014 13:16:47.690401 4762 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-etcd/etcd-master-2" event={"ID":"2c4a583adfee975da84510940117e71a","Type":"ContainerStarted","Data":"3f91268e8f5c7ecd370dc313d1dc019c99ab69ffa99e1690f5d45a66f3f3be9d"} Oct 14 13:16:47.898390 master-2 kubenswrapper[4762]: I1014 13:16:47.897942 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:16:47.898390 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:16:47.898390 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:16:47.898390 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:16:47.898390 master-2 kubenswrapper[4762]: I1014 13:16:47.898024 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:16:48.701429 master-2 kubenswrapper[4762]: I1014 13:16:48.701367 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-2" event={"ID":"2c4a583adfee975da84510940117e71a","Type":"ContainerStarted","Data":"dea4154badc86da86e08dc8adfcdf16b04b6fb05ec19bc716137bdc9579ffcee"} Oct 14 13:16:48.701429 master-2 kubenswrapper[4762]: I1014 13:16:48.701426 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-2" event={"ID":"2c4a583adfee975da84510940117e71a","Type":"ContainerStarted","Data":"40d36d63e9ceeaccf501e24cb94ae4d10c179c30f3ce44780e8040f4e3011e5f"} Oct 14 13:16:48.743352 master-2 kubenswrapper[4762]: I1014 13:16:48.743225 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-master-2" podStartSLOduration=5.743207479 podStartE2EDuration="5.743207479s" podCreationTimestamp="2025-10-14 13:16:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:16:48.739941522 +0000 UTC m=+637.984100701" watchObservedRunningTime="2025-10-14 13:16:48.743207479 +0000 UTC m=+637.987366638" Oct 14 13:16:48.897810 master-2 kubenswrapper[4762]: I1014 13:16:48.897742 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:16:48.897810 master-2 kubenswrapper[4762]: [-]has-synced failed: reason withheld Oct 14 13:16:48.897810 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:16:48.897810 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:16:48.898237 master-2 kubenswrapper[4762]: I1014 13:16:48.897817 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:16:49.897144 master-2 kubenswrapper[4762]: I1014 13:16:49.897051 4762 patch_prober.go:28] interesting pod/router-default-5ddb89f76-887cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 14 13:16:49.897144 master-2 kubenswrapper[4762]: [-]has-synced 
failed: reason withheld Oct 14 13:16:49.897144 master-2 kubenswrapper[4762]: [+]process-running ok Oct 14 13:16:49.897144 master-2 kubenswrapper[4762]: healthz check failed Oct 14 13:16:49.897912 master-2 kubenswrapper[4762]: I1014 13:16:49.897211 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:16:49.897912 master-2 kubenswrapper[4762]: I1014 13:16:49.897305 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5ddb89f76-887cs" Oct 14 13:16:49.898234 master-2 kubenswrapper[4762]: I1014 13:16:49.898189 4762 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="router" containerStatusID={"Type":"cri-o","ID":"057583bf3783414547befa224cdecf27ac7a84e0c0a4ef9a6cd7473f3af7d3db"} pod="openshift-ingress/router-default-5ddb89f76-887cs" containerMessage="Container router failed startup probe, will be restarted" Oct 14 13:16:49.898307 master-2 kubenswrapper[4762]: I1014 13:16:49.898264 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ingress/router-default-5ddb89f76-887cs" podUID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerName="router" containerID="cri-o://057583bf3783414547befa224cdecf27ac7a84e0c0a4ef9a6cd7473f3af7d3db" gracePeriod=3600 Oct 14 13:16:51.146790 master-2 kubenswrapper[4762]: I1014 13:16:51.146717 4762 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 14 13:16:53.657274 master-2 kubenswrapper[4762]: I1014 13:16:53.657183 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-master-2" Oct 14 13:16:53.658501 master-2 kubenswrapper[4762]: I1014 13:16:53.658361 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-master-2" Oct 14 13:16:55.335187 master-2 kubenswrapper[4762]: I1014 13:16:55.335072 4762 patch_prober.go:28] interesting pod/etcd-guard-master-2 container/guard namespace/openshift-etcd: Readiness probe status=failure output="Get \"https://192.168.34.12:9980/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 14 13:16:55.336003 master-2 kubenswrapper[4762]: I1014 13:16:55.335211 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-etcd/etcd-guard-master-2" podUID="1b6a1dbe-f753-4c92-8b36-47517010f2f3" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.12:9980/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 14 13:17:00.335974 master-2 kubenswrapper[4762]: I1014 13:17:00.335841 4762 patch_prober.go:28] interesting pod/etcd-guard-master-2 container/guard namespace/openshift-etcd: Readiness probe status=failure output="Get \"https://192.168.34.12:9980/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 14 13:17:00.336862 master-2 kubenswrapper[4762]: I1014 13:17:00.335999 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-etcd/etcd-guard-master-2" podUID="1b6a1dbe-f753-4c92-8b36-47517010f2f3" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.12:9980/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 14 
13:17:04.658073 master-2 kubenswrapper[4762]: I1014 13:17:04.657946 4762 patch_prober.go:28] interesting pod/etcd-master-2 container/etcd namespace/openshift-etcd: Startup probe status=failure output="Get \"https://192.168.34.12:9980/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 14 13:17:04.659042 master-2 kubenswrapper[4762]: I1014 13:17:04.658093 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-etcd/etcd-master-2" podUID="2c4a583adfee975da84510940117e71a" containerName="etcd" probeResult="failure" output="Get \"https://192.168.34.12:9980/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 14 13:17:05.336783 master-2 kubenswrapper[4762]: I1014 13:17:05.336681 4762 patch_prober.go:28] interesting pod/etcd-guard-master-2 container/guard namespace/openshift-etcd: Readiness probe status=failure output="Get \"https://192.168.34.12:9980/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 14 13:17:05.336783 master-2 kubenswrapper[4762]: I1014 13:17:05.336774 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-etcd/etcd-guard-master-2" podUID="1b6a1dbe-f753-4c92-8b36-47517010f2f3" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.12:9980/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 14 13:17:05.635414 master-2 kubenswrapper[4762]: I1014 13:17:05.635221 4762 patch_prober.go:28] interesting pod/metrics-server-8475fbcb68-8dq9n container/metrics-server namespace/openshift-monitoring: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:17:05.635414 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:17:05.635414 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:17:05.635414 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:17:05.635414 master-2 kubenswrapper[4762]: [+]metric-storage-ready ok Oct 14 13:17:05.635414 master-2 kubenswrapper[4762]: [+]metric-informer-sync ok Oct 14 13:17:05.635414 master-2 kubenswrapper[4762]: [+]metadata-informer-sync ok Oct 14 13:17:05.635414 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:17:05.635414 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:17:05.635414 master-2 kubenswrapper[4762]: I1014 13:17:05.635326 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/metrics-server-8475fbcb68-8dq9n" podUID="949ffee6-8997-4b92-84c3-4aeb1121bbe1" containerName="metrics-server" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:17:10.111586 master-2 kubenswrapper[4762]: I1014 13:17:10.111526 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_machine-config-server-zxkbj_063758c3-98fe-4ac4-b1c2-beef7ef6dcdc/machine-config-server/0.log" Oct 14 13:17:10.337741 master-2 kubenswrapper[4762]: I1014 13:17:10.337645 4762 patch_prober.go:28] interesting pod/etcd-guard-master-2 container/guard namespace/openshift-etcd: Readiness probe status=failure output="Get \"https://192.168.34.12:9980/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 14 13:17:10.338080 master-2 kubenswrapper[4762]: I1014 13:17:10.337746 4762 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-etcd/etcd-guard-master-2" podUID="1b6a1dbe-f753-4c92-8b36-47517010f2f3" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.12:9980/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 14 13:17:10.705086 master-2 kubenswrapper[4762]: I1014 13:17:10.705006 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-guard-master-2" Oct 14 13:17:13.675962 master-2 kubenswrapper[4762]: I1014 13:17:13.675875 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-master-2" Oct 14 13:17:13.698375 master-2 kubenswrapper[4762]: I1014 13:17:13.698315 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-master-2" Oct 14 13:17:18.891400 master-2 kubenswrapper[4762]: I1014 13:17:18.891331 4762 generic.go:334] "Generic (PLEG): container finished" podID="949ffee6-8997-4b92-84c3-4aeb1121bbe1" containerID="bbd14ec96da76e6b6b207839405f5858f9bdefe7cec9b0ffa533a5f314702f25" exitCode=0 Oct 14 13:17:18.891400 master-2 kubenswrapper[4762]: I1014 13:17:18.891382 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-8475fbcb68-8dq9n" event={"ID":"949ffee6-8997-4b92-84c3-4aeb1121bbe1","Type":"ContainerDied","Data":"bbd14ec96da76e6b6b207839405f5858f9bdefe7cec9b0ffa533a5f314702f25"} Oct 14 13:17:19.085241 master-2 kubenswrapper[4762]: I1014 13:17:19.085189 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-8475fbcb68-8dq9n" Oct 14 13:17:19.193004 master-2 kubenswrapper[4762]: I1014 13:17:19.192922 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/949ffee6-8997-4b92-84c3-4aeb1121bbe1-metrics-server-audit-profiles\") pod \"949ffee6-8997-4b92-84c3-4aeb1121bbe1\" (UID: \"949ffee6-8997-4b92-84c3-4aeb1121bbe1\") " Oct 14 13:17:19.193004 master-2 kubenswrapper[4762]: I1014 13:17:19.193006 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9tx5d\" (UniqueName: \"kubernetes.io/projected/949ffee6-8997-4b92-84c3-4aeb1121bbe1-kube-api-access-9tx5d\") pod \"949ffee6-8997-4b92-84c3-4aeb1121bbe1\" (UID: \"949ffee6-8997-4b92-84c3-4aeb1121bbe1\") " Oct 14 13:17:19.193732 master-2 kubenswrapper[4762]: I1014 13:17:19.193069 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/949ffee6-8997-4b92-84c3-4aeb1121bbe1-configmap-kubelet-serving-ca-bundle\") pod \"949ffee6-8997-4b92-84c3-4aeb1121bbe1\" (UID: \"949ffee6-8997-4b92-84c3-4aeb1121bbe1\") " Oct 14 13:17:19.193732 master-2 kubenswrapper[4762]: I1014 13:17:19.193187 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/949ffee6-8997-4b92-84c3-4aeb1121bbe1-secret-metrics-server-tls\") pod \"949ffee6-8997-4b92-84c3-4aeb1121bbe1\" (UID: \"949ffee6-8997-4b92-84c3-4aeb1121bbe1\") " Oct 14 13:17:19.193732 master-2 kubenswrapper[4762]: I1014 13:17:19.193329 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/949ffee6-8997-4b92-84c3-4aeb1121bbe1-secret-metrics-client-certs\") pod \"949ffee6-8997-4b92-84c3-4aeb1121bbe1\" (UID: 
\"949ffee6-8997-4b92-84c3-4aeb1121bbe1\") " Oct 14 13:17:19.193732 master-2 kubenswrapper[4762]: I1014 13:17:19.193374 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/949ffee6-8997-4b92-84c3-4aeb1121bbe1-audit-log\") pod \"949ffee6-8997-4b92-84c3-4aeb1121bbe1\" (UID: \"949ffee6-8997-4b92-84c3-4aeb1121bbe1\") " Oct 14 13:17:19.193732 master-2 kubenswrapper[4762]: I1014 13:17:19.193407 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949ffee6-8997-4b92-84c3-4aeb1121bbe1-client-ca-bundle\") pod \"949ffee6-8997-4b92-84c3-4aeb1121bbe1\" (UID: \"949ffee6-8997-4b92-84c3-4aeb1121bbe1\") " Oct 14 13:17:19.195040 master-2 kubenswrapper[4762]: I1014 13:17:19.194966 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/949ffee6-8997-4b92-84c3-4aeb1121bbe1-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "949ffee6-8997-4b92-84c3-4aeb1121bbe1" (UID: "949ffee6-8997-4b92-84c3-4aeb1121bbe1"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:17:19.195300 master-2 kubenswrapper[4762]: I1014 13:17:19.195200 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/949ffee6-8997-4b92-84c3-4aeb1121bbe1-audit-log" (OuterVolumeSpecName: "audit-log") pod "949ffee6-8997-4b92-84c3-4aeb1121bbe1" (UID: "949ffee6-8997-4b92-84c3-4aeb1121bbe1"). InnerVolumeSpecName "audit-log". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:17:19.195794 master-2 kubenswrapper[4762]: I1014 13:17:19.195712 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/949ffee6-8997-4b92-84c3-4aeb1121bbe1-metrics-server-audit-profiles" (OuterVolumeSpecName: "metrics-server-audit-profiles") pod "949ffee6-8997-4b92-84c3-4aeb1121bbe1" (UID: "949ffee6-8997-4b92-84c3-4aeb1121bbe1"). InnerVolumeSpecName "metrics-server-audit-profiles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:17:19.199357 master-2 kubenswrapper[4762]: I1014 13:17:19.199245 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/949ffee6-8997-4b92-84c3-4aeb1121bbe1-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "949ffee6-8997-4b92-84c3-4aeb1121bbe1" (UID: "949ffee6-8997-4b92-84c3-4aeb1121bbe1"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:17:19.199545 master-2 kubenswrapper[4762]: I1014 13:17:19.199488 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/949ffee6-8997-4b92-84c3-4aeb1121bbe1-kube-api-access-9tx5d" (OuterVolumeSpecName: "kube-api-access-9tx5d") pod "949ffee6-8997-4b92-84c3-4aeb1121bbe1" (UID: "949ffee6-8997-4b92-84c3-4aeb1121bbe1"). InnerVolumeSpecName "kube-api-access-9tx5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:17:19.199639 master-2 kubenswrapper[4762]: I1014 13:17:19.199587 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/949ffee6-8997-4b92-84c3-4aeb1121bbe1-client-ca-bundle" (OuterVolumeSpecName: "client-ca-bundle") pod "949ffee6-8997-4b92-84c3-4aeb1121bbe1" (UID: "949ffee6-8997-4b92-84c3-4aeb1121bbe1"). 
InnerVolumeSpecName "client-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:17:19.199922 master-2 kubenswrapper[4762]: I1014 13:17:19.199807 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/949ffee6-8997-4b92-84c3-4aeb1121bbe1-secret-metrics-server-tls" (OuterVolumeSpecName: "secret-metrics-server-tls") pod "949ffee6-8997-4b92-84c3-4aeb1121bbe1" (UID: "949ffee6-8997-4b92-84c3-4aeb1121bbe1"). InnerVolumeSpecName "secret-metrics-server-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:17:19.295350 master-2 kubenswrapper[4762]: I1014 13:17:19.295266 4762 reconciler_common.go:293] "Volume detached for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/949ffee6-8997-4b92-84c3-4aeb1121bbe1-secret-metrics-server-tls\") on node \"master-2\" DevicePath \"\"" Oct 14 13:17:19.295350 master-2 kubenswrapper[4762]: I1014 13:17:19.295332 4762 reconciler_common.go:293] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/949ffee6-8997-4b92-84c3-4aeb1121bbe1-secret-metrics-client-certs\") on node \"master-2\" DevicePath \"\"" Oct 14 13:17:19.295350 master-2 kubenswrapper[4762]: I1014 13:17:19.295355 4762 reconciler_common.go:293] "Volume detached for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/949ffee6-8997-4b92-84c3-4aeb1121bbe1-audit-log\") on node \"master-2\" DevicePath \"\"" Oct 14 13:17:19.295824 master-2 kubenswrapper[4762]: I1014 13:17:19.295378 4762 reconciler_common.go:293] "Volume detached for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949ffee6-8997-4b92-84c3-4aeb1121bbe1-client-ca-bundle\") on node \"master-2\" DevicePath \"\"" Oct 14 13:17:19.295824 master-2 kubenswrapper[4762]: I1014 13:17:19.295397 4762 reconciler_common.go:293] "Volume detached for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/949ffee6-8997-4b92-84c3-4aeb1121bbe1-metrics-server-audit-profiles\") on node \"master-2\" DevicePath \"\"" Oct 14 13:17:19.295824 master-2 kubenswrapper[4762]: I1014 13:17:19.295417 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9tx5d\" (UniqueName: \"kubernetes.io/projected/949ffee6-8997-4b92-84c3-4aeb1121bbe1-kube-api-access-9tx5d\") on node \"master-2\" DevicePath \"\"" Oct 14 13:17:19.295824 master-2 kubenswrapper[4762]: I1014 13:17:19.295435 4762 reconciler_common.go:293] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/949ffee6-8997-4b92-84c3-4aeb1121bbe1-configmap-kubelet-serving-ca-bundle\") on node \"master-2\" DevicePath \"\"" Oct 14 13:17:19.901891 master-2 kubenswrapper[4762]: I1014 13:17:19.901700 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-8475fbcb68-8dq9n" event={"ID":"949ffee6-8997-4b92-84c3-4aeb1121bbe1","Type":"ContainerDied","Data":"7b246d3ce744eca8eaaea6e71479fe531a610918f55baa77f52873670b4b79e9"} Oct 14 13:17:19.901891 master-2 kubenswrapper[4762]: I1014 13:17:19.901795 4762 scope.go:117] "RemoveContainer" containerID="bbd14ec96da76e6b6b207839405f5858f9bdefe7cec9b0ffa533a5f314702f25" Oct 14 13:17:19.901891 master-2 kubenswrapper[4762]: I1014 13:17:19.901826 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-8475fbcb68-8dq9n" Oct 14 13:17:26.887667 master-2 kubenswrapper[4762]: E1014 13:17:26.887578 4762 controller.go:195] "Failed to update lease" err="Put \"https://api-int.ocp.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-2?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 14 13:17:36.888568 master-2 kubenswrapper[4762]: E1014 13:17:36.888422 4762 controller.go:195] "Failed to update lease" err="Put \"https://api-int.ocp.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-2?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 14 13:17:37.020989 master-2 kubenswrapper[4762]: I1014 13:17:37.020916 4762 generic.go:334] "Generic (PLEG): container finished" podID="f82e0c58-e2a3-491a-bf03-ad47b38c5833" containerID="057583bf3783414547befa224cdecf27ac7a84e0c0a4ef9a6cd7473f3af7d3db" exitCode=0 Oct 14 13:17:37.020989 master-2 kubenswrapper[4762]: I1014 13:17:37.020975 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5ddb89f76-887cs" event={"ID":"f82e0c58-e2a3-491a-bf03-ad47b38c5833","Type":"ContainerDied","Data":"057583bf3783414547befa224cdecf27ac7a84e0c0a4ef9a6cd7473f3af7d3db"} Oct 14 13:17:37.020989 master-2 kubenswrapper[4762]: I1014 13:17:37.021013 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5ddb89f76-887cs" event={"ID":"f82e0c58-e2a3-491a-bf03-ad47b38c5833","Type":"ContainerStarted","Data":"dd35e911431438c0c336528085dfa67e391a78e300a8ad57a0438b5be17417ee"} Oct 14 13:17:37.021428 master-2 kubenswrapper[4762]: I1014 13:17:37.021039 4762 scope.go:117] "RemoveContainer" containerID="0a0bc0bcb6877389fc825b824eda24c17af5655791500c8ef2590eb00f894909" Oct 14 13:17:37.895107 master-2 kubenswrapper[4762]: I1014 13:17:37.894995 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5ddb89f76-887cs" Oct 14 13:17:37.899200 master-2 kubenswrapper[4762]: I1014 13:17:37.899124 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5ddb89f76-887cs" Oct 14 13:17:38.038474 master-2 kubenswrapper[4762]: I1014 13:17:38.038396 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5ddb89f76-887cs" Oct 14 13:17:38.042392 master-2 kubenswrapper[4762]: I1014 13:17:38.042270 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5ddb89f76-887cs" Oct 14 13:17:46.889312 master-2 kubenswrapper[4762]: E1014 13:17:46.889085 4762 controller.go:195] "Failed to update lease" err="Put \"https://api-int.ocp.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-2?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 14 13:17:56.890567 master-2 kubenswrapper[4762]: E1014 13:17:56.890479 4762 controller.go:195] "Failed to update lease" err="Put \"https://api-int.ocp.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-2?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 14 13:17:57.163543 master-2 kubenswrapper[4762]: I1014 13:17:57.163472 4762 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-2_94dc80ceea2f1dff7b9e2ec4d1d6aead/kube-controller-manager/0.log" Oct 14 13:17:57.163543 master-2 kubenswrapper[4762]: I1014 13:17:57.163567 4762 generic.go:334] "Generic (PLEG): container finished" podID="94dc80ceea2f1dff7b9e2ec4d1d6aead" containerID="651a658d252f21b182795b06bd916191c32cb19c07c2f8bd03e20093679aa253" exitCode=1 Oct 14 13:17:57.163543 master-2 kubenswrapper[4762]: I1014 13:17:57.163621 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" event={"ID":"94dc80ceea2f1dff7b9e2ec4d1d6aead","Type":"ContainerDied","Data":"651a658d252f21b182795b06bd916191c32cb19c07c2f8bd03e20093679aa253"} Oct 14 13:17:57.164570 master-2 kubenswrapper[4762]: I1014 13:17:57.164437 4762 scope.go:117] "RemoveContainer" containerID="651a658d252f21b182795b06bd916191c32cb19c07c2f8bd03e20093679aa253" Oct 14 13:17:58.175483 master-2 kubenswrapper[4762]: I1014 13:17:58.175389 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-2_94dc80ceea2f1dff7b9e2ec4d1d6aead/kube-controller-manager/0.log" Oct 14 13:17:58.176459 master-2 kubenswrapper[4762]: I1014 13:17:58.175504 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" event={"ID":"94dc80ceea2f1dff7b9e2ec4d1d6aead","Type":"ContainerStarted","Data":"f6b31a7fd130b6260cc195b3f7d19cba7465489352bc38f184c6b11a0414791d"} Oct 14 13:18:03.207175 master-2 kubenswrapper[4762]: I1014 13:18:03.207078 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-5tzml_5e9cbb85-261b-485e-8bd4-b4d38108c06e/approver/0.log" Oct 14 13:18:03.208042 master-2 kubenswrapper[4762]: I1014 13:18:03.207772 4762 generic.go:334] "Generic (PLEG): container finished" podID="5e9cbb85-261b-485e-8bd4-b4d38108c06e" containerID="a4626dd24ee114b1165bf3e69a213527412105025e38e6fa6a694c9aeb3ab6d5" exitCode=1 Oct 14 13:18:03.208042 master-2 kubenswrapper[4762]: I1014 13:18:03.207840 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-5tzml" event={"ID":"5e9cbb85-261b-485e-8bd4-b4d38108c06e","Type":"ContainerDied","Data":"a4626dd24ee114b1165bf3e69a213527412105025e38e6fa6a694c9aeb3ab6d5"} Oct 14 13:18:03.208589 master-2 kubenswrapper[4762]: I1014 13:18:03.208522 4762 scope.go:117] "RemoveContainer" containerID="a4626dd24ee114b1165bf3e69a213527412105025e38e6fa6a694c9aeb3ab6d5" Oct 14 13:18:04.216547 master-2 kubenswrapper[4762]: I1014 13:18:04.216468 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-5tzml_5e9cbb85-261b-485e-8bd4-b4d38108c06e/approver/0.log" Oct 14 13:18:04.217537 master-2 kubenswrapper[4762]: I1014 13:18:04.216952 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-5tzml" event={"ID":"5e9cbb85-261b-485e-8bd4-b4d38108c06e","Type":"ContainerStarted","Data":"410b4e023c21c0af48b81df8a788e77cef1035739e4e4f3fc2729b86e2de6d54"} Oct 14 13:18:05.822378 master-2 kubenswrapper[4762]: I1014 13:18:05.822241 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" Oct 14 13:18:05.822378 master-2 kubenswrapper[4762]: I1014 13:18:05.822350 4762 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" Oct 14 13:18:05.829078 master-2 kubenswrapper[4762]: I1014 13:18:05.829010 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" Oct 14 13:18:06.891613 master-2 kubenswrapper[4762]: E1014 13:18:06.891490 4762 controller.go:195] "Failed to update lease" err="Put \"https://api-int.ocp.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-2?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 14 13:18:06.892572 master-2 kubenswrapper[4762]: I1014 13:18:06.891789 4762 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Oct 14 13:18:15.829585 master-2 kubenswrapper[4762]: I1014 13:18:15.829499 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" Oct 14 13:18:19.088574 master-2 kubenswrapper[4762]: I1014 13:18:19.088473 4762 status_manager.go:851] "Failed to get status for pod" podUID="949ffee6-8997-4b92-84c3-4aeb1121bbe1" pod="openshift-monitoring/metrics-server-8475fbcb68-8dq9n" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods metrics-server-8475fbcb68-8dq9n)" Oct 14 13:18:59.094823 master-2 kubenswrapper[4762]: I1014 13:18:59.094756 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/metrics-server-8475fbcb68-8dq9n"] Oct 14 13:18:59.114182 master-2 kubenswrapper[4762]: I1014 13:18:59.105279 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/metrics-server-8475fbcb68-8dq9n"] Oct 14 13:18:59.554229 master-2 kubenswrapper[4762]: I1014 13:18:59.554133 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="949ffee6-8997-4b92-84c3-4aeb1121bbe1" path="/var/lib/kubelet/pods/949ffee6-8997-4b92-84c3-4aeb1121bbe1/volumes" Oct 14 13:18:59.920016 master-2 kubenswrapper[4762]: I1014 13:18:59.919952 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"openshift-service-ca.crt" Oct 14 13:18:59.929313 master-2 kubenswrapper[4762]: I1014 13:18:59.929272 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 14 13:18:59.961361 master-2 kubenswrapper[4762]: I1014 13:18:59.961314 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt" Oct 14 13:18:59.983945 master-2 kubenswrapper[4762]: I1014 13:18:59.983884 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" Oct 14 13:19:00.002276 master-2 kubenswrapper[4762]: I1014 13:19:00.002207 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd"/"kube-root-ca.crt" Oct 14 13:19:00.004567 master-2 kubenswrapper[4762]: I1014 13:19:00.004506 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 14 13:19:00.008377 master-2 kubenswrapper[4762]: I1014 13:19:00.008352 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 14 13:19:00.009122 master-2 kubenswrapper[4762]: 
I1014 13:19:00.009079 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 14 13:19:00.040826 master-2 kubenswrapper[4762]: I1014 13:19:00.040751 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 14 13:19:00.044487 master-2 kubenswrapper[4762]: I1014 13:19:00.044441 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 14 13:19:00.062577 master-2 kubenswrapper[4762]: I1014 13:19:00.062507 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Oct 14 13:19:00.070095 master-2 kubenswrapper[4762]: I1014 13:19:00.070027 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt" Oct 14 13:19:00.088633 master-2 kubenswrapper[4762]: I1014 13:19:00.088577 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Oct 14 13:19:00.106928 master-2 kubenswrapper[4762]: I1014 13:19:00.106850 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 14 13:19:00.107781 master-2 kubenswrapper[4762]: I1014 13:19:00.107749 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Oct 14 13:19:00.117243 master-2 kubenswrapper[4762]: I1014 13:19:00.117180 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 14 13:19:00.163329 master-2 kubenswrapper[4762]: I1014 13:19:00.163265 4762 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Oct 14 13:19:00.171213 master-2 kubenswrapper[4762]: I1014 13:19:00.171060 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"federate-client-certs" Oct 14 13:19:00.181788 master-2 kubenswrapper[4762]: I1014 13:19:00.181742 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 14 13:19:00.187615 master-2 kubenswrapper[4762]: I1014 13:19:00.187592 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 14 13:19:00.241681 master-2 kubenswrapper[4762]: I1014 13:19:00.241627 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" Oct 14 13:19:00.258735 master-2 kubenswrapper[4762]: I1014 13:19:00.258688 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-config" Oct 14 13:19:00.294563 master-2 kubenswrapper[4762]: I1014 13:19:00.294479 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 14 13:19:00.314462 master-2 kubenswrapper[4762]: I1014 13:19:00.314393 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-8gpjk" Oct 14 13:19:00.439509 master-2 kubenswrapper[4762]: I1014 13:19:00.439341 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Oct 14 13:19:00.439756 master-2 kubenswrapper[4762]: I1014 13:19:00.439621 4762 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 14 13:19:00.442376 master-2 kubenswrapper[4762]: I1014 13:19:00.442321 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Oct 14 13:19:00.477725 master-2 kubenswrapper[4762]: I1014 13:19:00.477672 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap" Oct 14 13:19:00.499450 master-2 kubenswrapper[4762]: I1014 13:19:00.499398 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 14 13:19:00.530575 master-2 kubenswrapper[4762]: I1014 13:19:00.530536 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config" Oct 14 13:19:00.554997 master-2 kubenswrapper[4762]: I1014 13:19:00.554957 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 14 13:19:00.555890 master-2 kubenswrapper[4762]: I1014 13:19:00.555875 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 14 13:19:00.571235 master-2 kubenswrapper[4762]: I1014 13:19:00.571185 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 14 13:19:00.576199 master-2 kubenswrapper[4762]: I1014 13:19:00.576144 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls" Oct 14 13:19:00.611261 master-2 kubenswrapper[4762]: I1014 13:19:00.606787 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Oct 14 13:19:00.669781 master-2 kubenswrapper[4762]: I1014 13:19:00.669707 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 14 13:19:00.689100 master-2 kubenswrapper[4762]: I1014 13:19:00.689036 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"default-dockercfg-vv6lx" Oct 14 13:19:05.893420 master-2 kubenswrapper[4762]: I1014 13:19:05.893269 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-oauth-apiserver/apiserver-7b6784d654-l7lmp"] Oct 14 13:19:05.894716 master-2 kubenswrapper[4762]: I1014 13:19:05.893838 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-oauth-apiserver/apiserver-7b6784d654-l7lmp" podUID="31803cc5-bd42-4bb2-8872-79acd1f79d5b" containerName="oauth-apiserver" containerID="cri-o://86954708c4083ce02b3287f821780b8c962df87887fa2a2204ed39142954e4f0" gracePeriod=120 Oct 14 13:19:07.218931 master-2 kubenswrapper[4762]: I1014 13:19:07.217575 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-56cfb99cfd-rq5ck"] Oct 14 13:19:07.218931 master-2 kubenswrapper[4762]: I1014 13:19:07.217874 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-56cfb99cfd-rq5ck" podUID="10f29de4-fd52-45da-a0d9-b9cb67146af1" containerName="controller-manager" containerID="cri-o://5766e6ab7e0fbe733c4c8f035d22a06c9a1742f998c96cf3b6ae13fa635e1fdd" gracePeriod=30 Oct 14 13:19:07.352241 master-2 
kubenswrapper[4762]: I1014 13:19:07.352170 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77674cffc8-gf5tz"] Oct 14 13:19:07.352574 master-2 kubenswrapper[4762]: I1014 13:19:07.352512 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-77674cffc8-gf5tz" podUID="213dbdcb-5bd7-48a6-9365-1f643ea3bbea" containerName="route-controller-manager" containerID="cri-o://3f1909d8c7190ca600beb76aceba301114ff7f52320e80833aa4a063f90a9100" gracePeriod=30 Oct 14 13:19:07.548634 master-2 kubenswrapper[4762]: I1014 13:19:07.548577 4762 patch_prober.go:28] interesting pod/apiserver-7b6784d654-l7lmp container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:19:07.548634 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:19:07.548634 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:19:07.548634 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:19:07.548634 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:19:07.548634 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:19:07.548634 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:19:07.548634 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:19:07.548634 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:19:07.548634 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartUserInformer ok Oct 14 13:19:07.548634 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartOAuthInformer ok Oct 14 13:19:07.548634 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Oct 14 13:19:07.548634 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:19:07.548634 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:19:07.549383 master-2 kubenswrapper[4762]: I1014 13:19:07.548646 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-7b6784d654-l7lmp" podUID="31803cc5-bd42-4bb2-8872-79acd1f79d5b" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:19:07.657400 master-2 kubenswrapper[4762]: I1014 13:19:07.656539 4762 generic.go:334] "Generic (PLEG): container finished" podID="10f29de4-fd52-45da-a0d9-b9cb67146af1" containerID="5766e6ab7e0fbe733c4c8f035d22a06c9a1742f998c96cf3b6ae13fa635e1fdd" exitCode=0 Oct 14 13:19:07.657400 master-2 kubenswrapper[4762]: I1014 13:19:07.656666 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-56cfb99cfd-rq5ck" event={"ID":"10f29de4-fd52-45da-a0d9-b9cb67146af1","Type":"ContainerDied","Data":"5766e6ab7e0fbe733c4c8f035d22a06c9a1742f998c96cf3b6ae13fa635e1fdd"} Oct 14 13:19:07.664810 master-2 kubenswrapper[4762]: I1014 13:19:07.661824 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-ddd7d64cd-hph6v_0500c75a-3460-4279-a8d8-cebf242e6089/snapshot-controller/0.log" Oct 14 13:19:07.664810 master-2 kubenswrapper[4762]: I1014 13:19:07.661890 4762 generic.go:334] "Generic (PLEG): container finished" podID="0500c75a-3460-4279-a8d8-cebf242e6089" 
containerID="f1826adc7eb4b42b9fe04d89bcd7c131c240322cb4e3c10c8dc7650224a29fba" exitCode=1 Oct 14 13:19:07.664810 master-2 kubenswrapper[4762]: I1014 13:19:07.661975 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-ddd7d64cd-hph6v" event={"ID":"0500c75a-3460-4279-a8d8-cebf242e6089","Type":"ContainerDied","Data":"f1826adc7eb4b42b9fe04d89bcd7c131c240322cb4e3c10c8dc7650224a29fba"} Oct 14 13:19:07.664810 master-2 kubenswrapper[4762]: I1014 13:19:07.663091 4762 scope.go:117] "RemoveContainer" containerID="f1826adc7eb4b42b9fe04d89bcd7c131c240322cb4e3c10c8dc7650224a29fba" Oct 14 13:19:07.667770 master-2 kubenswrapper[4762]: I1014 13:19:07.667610 4762 generic.go:334] "Generic (PLEG): container finished" podID="213dbdcb-5bd7-48a6-9365-1f643ea3bbea" containerID="3f1909d8c7190ca600beb76aceba301114ff7f52320e80833aa4a063f90a9100" exitCode=0 Oct 14 13:19:07.667770 master-2 kubenswrapper[4762]: I1014 13:19:07.667679 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-77674cffc8-gf5tz" event={"ID":"213dbdcb-5bd7-48a6-9365-1f643ea3bbea","Type":"ContainerDied","Data":"3f1909d8c7190ca600beb76aceba301114ff7f52320e80833aa4a063f90a9100"} Oct 14 13:19:07.760186 master-2 kubenswrapper[4762]: I1014 13:19:07.759889 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-56cfb99cfd-rq5ck" Oct 14 13:19:07.835864 master-2 kubenswrapper[4762]: I1014 13:19:07.835810 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-77674cffc8-gf5tz" Oct 14 13:19:07.843736 master-2 kubenswrapper[4762]: I1014 13:19:07.843691 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-5-master-2"] Oct 14 13:19:07.843939 master-2 kubenswrapper[4762]: E1014 13:19:07.843908 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10f29de4-fd52-45da-a0d9-b9cb67146af1" containerName="controller-manager" Oct 14 13:19:07.843939 master-2 kubenswrapper[4762]: I1014 13:19:07.843935 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="10f29de4-fd52-45da-a0d9-b9cb67146af1" containerName="controller-manager" Oct 14 13:19:07.844008 master-2 kubenswrapper[4762]: E1014 13:19:07.843959 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08b313e4-ea57-4f9c-ad72-1f640ef21c52" containerName="installer" Oct 14 13:19:07.844008 master-2 kubenswrapper[4762]: I1014 13:19:07.843969 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="08b313e4-ea57-4f9c-ad72-1f640ef21c52" containerName="installer" Oct 14 13:19:07.844008 master-2 kubenswrapper[4762]: E1014 13:19:07.843978 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="213dbdcb-5bd7-48a6-9365-1f643ea3bbea" containerName="route-controller-manager" Oct 14 13:19:07.844008 master-2 kubenswrapper[4762]: I1014 13:19:07.843984 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="213dbdcb-5bd7-48a6-9365-1f643ea3bbea" containerName="route-controller-manager" Oct 14 13:19:07.844008 master-2 kubenswrapper[4762]: E1014 13:19:07.843993 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="949ffee6-8997-4b92-84c3-4aeb1121bbe1" containerName="metrics-server" Oct 14 13:19:07.844008 master-2 kubenswrapper[4762]: I1014 13:19:07.843999 4762 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="949ffee6-8997-4b92-84c3-4aeb1121bbe1" containerName="metrics-server" Oct 14 13:19:07.844200 master-2 kubenswrapper[4762]: I1014 13:19:07.844077 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="08b313e4-ea57-4f9c-ad72-1f640ef21c52" containerName="installer" Oct 14 13:19:07.844200 master-2 kubenswrapper[4762]: I1014 13:19:07.844092 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="949ffee6-8997-4b92-84c3-4aeb1121bbe1" containerName="metrics-server" Oct 14 13:19:07.844200 master-2 kubenswrapper[4762]: I1014 13:19:07.844100 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="10f29de4-fd52-45da-a0d9-b9cb67146af1" containerName="controller-manager" Oct 14 13:19:07.844200 master-2 kubenswrapper[4762]: I1014 13:19:07.844107 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="213dbdcb-5bd7-48a6-9365-1f643ea3bbea" containerName="route-controller-manager" Oct 14 13:19:07.844638 master-2 kubenswrapper[4762]: I1014 13:19:07.844609 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-5-master-2" Oct 14 13:19:07.848210 master-2 kubenswrapper[4762]: I1014 13:19:07.847421 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-sdwrm" Oct 14 13:19:07.872991 master-2 kubenswrapper[4762]: I1014 13:19:07.872941 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-5-master-2"] Oct 14 13:19:07.927550 master-2 kubenswrapper[4762]: I1014 13:19:07.927467 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10f29de4-fd52-45da-a0d9-b9cb67146af1-config\") pod \"10f29de4-fd52-45da-a0d9-b9cb67146af1\" (UID: \"10f29de4-fd52-45da-a0d9-b9cb67146af1\") " Oct 14 13:19:07.927772 master-2 kubenswrapper[4762]: I1014 13:19:07.927560 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/10f29de4-fd52-45da-a0d9-b9cb67146af1-proxy-ca-bundles\") pod \"10f29de4-fd52-45da-a0d9-b9cb67146af1\" (UID: \"10f29de4-fd52-45da-a0d9-b9cb67146af1\") " Oct 14 13:19:07.927772 master-2 kubenswrapper[4762]: I1014 13:19:07.927613 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrdhm\" (UniqueName: \"kubernetes.io/projected/10f29de4-fd52-45da-a0d9-b9cb67146af1-kube-api-access-wrdhm\") pod \"10f29de4-fd52-45da-a0d9-b9cb67146af1\" (UID: \"10f29de4-fd52-45da-a0d9-b9cb67146af1\") " Oct 14 13:19:07.927858 master-2 kubenswrapper[4762]: I1014 13:19:07.927788 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/10f29de4-fd52-45da-a0d9-b9cb67146af1-client-ca\") pod \"10f29de4-fd52-45da-a0d9-b9cb67146af1\" (UID: \"10f29de4-fd52-45da-a0d9-b9cb67146af1\") " Oct 14 13:19:07.927858 master-2 kubenswrapper[4762]: I1014 13:19:07.927824 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10f29de4-fd52-45da-a0d9-b9cb67146af1-serving-cert\") pod \"10f29de4-fd52-45da-a0d9-b9cb67146af1\" (UID: \"10f29de4-fd52-45da-a0d9-b9cb67146af1\") " Oct 14 13:19:07.927858 master-2 kubenswrapper[4762]: I1014 13:19:07.927845 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/213dbdcb-5bd7-48a6-9365-1f643ea3bbea-serving-cert\") pod \"213dbdcb-5bd7-48a6-9365-1f643ea3bbea\" (UID: \"213dbdcb-5bd7-48a6-9365-1f643ea3bbea\") " Oct 14 13:19:07.928090 master-2 kubenswrapper[4762]: I1014 13:19:07.928051 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7d69ccf3-0dde-4f0a-acb0-beb8112b2650-kube-api-access\") pod \"installer-5-master-2\" (UID: \"7d69ccf3-0dde-4f0a-acb0-beb8112b2650\") " pod="openshift-kube-controller-manager/installer-5-master-2" Oct 14 13:19:07.928142 master-2 kubenswrapper[4762]: I1014 13:19:07.928081 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10f29de4-fd52-45da-a0d9-b9cb67146af1-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "10f29de4-fd52-45da-a0d9-b9cb67146af1" (UID: "10f29de4-fd52-45da-a0d9-b9cb67146af1"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:19:07.928142 master-2 kubenswrapper[4762]: I1014 13:19:07.928110 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7d69ccf3-0dde-4f0a-acb0-beb8112b2650-kubelet-dir\") pod \"installer-5-master-2\" (UID: \"7d69ccf3-0dde-4f0a-acb0-beb8112b2650\") " pod="openshift-kube-controller-manager/installer-5-master-2" Oct 14 13:19:07.928267 master-2 kubenswrapper[4762]: I1014 13:19:07.928144 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7d69ccf3-0dde-4f0a-acb0-beb8112b2650-var-lock\") pod \"installer-5-master-2\" (UID: \"7d69ccf3-0dde-4f0a-acb0-beb8112b2650\") " pod="openshift-kube-controller-manager/installer-5-master-2" Oct 14 13:19:07.928267 master-2 kubenswrapper[4762]: I1014 13:19:07.928179 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10f29de4-fd52-45da-a0d9-b9cb67146af1-config" (OuterVolumeSpecName: "config") pod "10f29de4-fd52-45da-a0d9-b9cb67146af1" (UID: "10f29de4-fd52-45da-a0d9-b9cb67146af1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:19:07.928267 master-2 kubenswrapper[4762]: I1014 13:19:07.928244 4762 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/10f29de4-fd52-45da-a0d9-b9cb67146af1-proxy-ca-bundles\") on node \"master-2\" DevicePath \"\"" Oct 14 13:19:07.929110 master-2 kubenswrapper[4762]: I1014 13:19:07.929015 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10f29de4-fd52-45da-a0d9-b9cb67146af1-client-ca" (OuterVolumeSpecName: "client-ca") pod "10f29de4-fd52-45da-a0d9-b9cb67146af1" (UID: "10f29de4-fd52-45da-a0d9-b9cb67146af1"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:19:07.932523 master-2 kubenswrapper[4762]: I1014 13:19:07.932450 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10f29de4-fd52-45da-a0d9-b9cb67146af1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "10f29de4-fd52-45da-a0d9-b9cb67146af1" (UID: "10f29de4-fd52-45da-a0d9-b9cb67146af1"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:19:07.932777 master-2 kubenswrapper[4762]: I1014 13:19:07.932670 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10f29de4-fd52-45da-a0d9-b9cb67146af1-kube-api-access-wrdhm" (OuterVolumeSpecName: "kube-api-access-wrdhm") pod "10f29de4-fd52-45da-a0d9-b9cb67146af1" (UID: "10f29de4-fd52-45da-a0d9-b9cb67146af1"). InnerVolumeSpecName "kube-api-access-wrdhm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:19:07.933514 master-2 kubenswrapper[4762]: I1014 13:19:07.933483 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/213dbdcb-5bd7-48a6-9365-1f643ea3bbea-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "213dbdcb-5bd7-48a6-9365-1f643ea3bbea" (UID: "213dbdcb-5bd7-48a6-9365-1f643ea3bbea"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:19:08.029049 master-2 kubenswrapper[4762]: I1014 13:19:08.028891 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/213dbdcb-5bd7-48a6-9365-1f643ea3bbea-client-ca\") pod \"213dbdcb-5bd7-48a6-9365-1f643ea3bbea\" (UID: \"213dbdcb-5bd7-48a6-9365-1f643ea3bbea\") " Oct 14 13:19:08.029049 master-2 kubenswrapper[4762]: I1014 13:19:08.028987 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgz95\" (UniqueName: \"kubernetes.io/projected/213dbdcb-5bd7-48a6-9365-1f643ea3bbea-kube-api-access-dgz95\") pod \"213dbdcb-5bd7-48a6-9365-1f643ea3bbea\" (UID: \"213dbdcb-5bd7-48a6-9365-1f643ea3bbea\") " Oct 14 13:19:08.029307 master-2 kubenswrapper[4762]: I1014 13:19:08.029178 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/213dbdcb-5bd7-48a6-9365-1f643ea3bbea-config\") pod \"213dbdcb-5bd7-48a6-9365-1f643ea3bbea\" (UID: \"213dbdcb-5bd7-48a6-9365-1f643ea3bbea\") " Oct 14 13:19:08.029361 master-2 kubenswrapper[4762]: I1014 13:19:08.029305 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7d69ccf3-0dde-4f0a-acb0-beb8112b2650-var-lock\") pod \"installer-5-master-2\" (UID: \"7d69ccf3-0dde-4f0a-acb0-beb8112b2650\") " pod="openshift-kube-controller-manager/installer-5-master-2" Oct 14 13:19:08.029410 master-2 kubenswrapper[4762]: I1014 13:19:08.029368 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7d69ccf3-0dde-4f0a-acb0-beb8112b2650-kube-api-access\") pod \"installer-5-master-2\" (UID: \"7d69ccf3-0dde-4f0a-acb0-beb8112b2650\") " pod="openshift-kube-controller-manager/installer-5-master-2" Oct 14 13:19:08.029446 master-2 kubenswrapper[4762]: I1014 13:19:08.029427 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7d69ccf3-0dde-4f0a-acb0-beb8112b2650-kubelet-dir\") pod \"installer-5-master-2\" (UID: \"7d69ccf3-0dde-4f0a-acb0-beb8112b2650\") " pod="openshift-kube-controller-manager/installer-5-master-2" Oct 14 13:19:08.029528 master-2 kubenswrapper[4762]: I1014 13:19:08.029485 4762 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/10f29de4-fd52-45da-a0d9-b9cb67146af1-client-ca\") on node \"master-2\" DevicePath \"\"" Oct 14 
13:19:08.029568 master-2 kubenswrapper[4762]: I1014 13:19:08.029520 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7d69ccf3-0dde-4f0a-acb0-beb8112b2650-var-lock\") pod \"installer-5-master-2\" (UID: \"7d69ccf3-0dde-4f0a-acb0-beb8112b2650\") " pod="openshift-kube-controller-manager/installer-5-master-2" Oct 14 13:19:08.029603 master-2 kubenswrapper[4762]: I1014 13:19:08.029570 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7d69ccf3-0dde-4f0a-acb0-beb8112b2650-kubelet-dir\") pod \"installer-5-master-2\" (UID: \"7d69ccf3-0dde-4f0a-acb0-beb8112b2650\") " pod="openshift-kube-controller-manager/installer-5-master-2" Oct 14 13:19:08.029603 master-2 kubenswrapper[4762]: I1014 13:19:08.029532 4762 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10f29de4-fd52-45da-a0d9-b9cb67146af1-serving-cert\") on node \"master-2\" DevicePath \"\"" Oct 14 13:19:08.029662 master-2 kubenswrapper[4762]: I1014 13:19:08.029616 4762 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/213dbdcb-5bd7-48a6-9365-1f643ea3bbea-serving-cert\") on node \"master-2\" DevicePath \"\"" Oct 14 13:19:08.029662 master-2 kubenswrapper[4762]: I1014 13:19:08.029633 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10f29de4-fd52-45da-a0d9-b9cb67146af1-config\") on node \"master-2\" DevicePath \"\"" Oct 14 13:19:08.029662 master-2 kubenswrapper[4762]: I1014 13:19:08.029649 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrdhm\" (UniqueName: \"kubernetes.io/projected/10f29de4-fd52-45da-a0d9-b9cb67146af1-kube-api-access-wrdhm\") on node \"master-2\" DevicePath \"\"" Oct 14 13:19:08.029840 master-2 kubenswrapper[4762]: I1014 13:19:08.029754 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/213dbdcb-5bd7-48a6-9365-1f643ea3bbea-client-ca" (OuterVolumeSpecName: "client-ca") pod "213dbdcb-5bd7-48a6-9365-1f643ea3bbea" (UID: "213dbdcb-5bd7-48a6-9365-1f643ea3bbea"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:19:08.030397 master-2 kubenswrapper[4762]: I1014 13:19:08.030330 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/213dbdcb-5bd7-48a6-9365-1f643ea3bbea-config" (OuterVolumeSpecName: "config") pod "213dbdcb-5bd7-48a6-9365-1f643ea3bbea" (UID: "213dbdcb-5bd7-48a6-9365-1f643ea3bbea"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:19:08.034788 master-2 kubenswrapper[4762]: I1014 13:19:08.034566 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/213dbdcb-5bd7-48a6-9365-1f643ea3bbea-kube-api-access-dgz95" (OuterVolumeSpecName: "kube-api-access-dgz95") pod "213dbdcb-5bd7-48a6-9365-1f643ea3bbea" (UID: "213dbdcb-5bd7-48a6-9365-1f643ea3bbea"). InnerVolumeSpecName "kube-api-access-dgz95". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:19:08.060356 master-2 kubenswrapper[4762]: I1014 13:19:08.060288 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7d69ccf3-0dde-4f0a-acb0-beb8112b2650-kube-api-access\") pod \"installer-5-master-2\" (UID: \"7d69ccf3-0dde-4f0a-acb0-beb8112b2650\") " pod="openshift-kube-controller-manager/installer-5-master-2" Oct 14 13:19:08.131393 master-2 kubenswrapper[4762]: I1014 13:19:08.131295 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/213dbdcb-5bd7-48a6-9365-1f643ea3bbea-config\") on node \"master-2\" DevicePath \"\"" Oct 14 13:19:08.131393 master-2 kubenswrapper[4762]: I1014 13:19:08.131336 4762 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/213dbdcb-5bd7-48a6-9365-1f643ea3bbea-client-ca\") on node \"master-2\" DevicePath \"\"" Oct 14 13:19:08.132024 master-2 kubenswrapper[4762]: I1014 13:19:08.131476 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgz95\" (UniqueName: \"kubernetes.io/projected/213dbdcb-5bd7-48a6-9365-1f643ea3bbea-kube-api-access-dgz95\") on node \"master-2\" DevicePath \"\"" Oct 14 13:19:08.157354 master-2 kubenswrapper[4762]: I1014 13:19:08.157097 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-5-master-2" Oct 14 13:19:08.664299 master-2 kubenswrapper[4762]: I1014 13:19:08.661633 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-5-master-2"] Oct 14 13:19:08.689458 master-2 kubenswrapper[4762]: I1014 13:19:08.689212 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-ddd7d64cd-hph6v_0500c75a-3460-4279-a8d8-cebf242e6089/snapshot-controller/0.log" Oct 14 13:19:08.689629 master-2 kubenswrapper[4762]: I1014 13:19:08.689456 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-ddd7d64cd-hph6v" event={"ID":"0500c75a-3460-4279-a8d8-cebf242e6089","Type":"ContainerStarted","Data":"f809ac25b1cd8ab36d7e7e3881225e788f414f161ddf5b1d43da6b86636ac3be"} Oct 14 13:19:08.692439 master-2 kubenswrapper[4762]: I1014 13:19:08.691731 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-77674cffc8-gf5tz" event={"ID":"213dbdcb-5bd7-48a6-9365-1f643ea3bbea","Type":"ContainerDied","Data":"7466ead3c472256701902904780dd7d9eb11f2bd98bf9d9fbc6f9e51d477a519"} Oct 14 13:19:08.692439 master-2 kubenswrapper[4762]: I1014 13:19:08.691798 4762 scope.go:117] "RemoveContainer" containerID="3f1909d8c7190ca600beb76aceba301114ff7f52320e80833aa4a063f90a9100" Oct 14 13:19:08.692439 master-2 kubenswrapper[4762]: I1014 13:19:08.691946 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-77674cffc8-gf5tz" Oct 14 13:19:08.705602 master-2 kubenswrapper[4762]: I1014 13:19:08.705541 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-56cfb99cfd-rq5ck" event={"ID":"10f29de4-fd52-45da-a0d9-b9cb67146af1","Type":"ContainerDied","Data":"5fa1d42af7af58971b70d518787f54e6878842e22b5a3b27be7370a0308b47fa"} Oct 14 13:19:08.705804 master-2 kubenswrapper[4762]: I1014 13:19:08.705631 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-56cfb99cfd-rq5ck" Oct 14 13:19:08.714558 master-2 kubenswrapper[4762]: I1014 13:19:08.714510 4762 scope.go:117] "RemoveContainer" containerID="5766e6ab7e0fbe733c4c8f035d22a06c9a1742f998c96cf3b6ae13fa635e1fdd" Oct 14 13:19:08.795986 master-2 kubenswrapper[4762]: I1014 13:19:08.795890 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-56cfb99cfd-rq5ck"] Oct 14 13:19:08.808632 master-2 kubenswrapper[4762]: I1014 13:19:08.808469 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-56cfb99cfd-rq5ck"] Oct 14 13:19:08.840902 master-2 kubenswrapper[4762]: I1014 13:19:08.840842 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77674cffc8-gf5tz"] Oct 14 13:19:08.868468 master-2 kubenswrapper[4762]: I1014 13:19:08.868410 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77674cffc8-gf5tz"] Oct 14 13:19:09.555598 master-2 kubenswrapper[4762]: I1014 13:19:09.555544 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10f29de4-fd52-45da-a0d9-b9cb67146af1" path="/var/lib/kubelet/pods/10f29de4-fd52-45da-a0d9-b9cb67146af1/volumes" Oct 14 13:19:09.556083 master-2 kubenswrapper[4762]: I1014 13:19:09.556057 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="213dbdcb-5bd7-48a6-9365-1f643ea3bbea" path="/var/lib/kubelet/pods/213dbdcb-5bd7-48a6-9365-1f643ea3bbea/volumes" Oct 14 13:19:09.718847 master-2 kubenswrapper[4762]: I1014 13:19:09.718746 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-5-master-2" event={"ID":"7d69ccf3-0dde-4f0a-acb0-beb8112b2650","Type":"ContainerStarted","Data":"4ea3fb71eac2d76d281838a3fb8e1644651fc647f1eff0be95ee902c3ac431af"} Oct 14 13:19:09.718847 master-2 kubenswrapper[4762]: I1014 13:19:09.718809 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-5-master-2" event={"ID":"7d69ccf3-0dde-4f0a-acb0-beb8112b2650","Type":"ContainerStarted","Data":"5bba306527c096aa0adf358bdd76a124de5e9e9905f5d2116ac98a8240a55288"} Oct 14 13:19:09.722698 master-2 kubenswrapper[4762]: I1014 13:19:09.722635 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-779749f859-bscv5_18346e46-a062-4e0d-b90a-c05646a46c7e/kube-rbac-proxy/4.log" Oct 14 13:19:09.723420 master-2 kubenswrapper[4762]: I1014 13:19:09.723378 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-779749f859-bscv5_18346e46-a062-4e0d-b90a-c05646a46c7e/config-sync-controllers/0.log" Oct 14 13:19:09.724012 
master-2 kubenswrapper[4762]: I1014 13:19:09.723964 4762 generic.go:334] "Generic (PLEG): container finished" podID="18346e46-a062-4e0d-b90a-c05646a46c7e" containerID="18f5eac455c3f97c664881c24b67138f1d4f342782bcadba7e8667d46225fa69" exitCode=1 Oct 14 13:19:09.724209 master-2 kubenswrapper[4762]: I1014 13:19:09.724023 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-779749f859-bscv5" event={"ID":"18346e46-a062-4e0d-b90a-c05646a46c7e","Type":"ContainerDied","Data":"18f5eac455c3f97c664881c24b67138f1d4f342782bcadba7e8667d46225fa69"} Oct 14 13:19:09.724956 master-2 kubenswrapper[4762]: I1014 13:19:09.724925 4762 scope.go:117] "RemoveContainer" containerID="18f5eac455c3f97c664881c24b67138f1d4f342782bcadba7e8667d46225fa69" Oct 14 13:19:09.795433 master-2 kubenswrapper[4762]: I1014 13:19:09.795142 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-5-master-2" podStartSLOduration=2.795108304 podStartE2EDuration="2.795108304s" podCreationTimestamp="2025-10-14 13:19:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:19:09.794290668 +0000 UTC m=+779.038449837" watchObservedRunningTime="2025-10-14 13:19:09.795108304 +0000 UTC m=+779.039267463" Oct 14 13:19:10.731655 master-2 kubenswrapper[4762]: I1014 13:19:10.731606 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-779749f859-bscv5_18346e46-a062-4e0d-b90a-c05646a46c7e/kube-rbac-proxy/4.log" Oct 14 13:19:10.732289 master-2 kubenswrapper[4762]: I1014 13:19:10.732257 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-779749f859-bscv5_18346e46-a062-4e0d-b90a-c05646a46c7e/config-sync-controllers/0.log" Oct 14 13:19:10.732846 master-2 kubenswrapper[4762]: I1014 13:19:10.732812 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-779749f859-bscv5" event={"ID":"18346e46-a062-4e0d-b90a-c05646a46c7e","Type":"ContainerStarted","Data":"a358d7abda0d485061e6c1f4e284a26ddbef31d773fd97d91d6c75ae8ad87317"} Oct 14 13:19:12.542744 master-2 kubenswrapper[4762]: I1014 13:19:12.542587 4762 patch_prober.go:28] interesting pod/apiserver-7b6784d654-l7lmp container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:19:12.542744 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:19:12.542744 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:19:12.542744 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:19:12.542744 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:19:12.542744 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:19:12.542744 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:19:12.542744 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:19:12.542744 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:19:12.542744 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartUserInformer ok Oct 
14 13:19:12.542744 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartOAuthInformer ok Oct 14 13:19:12.542744 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Oct 14 13:19:12.542744 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:19:12.542744 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:19:12.542744 master-2 kubenswrapper[4762]: I1014 13:19:12.542710 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-7b6784d654-l7lmp" podUID="31803cc5-bd42-4bb2-8872-79acd1f79d5b" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:19:17.544131 master-2 kubenswrapper[4762]: I1014 13:19:17.544034 4762 patch_prober.go:28] interesting pod/apiserver-7b6784d654-l7lmp container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:19:17.544131 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:19:17.544131 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:19:17.544131 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:19:17.544131 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:19:17.544131 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:19:17.544131 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:19:17.544131 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:19:17.544131 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:19:17.544131 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartUserInformer ok Oct 14 13:19:17.544131 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartOAuthInformer ok Oct 14 13:19:17.544131 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Oct 14 13:19:17.544131 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:19:17.544131 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:19:17.545599 master-2 kubenswrapper[4762]: I1014 13:19:17.544177 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-7b6784d654-l7lmp" podUID="31803cc5-bd42-4bb2-8872-79acd1f79d5b" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:19:17.545599 master-2 kubenswrapper[4762]: I1014 13:19:17.544333 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7b6784d654-l7lmp" Oct 14 13:19:22.544776 master-2 kubenswrapper[4762]: I1014 13:19:22.544661 4762 patch_prober.go:28] interesting pod/apiserver-7b6784d654-l7lmp container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:19:22.544776 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:19:22.544776 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:19:22.544776 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:19:22.544776 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:19:22.544776 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:19:22.544776 master-2 kubenswrapper[4762]: 
[+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:19:22.544776 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:19:22.544776 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:19:22.544776 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartUserInformer ok Oct 14 13:19:22.544776 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartOAuthInformer ok Oct 14 13:19:22.544776 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Oct 14 13:19:22.544776 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:19:22.544776 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:19:22.544776 master-2 kubenswrapper[4762]: I1014 13:19:22.544773 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-7b6784d654-l7lmp" podUID="31803cc5-bd42-4bb2-8872-79acd1f79d5b" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:19:25.880379 master-2 kubenswrapper[4762]: I1014 13:19:25.880310 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-6-master-2"] Oct 14 13:19:25.881112 master-2 kubenswrapper[4762]: I1014 13:19:25.881081 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-6-master-2" Oct 14 13:19:25.885949 master-2 kubenswrapper[4762]: I1014 13:19:25.885890 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler"/"installer-sa-dockercfg-bm6wx" Oct 14 13:19:25.902290 master-2 kubenswrapper[4762]: I1014 13:19:25.902202 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-6-master-2"] Oct 14 13:19:26.060347 master-2 kubenswrapper[4762]: I1014 13:19:26.060238 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9cd92a0e-7d77-4c47-82ef-98aefb24c268-kube-api-access\") pod \"installer-6-master-2\" (UID: \"9cd92a0e-7d77-4c47-82ef-98aefb24c268\") " pod="openshift-kube-scheduler/installer-6-master-2" Oct 14 13:19:26.060347 master-2 kubenswrapper[4762]: I1014 13:19:26.060335 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9cd92a0e-7d77-4c47-82ef-98aefb24c268-kubelet-dir\") pod \"installer-6-master-2\" (UID: \"9cd92a0e-7d77-4c47-82ef-98aefb24c268\") " pod="openshift-kube-scheduler/installer-6-master-2" Oct 14 13:19:26.060663 master-2 kubenswrapper[4762]: I1014 13:19:26.060394 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9cd92a0e-7d77-4c47-82ef-98aefb24c268-var-lock\") pod \"installer-6-master-2\" (UID: \"9cd92a0e-7d77-4c47-82ef-98aefb24c268\") " pod="openshift-kube-scheduler/installer-6-master-2" Oct 14 13:19:26.161802 master-2 kubenswrapper[4762]: I1014 13:19:26.161719 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9cd92a0e-7d77-4c47-82ef-98aefb24c268-kube-api-access\") pod \"installer-6-master-2\" (UID: \"9cd92a0e-7d77-4c47-82ef-98aefb24c268\") " pod="openshift-kube-scheduler/installer-6-master-2" Oct 14 13:19:26.161802 master-2 kubenswrapper[4762]: I1014 
13:19:26.161803 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9cd92a0e-7d77-4c47-82ef-98aefb24c268-kubelet-dir\") pod \"installer-6-master-2\" (UID: \"9cd92a0e-7d77-4c47-82ef-98aefb24c268\") " pod="openshift-kube-scheduler/installer-6-master-2" Oct 14 13:19:26.162093 master-2 kubenswrapper[4762]: I1014 13:19:26.161849 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9cd92a0e-7d77-4c47-82ef-98aefb24c268-var-lock\") pod \"installer-6-master-2\" (UID: \"9cd92a0e-7d77-4c47-82ef-98aefb24c268\") " pod="openshift-kube-scheduler/installer-6-master-2" Oct 14 13:19:26.162093 master-2 kubenswrapper[4762]: I1014 13:19:26.161954 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9cd92a0e-7d77-4c47-82ef-98aefb24c268-var-lock\") pod \"installer-6-master-2\" (UID: \"9cd92a0e-7d77-4c47-82ef-98aefb24c268\") " pod="openshift-kube-scheduler/installer-6-master-2" Oct 14 13:19:26.162481 master-2 kubenswrapper[4762]: I1014 13:19:26.162420 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9cd92a0e-7d77-4c47-82ef-98aefb24c268-kubelet-dir\") pod \"installer-6-master-2\" (UID: \"9cd92a0e-7d77-4c47-82ef-98aefb24c268\") " pod="openshift-kube-scheduler/installer-6-master-2" Oct 14 13:19:26.185979 master-2 kubenswrapper[4762]: I1014 13:19:26.185876 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9cd92a0e-7d77-4c47-82ef-98aefb24c268-kube-api-access\") pod \"installer-6-master-2\" (UID: \"9cd92a0e-7d77-4c47-82ef-98aefb24c268\") " pod="openshift-kube-scheduler/installer-6-master-2" Oct 14 13:19:26.197628 master-2 kubenswrapper[4762]: I1014 13:19:26.197569 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-6-master-2" Oct 14 13:19:26.656484 master-2 kubenswrapper[4762]: I1014 13:19:26.656322 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-6-master-2"] Oct 14 13:19:26.665539 master-2 kubenswrapper[4762]: W1014 13:19:26.665467 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod9cd92a0e_7d77_4c47_82ef_98aefb24c268.slice/crio-b0010cb31080f32b1393e6d2e6dc353fc2d22dcb440b7b7f5328e279b63ff618 WatchSource:0}: Error finding container b0010cb31080f32b1393e6d2e6dc353fc2d22dcb440b7b7f5328e279b63ff618: Status 404 returned error can't find the container with id b0010cb31080f32b1393e6d2e6dc353fc2d22dcb440b7b7f5328e279b63ff618 Oct 14 13:19:26.828773 master-2 kubenswrapper[4762]: I1014 13:19:26.828695 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-6-master-2" event={"ID":"9cd92a0e-7d77-4c47-82ef-98aefb24c268","Type":"ContainerStarted","Data":"b0010cb31080f32b1393e6d2e6dc353fc2d22dcb440b7b7f5328e279b63ff618"} Oct 14 13:19:27.544175 master-2 kubenswrapper[4762]: I1014 13:19:27.544107 4762 patch_prober.go:28] interesting pod/apiserver-7b6784d654-l7lmp container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:19:27.544175 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:19:27.544175 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:19:27.544175 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:19:27.544175 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:19:27.544175 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:19:27.544175 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:19:27.544175 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:19:27.544175 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:19:27.544175 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartUserInformer ok Oct 14 13:19:27.544175 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartOAuthInformer ok Oct 14 13:19:27.544175 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Oct 14 13:19:27.544175 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:19:27.544175 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:19:27.545246 master-2 kubenswrapper[4762]: I1014 13:19:27.545211 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-7b6784d654-l7lmp" podUID="31803cc5-bd42-4bb2-8872-79acd1f79d5b" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:19:27.837646 master-2 kubenswrapper[4762]: I1014 13:19:27.837308 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-6-master-2" event={"ID":"9cd92a0e-7d77-4c47-82ef-98aefb24c268","Type":"ContainerStarted","Data":"7cd64116c28c9e715c389053cb78bc23f54f32883375f9a31eacad651b6d8a63"} Oct 14 13:19:27.868174 master-2 kubenswrapper[4762]: I1014 13:19:27.868051 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-6-master-2" podStartSLOduration=2.868022008 podStartE2EDuration="2.868022008s" 
podCreationTimestamp="2025-10-14 13:19:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:19:27.865137522 +0000 UTC m=+797.109296721" watchObservedRunningTime="2025-10-14 13:19:27.868022008 +0000 UTC m=+797.112181207" Oct 14 13:19:32.543282 master-2 kubenswrapper[4762]: I1014 13:19:32.543125 4762 patch_prober.go:28] interesting pod/apiserver-7b6784d654-l7lmp container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:19:32.543282 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:19:32.543282 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:19:32.543282 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:19:32.543282 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:19:32.543282 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:19:32.543282 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:19:32.543282 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:19:32.543282 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:19:32.543282 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartUserInformer ok Oct 14 13:19:32.543282 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartOAuthInformer ok Oct 14 13:19:32.543282 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Oct 14 13:19:32.543282 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:19:32.543282 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:19:32.545319 master-2 kubenswrapper[4762]: I1014 13:19:32.543274 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-7b6784d654-l7lmp" podUID="31803cc5-bd42-4bb2-8872-79acd1f79d5b" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:19:37.545333 master-2 kubenswrapper[4762]: I1014 13:19:37.545240 4762 patch_prober.go:28] interesting pod/apiserver-7b6784d654-l7lmp container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:19:37.545333 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:19:37.545333 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:19:37.545333 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:19:37.545333 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:19:37.545333 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:19:37.545333 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:19:37.545333 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:19:37.545333 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:19:37.545333 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartUserInformer ok Oct 14 13:19:37.545333 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartOAuthInformer ok Oct 14 13:19:37.545333 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Oct 14 13:19:37.545333 master-2 
kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:19:37.545333 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:19:37.547101 master-2 kubenswrapper[4762]: I1014 13:19:37.545356 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-7b6784d654-l7lmp" podUID="31803cc5-bd42-4bb2-8872-79acd1f79d5b" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:19:42.073522 master-2 kubenswrapper[4762]: I1014 13:19:42.073425 4762 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-2"] Oct 14 13:19:42.074718 master-2 kubenswrapper[4762]: I1014 13:19:42.073928 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" podUID="94dc80ceea2f1dff7b9e2ec4d1d6aead" containerName="cluster-policy-controller" containerID="cri-o://e45f2b094e6806403cfa8da2bf527b04ff4b8ae6e1a18580c31fcc2301b38ee9" gracePeriod=30 Oct 14 13:19:42.074718 master-2 kubenswrapper[4762]: I1014 13:19:42.073977 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" podUID="94dc80ceea2f1dff7b9e2ec4d1d6aead" containerName="kube-controller-manager-cert-syncer" containerID="cri-o://67e3eecae682a65c0dea3a2495e130d1fb9f92e0a4de76a1793c299e38cffbf0" gracePeriod=30 Oct 14 13:19:42.074718 master-2 kubenswrapper[4762]: I1014 13:19:42.073991 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" podUID="94dc80ceea2f1dff7b9e2ec4d1d6aead" containerName="kube-controller-manager-recovery-controller" containerID="cri-o://d7e365eb8e01212deb310e30a49986c32ca5d3a702e94995aab2751ca5e8f908" gracePeriod=30 Oct 14 13:19:42.074718 master-2 kubenswrapper[4762]: I1014 13:19:42.074046 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" podUID="94dc80ceea2f1dff7b9e2ec4d1d6aead" containerName="kube-controller-manager" containerID="cri-o://f6b31a7fd130b6260cc195b3f7d19cba7465489352bc38f184c6b11a0414791d" gracePeriod=30 Oct 14 13:19:42.076057 master-2 kubenswrapper[4762]: I1014 13:19:42.075573 4762 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-2"] Oct 14 13:19:42.076057 master-2 kubenswrapper[4762]: E1014 13:19:42.075773 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94dc80ceea2f1dff7b9e2ec4d1d6aead" containerName="cluster-policy-controller" Oct 14 13:19:42.076057 master-2 kubenswrapper[4762]: I1014 13:19:42.075788 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="94dc80ceea2f1dff7b9e2ec4d1d6aead" containerName="cluster-policy-controller" Oct 14 13:19:42.076057 master-2 kubenswrapper[4762]: E1014 13:19:42.075802 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94dc80ceea2f1dff7b9e2ec4d1d6aead" containerName="kube-controller-manager-cert-syncer" Oct 14 13:19:42.076057 master-2 kubenswrapper[4762]: I1014 13:19:42.075811 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="94dc80ceea2f1dff7b9e2ec4d1d6aead" containerName="kube-controller-manager-cert-syncer" Oct 14 13:19:42.076057 master-2 kubenswrapper[4762]: E1014 13:19:42.075824 4762 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="94dc80ceea2f1dff7b9e2ec4d1d6aead" containerName="kube-controller-manager-recovery-controller" Oct 14 13:19:42.076057 master-2 kubenswrapper[4762]: I1014 13:19:42.075832 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="94dc80ceea2f1dff7b9e2ec4d1d6aead" containerName="kube-controller-manager-recovery-controller" Oct 14 13:19:42.076057 master-2 kubenswrapper[4762]: E1014 13:19:42.075851 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94dc80ceea2f1dff7b9e2ec4d1d6aead" containerName="kube-controller-manager" Oct 14 13:19:42.076057 master-2 kubenswrapper[4762]: I1014 13:19:42.075859 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="94dc80ceea2f1dff7b9e2ec4d1d6aead" containerName="kube-controller-manager" Oct 14 13:19:42.076057 master-2 kubenswrapper[4762]: I1014 13:19:42.076013 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="94dc80ceea2f1dff7b9e2ec4d1d6aead" containerName="kube-controller-manager-cert-syncer" Oct 14 13:19:42.076057 master-2 kubenswrapper[4762]: I1014 13:19:42.076028 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="94dc80ceea2f1dff7b9e2ec4d1d6aead" containerName="cluster-policy-controller" Oct 14 13:19:42.076057 master-2 kubenswrapper[4762]: I1014 13:19:42.076043 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="94dc80ceea2f1dff7b9e2ec4d1d6aead" containerName="kube-controller-manager" Oct 14 13:19:42.076057 master-2 kubenswrapper[4762]: I1014 13:19:42.076055 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="94dc80ceea2f1dff7b9e2ec4d1d6aead" containerName="kube-controller-manager-recovery-controller" Oct 14 13:19:42.077100 master-2 kubenswrapper[4762]: E1014 13:19:42.076172 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94dc80ceea2f1dff7b9e2ec4d1d6aead" containerName="kube-controller-manager" Oct 14 13:19:42.077100 master-2 kubenswrapper[4762]: I1014 13:19:42.076183 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="94dc80ceea2f1dff7b9e2ec4d1d6aead" containerName="kube-controller-manager" Oct 14 13:19:42.077100 master-2 kubenswrapper[4762]: I1014 13:19:42.076278 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="94dc80ceea2f1dff7b9e2ec4d1d6aead" containerName="kube-controller-manager" Oct 14 13:19:42.200899 master-2 kubenswrapper[4762]: I1014 13:19:42.200546 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/d9e75646502e68dc8cb077ea618d4d9d-resource-dir\") pod \"kube-controller-manager-master-2\" (UID: \"d9e75646502e68dc8cb077ea618d4d9d\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-2" Oct 14 13:19:42.200899 master-2 kubenswrapper[4762]: I1014 13:19:42.200721 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/d9e75646502e68dc8cb077ea618d4d9d-cert-dir\") pod \"kube-controller-manager-master-2\" (UID: \"d9e75646502e68dc8cb077ea618d4d9d\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-2" Oct 14 13:19:42.302402 master-2 kubenswrapper[4762]: I1014 13:19:42.302280 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/d9e75646502e68dc8cb077ea618d4d9d-cert-dir\") pod \"kube-controller-manager-master-2\" (UID: \"d9e75646502e68dc8cb077ea618d4d9d\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-master-2" Oct 14 13:19:42.302402 master-2 kubenswrapper[4762]: I1014 13:19:42.302353 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/d9e75646502e68dc8cb077ea618d4d9d-resource-dir\") pod \"kube-controller-manager-master-2\" (UID: \"d9e75646502e68dc8cb077ea618d4d9d\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-2" Oct 14 13:19:42.302871 master-2 kubenswrapper[4762]: I1014 13:19:42.302456 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/d9e75646502e68dc8cb077ea618d4d9d-resource-dir\") pod \"kube-controller-manager-master-2\" (UID: \"d9e75646502e68dc8cb077ea618d4d9d\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-2" Oct 14 13:19:42.302871 master-2 kubenswrapper[4762]: I1014 13:19:42.302463 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/d9e75646502e68dc8cb077ea618d4d9d-cert-dir\") pod \"kube-controller-manager-master-2\" (UID: \"d9e75646502e68dc8cb077ea618d4d9d\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-2" Oct 14 13:19:42.350799 master-2 kubenswrapper[4762]: I1014 13:19:42.350589 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-2_94dc80ceea2f1dff7b9e2ec4d1d6aead/kube-controller-manager-cert-syncer/0.log" Oct 14 13:19:42.351691 master-2 kubenswrapper[4762]: I1014 13:19:42.351639 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-2_94dc80ceea2f1dff7b9e2ec4d1d6aead/kube-controller-manager/0.log" Oct 14 13:19:42.351812 master-2 kubenswrapper[4762]: I1014 13:19:42.351736 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" Oct 14 13:19:42.358725 master-2 kubenswrapper[4762]: I1014 13:19:42.358668 4762 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" oldPodUID="94dc80ceea2f1dff7b9e2ec4d1d6aead" podUID="d9e75646502e68dc8cb077ea618d4d9d" Oct 14 13:19:42.504930 master-2 kubenswrapper[4762]: I1014 13:19:42.504815 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/94dc80ceea2f1dff7b9e2ec4d1d6aead-resource-dir\") pod \"94dc80ceea2f1dff7b9e2ec4d1d6aead\" (UID: \"94dc80ceea2f1dff7b9e2ec4d1d6aead\") " Oct 14 13:19:42.504930 master-2 kubenswrapper[4762]: I1014 13:19:42.504888 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/94dc80ceea2f1dff7b9e2ec4d1d6aead-cert-dir\") pod \"94dc80ceea2f1dff7b9e2ec4d1d6aead\" (UID: \"94dc80ceea2f1dff7b9e2ec4d1d6aead\") " Oct 14 13:19:42.504930 master-2 kubenswrapper[4762]: I1014 13:19:42.504932 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/94dc80ceea2f1dff7b9e2ec4d1d6aead-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "94dc80ceea2f1dff7b9e2ec4d1d6aead" (UID: "94dc80ceea2f1dff7b9e2ec4d1d6aead"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:19:42.505714 master-2 kubenswrapper[4762]: I1014 13:19:42.505126 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/94dc80ceea2f1dff7b9e2ec4d1d6aead-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "94dc80ceea2f1dff7b9e2ec4d1d6aead" (UID: "94dc80ceea2f1dff7b9e2ec4d1d6aead"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:19:42.505714 master-2 kubenswrapper[4762]: I1014 13:19:42.505192 4762 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/94dc80ceea2f1dff7b9e2ec4d1d6aead-resource-dir\") on node \"master-2\" DevicePath \"\"" Oct 14 13:19:42.543351 master-2 kubenswrapper[4762]: I1014 13:19:42.543265 4762 patch_prober.go:28] interesting pod/apiserver-7b6784d654-l7lmp container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:19:42.543351 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:19:42.543351 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:19:42.543351 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:19:42.543351 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:19:42.543351 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:19:42.543351 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:19:42.543351 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:19:42.543351 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:19:42.543351 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartUserInformer ok Oct 14 13:19:42.543351 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartOAuthInformer ok Oct 14 13:19:42.543351 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Oct 14 13:19:42.543351 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:19:42.543351 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:19:42.544603 master-2 kubenswrapper[4762]: I1014 13:19:42.543339 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-7b6784d654-l7lmp" podUID="31803cc5-bd42-4bb2-8872-79acd1f79d5b" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:19:42.606902 master-2 kubenswrapper[4762]: I1014 13:19:42.606676 4762 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/94dc80ceea2f1dff7b9e2ec4d1d6aead-cert-dir\") on node \"master-2\" DevicePath \"\"" Oct 14 13:19:42.941240 master-2 kubenswrapper[4762]: I1014 13:19:42.941108 4762 generic.go:334] "Generic (PLEG): container finished" podID="7d69ccf3-0dde-4f0a-acb0-beb8112b2650" containerID="4ea3fb71eac2d76d281838a3fb8e1644651fc647f1eff0be95ee902c3ac431af" exitCode=0 Oct 14 13:19:42.941551 master-2 kubenswrapper[4762]: I1014 13:19:42.941283 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-5-master-2" event={"ID":"7d69ccf3-0dde-4f0a-acb0-beb8112b2650","Type":"ContainerDied","Data":"4ea3fb71eac2d76d281838a3fb8e1644651fc647f1eff0be95ee902c3ac431af"} Oct 14 13:19:42.949121 master-2 kubenswrapper[4762]: I1014 13:19:42.949025 
4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-2_94dc80ceea2f1dff7b9e2ec4d1d6aead/kube-controller-manager-cert-syncer/0.log" Oct 14 13:19:42.950874 master-2 kubenswrapper[4762]: I1014 13:19:42.950826 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-2_94dc80ceea2f1dff7b9e2ec4d1d6aead/kube-controller-manager/0.log" Oct 14 13:19:42.950975 master-2 kubenswrapper[4762]: I1014 13:19:42.950903 4762 generic.go:334] "Generic (PLEG): container finished" podID="94dc80ceea2f1dff7b9e2ec4d1d6aead" containerID="f6b31a7fd130b6260cc195b3f7d19cba7465489352bc38f184c6b11a0414791d" exitCode=0 Oct 14 13:19:42.950975 master-2 kubenswrapper[4762]: I1014 13:19:42.950926 4762 generic.go:334] "Generic (PLEG): container finished" podID="94dc80ceea2f1dff7b9e2ec4d1d6aead" containerID="d7e365eb8e01212deb310e30a49986c32ca5d3a702e94995aab2751ca5e8f908" exitCode=0 Oct 14 13:19:42.950975 master-2 kubenswrapper[4762]: I1014 13:19:42.950945 4762 generic.go:334] "Generic (PLEG): container finished" podID="94dc80ceea2f1dff7b9e2ec4d1d6aead" containerID="67e3eecae682a65c0dea3a2495e130d1fb9f92e0a4de76a1793c299e38cffbf0" exitCode=2 Oct 14 13:19:42.950975 master-2 kubenswrapper[4762]: I1014 13:19:42.950958 4762 generic.go:334] "Generic (PLEG): container finished" podID="94dc80ceea2f1dff7b9e2ec4d1d6aead" containerID="e45f2b094e6806403cfa8da2bf527b04ff4b8ae6e1a18580c31fcc2301b38ee9" exitCode=0 Oct 14 13:19:42.951219 master-2 kubenswrapper[4762]: I1014 13:19:42.951007 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a0cc7a1e60434ea8a5ba6e8991270c99a53fd407f44577efc6c6487ad12f546" Oct 14 13:19:42.951219 master-2 kubenswrapper[4762]: I1014 13:19:42.951036 4762 scope.go:117] "RemoveContainer" containerID="651a658d252f21b182795b06bd916191c32cb19c07c2f8bd03e20093679aa253" Oct 14 13:19:42.951394 master-2 kubenswrapper[4762]: I1014 13:19:42.951252 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" Oct 14 13:19:42.984996 master-2 kubenswrapper[4762]: I1014 13:19:42.984839 4762 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" oldPodUID="94dc80ceea2f1dff7b9e2ec4d1d6aead" podUID="d9e75646502e68dc8cb077ea618d4d9d" Oct 14 13:19:43.006983 master-2 kubenswrapper[4762]: I1014 13:19:43.006906 4762 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" oldPodUID="94dc80ceea2f1dff7b9e2ec4d1d6aead" podUID="d9e75646502e68dc8cb077ea618d4d9d" Oct 14 13:19:43.559063 master-2 kubenswrapper[4762]: I1014 13:19:43.558956 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94dc80ceea2f1dff7b9e2ec4d1d6aead" path="/var/lib/kubelet/pods/94dc80ceea2f1dff7b9e2ec4d1d6aead/volumes" Oct 14 13:19:43.848069 master-2 kubenswrapper[4762]: I1014 13:19:43.847912 4762 patch_prober.go:28] interesting pod/kube-controller-manager-guard-master-2 container/guard namespace/openshift-kube-controller-manager: Readiness probe status=failure output="Get \"https://192.168.34.12:10257/healthz\": dial tcp 192.168.34.12:10257: connect: connection refused" start-of-body= Oct 14 13:19:43.848069 master-2 kubenswrapper[4762]: I1014 13:19:43.848001 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-controller-manager/kube-controller-manager-guard-master-2" podUID="d43a34b7-69dd-43b0-8465-4e44cb687285" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.12:10257/healthz\": dial tcp 192.168.34.12:10257: connect: connection refused" Oct 14 13:19:43.963366 master-2 kubenswrapper[4762]: I1014 13:19:43.963288 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-2_94dc80ceea2f1dff7b9e2ec4d1d6aead/kube-controller-manager-cert-syncer/0.log" Oct 14 13:19:44.325740 master-2 kubenswrapper[4762]: I1014 13:19:44.325660 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-5-master-2" Oct 14 13:19:44.427092 master-2 kubenswrapper[4762]: I1014 13:19:44.426984 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7d69ccf3-0dde-4f0a-acb0-beb8112b2650-kube-api-access\") pod \"7d69ccf3-0dde-4f0a-acb0-beb8112b2650\" (UID: \"7d69ccf3-0dde-4f0a-acb0-beb8112b2650\") " Oct 14 13:19:44.427438 master-2 kubenswrapper[4762]: I1014 13:19:44.427379 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7d69ccf3-0dde-4f0a-acb0-beb8112b2650-var-lock\") pod \"7d69ccf3-0dde-4f0a-acb0-beb8112b2650\" (UID: \"7d69ccf3-0dde-4f0a-acb0-beb8112b2650\") " Oct 14 13:19:44.427539 master-2 kubenswrapper[4762]: I1014 13:19:44.427434 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7d69ccf3-0dde-4f0a-acb0-beb8112b2650-kubelet-dir\") pod \"7d69ccf3-0dde-4f0a-acb0-beb8112b2650\" (UID: \"7d69ccf3-0dde-4f0a-acb0-beb8112b2650\") " Oct 14 13:19:44.427539 master-2 kubenswrapper[4762]: I1014 13:19:44.427496 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d69ccf3-0dde-4f0a-acb0-beb8112b2650-var-lock" (OuterVolumeSpecName: "var-lock") pod "7d69ccf3-0dde-4f0a-acb0-beb8112b2650" (UID: "7d69ccf3-0dde-4f0a-acb0-beb8112b2650"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:19:44.427759 master-2 kubenswrapper[4762]: I1014 13:19:44.427687 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d69ccf3-0dde-4f0a-acb0-beb8112b2650-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "7d69ccf3-0dde-4f0a-acb0-beb8112b2650" (UID: "7d69ccf3-0dde-4f0a-acb0-beb8112b2650"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:19:44.428260 master-2 kubenswrapper[4762]: I1014 13:19:44.428207 4762 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7d69ccf3-0dde-4f0a-acb0-beb8112b2650-var-lock\") on node \"master-2\" DevicePath \"\"" Oct 14 13:19:44.428260 master-2 kubenswrapper[4762]: I1014 13:19:44.428255 4762 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7d69ccf3-0dde-4f0a-acb0-beb8112b2650-kubelet-dir\") on node \"master-2\" DevicePath \"\"" Oct 14 13:19:44.432221 master-2 kubenswrapper[4762]: I1014 13:19:44.432095 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d69ccf3-0dde-4f0a-acb0-beb8112b2650-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "7d69ccf3-0dde-4f0a-acb0-beb8112b2650" (UID: "7d69ccf3-0dde-4f0a-acb0-beb8112b2650"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:19:44.529520 master-2 kubenswrapper[4762]: I1014 13:19:44.529369 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7d69ccf3-0dde-4f0a-acb0-beb8112b2650-kube-api-access\") on node \"master-2\" DevicePath \"\"" Oct 14 13:19:44.973730 master-2 kubenswrapper[4762]: I1014 13:19:44.973622 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-5-master-2" event={"ID":"7d69ccf3-0dde-4f0a-acb0-beb8112b2650","Type":"ContainerDied","Data":"5bba306527c096aa0adf358bdd76a124de5e9e9905f5d2116ac98a8240a55288"} Oct 14 13:19:44.973730 master-2 kubenswrapper[4762]: I1014 13:19:44.973688 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5bba306527c096aa0adf358bdd76a124de5e9e9905f5d2116ac98a8240a55288" Oct 14 13:19:44.973730 master-2 kubenswrapper[4762]: I1014 13:19:44.973709 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-5-master-2" Oct 14 13:19:47.543263 master-2 kubenswrapper[4762]: I1014 13:19:47.543143 4762 patch_prober.go:28] interesting pod/apiserver-7b6784d654-l7lmp container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:19:47.543263 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:19:47.543263 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:19:47.543263 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:19:47.543263 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:19:47.543263 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:19:47.543263 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:19:47.543263 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:19:47.543263 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:19:47.543263 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartUserInformer ok Oct 14 13:19:47.543263 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartOAuthInformer ok Oct 14 13:19:47.543263 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Oct 14 13:19:47.543263 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:19:47.543263 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:19:47.545124 master-2 kubenswrapper[4762]: I1014 13:19:47.543268 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-7b6784d654-l7lmp" podUID="31803cc5-bd42-4bb2-8872-79acd1f79d5b" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:19:48.850961 master-2 kubenswrapper[4762]: I1014 13:19:48.849370 4762 patch_prober.go:28] interesting pod/kube-controller-manager-guard-master-2 container/guard namespace/openshift-kube-controller-manager: Readiness probe status=failure output="Get \"https://192.168.34.12:10257/healthz\": dial tcp 192.168.34.12:10257: connect: connection refused" start-of-body= Oct 14 13:19:48.850961 master-2 kubenswrapper[4762]: I1014 13:19:48.849455 4762 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-kube-controller-manager/kube-controller-manager-guard-master-2" podUID="d43a34b7-69dd-43b0-8465-4e44cb687285" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.12:10257/healthz\": dial tcp 192.168.34.12:10257: connect: connection refused" Oct 14 13:19:52.540944 master-2 kubenswrapper[4762]: I1014 13:19:52.540861 4762 patch_prober.go:28] interesting pod/apiserver-7b6784d654-l7lmp container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:19:52.540944 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:19:52.540944 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:19:52.540944 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:19:52.540944 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:19:52.540944 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:19:52.540944 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:19:52.540944 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:19:52.540944 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:19:52.540944 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartUserInformer ok Oct 14 13:19:52.540944 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartOAuthInformer ok Oct 14 13:19:52.540944 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Oct 14 13:19:52.540944 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:19:52.540944 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:19:52.542027 master-2 kubenswrapper[4762]: I1014 13:19:52.540951 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-7b6784d654-l7lmp" podUID="31803cc5-bd42-4bb2-8872-79acd1f79d5b" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:19:53.548390 master-2 kubenswrapper[4762]: I1014 13:19:53.548322 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" Oct 14 13:19:53.567734 master-2 kubenswrapper[4762]: I1014 13:19:53.567683 4762 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" podUID="be6eb3ed-10f3-4eb6-9e37-8f4070f1403f" Oct 14 13:19:53.567734 master-2 kubenswrapper[4762]: I1014 13:19:53.567734 4762 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" podUID="be6eb3ed-10f3-4eb6-9e37-8f4070f1403f" Oct 14 13:19:53.586567 master-2 kubenswrapper[4762]: I1014 13:19:53.586476 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-2"] Oct 14 13:19:53.594849 master-2 kubenswrapper[4762]: I1014 13:19:53.594806 4762 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" Oct 14 13:19:53.598925 master-2 kubenswrapper[4762]: I1014 13:19:53.598891 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-2"] Oct 14 13:19:53.623419 master-2 kubenswrapper[4762]: I1014 13:19:53.623338 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" Oct 14 13:19:53.627191 master-2 kubenswrapper[4762]: I1014 13:19:53.627117 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-2"] Oct 14 13:19:53.652948 master-2 kubenswrapper[4762]: W1014 13:19:53.652593 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9e75646502e68dc8cb077ea618d4d9d.slice/crio-f1eb43cc46ffb5c03f23444037744743d8aaa808025e20984932dff9631cae9e WatchSource:0}: Error finding container f1eb43cc46ffb5c03f23444037744743d8aaa808025e20984932dff9631cae9e: Status 404 returned error can't find the container with id f1eb43cc46ffb5c03f23444037744743d8aaa808025e20984932dff9631cae9e Oct 14 13:19:53.850072 master-2 kubenswrapper[4762]: I1014 13:19:53.850014 4762 patch_prober.go:28] interesting pod/kube-controller-manager-guard-master-2 container/guard namespace/openshift-kube-controller-manager: Readiness probe status=failure output="Get \"https://192.168.34.12:10257/healthz\": dial tcp 192.168.34.12:10257: connect: connection refused" start-of-body= Oct 14 13:19:53.850295 master-2 kubenswrapper[4762]: I1014 13:19:53.850075 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-controller-manager/kube-controller-manager-guard-master-2" podUID="d43a34b7-69dd-43b0-8465-4e44cb687285" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.12:10257/healthz\": dial tcp 192.168.34.12:10257: connect: connection refused" Oct 14 13:19:53.850295 master-2 kubenswrapper[4762]: I1014 13:19:53.850169 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-guard-master-2" Oct 14 13:19:53.851877 master-2 kubenswrapper[4762]: I1014 13:19:53.851803 4762 patch_prober.go:28] interesting pod/kube-controller-manager-guard-master-2 container/guard namespace/openshift-kube-controller-manager: Readiness probe status=failure output="Get \"https://192.168.34.12:10257/healthz\": dial tcp 192.168.34.12:10257: connect: connection refused" 
start-of-body= Oct 14 13:19:53.852233 master-2 kubenswrapper[4762]: I1014 13:19:53.852146 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-controller-manager/kube-controller-manager-guard-master-2" podUID="d43a34b7-69dd-43b0-8465-4e44cb687285" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.12:10257/healthz\": dial tcp 192.168.34.12:10257: connect: connection refused" Oct 14 13:19:54.035142 master-2 kubenswrapper[4762]: I1014 13:19:54.035090 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" event={"ID":"d9e75646502e68dc8cb077ea618d4d9d","Type":"ContainerStarted","Data":"594ef0c259c88ddca5ddad13d819171fe7c0fc7d2413db1ecce8c4324b3ac47a"} Oct 14 13:19:54.035142 master-2 kubenswrapper[4762]: I1014 13:19:54.035137 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" event={"ID":"d9e75646502e68dc8cb077ea618d4d9d","Type":"ContainerStarted","Data":"f1eb43cc46ffb5c03f23444037744743d8aaa808025e20984932dff9631cae9e"} Oct 14 13:19:55.042691 master-2 kubenswrapper[4762]: I1014 13:19:55.042621 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" event={"ID":"d9e75646502e68dc8cb077ea618d4d9d","Type":"ContainerStarted","Data":"73d790cae663d066b0748ccdf6e566727eb277211eb0f8e2f2dd1d6e3b29222f"} Oct 14 13:19:55.042691 master-2 kubenswrapper[4762]: I1014 13:19:55.042672 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" event={"ID":"d9e75646502e68dc8cb077ea618d4d9d","Type":"ContainerStarted","Data":"84661648010a88ef48bd3be91f2082ee60333a193279601551124734c4156c2a"} Oct 14 13:19:55.042691 master-2 kubenswrapper[4762]: I1014 13:19:55.042691 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" event={"ID":"d9e75646502e68dc8cb077ea618d4d9d","Type":"ContainerStarted","Data":"dd26e7b8f23f0484780c30a5413c0a14f4efd95fd1a64f8e659ce1f8a0e33d5f"} Oct 14 13:19:55.076101 master-2 kubenswrapper[4762]: I1014 13:19:55.075989 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" podStartSLOduration=2.075965587 podStartE2EDuration="2.075965587s" podCreationTimestamp="2025-10-14 13:19:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:19:55.072859394 +0000 UTC m=+824.317018593" watchObservedRunningTime="2025-10-14 13:19:55.075965587 +0000 UTC m=+824.320124756" Oct 14 13:19:57.538099 master-2 kubenswrapper[4762]: I1014 13:19:57.537967 4762 patch_prober.go:28] interesting pod/apiserver-7b6784d654-l7lmp container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="Get \"https://10.129.0.53:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.129.0.53:8443: connect: connection refused" start-of-body= Oct 14 13:19:57.538759 master-2 kubenswrapper[4762]: I1014 13:19:57.538114 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-7b6784d654-l7lmp" podUID="31803cc5-bd42-4bb2-8872-79acd1f79d5b" containerName="oauth-apiserver" probeResult="failure" output="Get 
\"https://10.129.0.53:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.129.0.53:8443: connect: connection refused" Oct 14 13:19:58.275848 master-2 kubenswrapper[4762]: I1014 13:19:58.275801 4762 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-2"] Oct 14 13:19:58.276609 master-2 kubenswrapper[4762]: I1014 13:19:58.276574 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-2" podUID="f26cf13b1c8c4f1b57c0ac506ef256a4" containerName="kube-scheduler" containerID="cri-o://0da1f39434f36e2d44c9684fdf09ed6e485b933407a1476b3fb79bad430550c2" gracePeriod=30 Oct 14 13:19:58.276813 master-2 kubenswrapper[4762]: I1014 13:19:58.276690 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-2" podUID="f26cf13b1c8c4f1b57c0ac506ef256a4" containerName="kube-scheduler-recovery-controller" containerID="cri-o://2e6e16be70c882739cd9ce4e47fdaafa968150fcb22d8ff2377d2049b8b3beef" gracePeriod=30 Oct 14 13:19:58.277651 master-2 kubenswrapper[4762]: I1014 13:19:58.276749 4762 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-2"] Oct 14 13:19:58.277651 master-2 kubenswrapper[4762]: E1014 13:19:58.277339 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f26cf13b1c8c4f1b57c0ac506ef256a4" containerName="kube-scheduler-cert-syncer" Oct 14 13:19:58.277651 master-2 kubenswrapper[4762]: I1014 13:19:58.277375 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="f26cf13b1c8c4f1b57c0ac506ef256a4" containerName="kube-scheduler-cert-syncer" Oct 14 13:19:58.277651 master-2 kubenswrapper[4762]: I1014 13:19:58.277351 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-2" podUID="f26cf13b1c8c4f1b57c0ac506ef256a4" containerName="kube-scheduler-cert-syncer" containerID="cri-o://be6e28bd40aab1a8ebbfd27ebfe4eb591ba0b55eae92170c1b9f8a0a8a41306c" gracePeriod=30 Oct 14 13:19:58.277651 master-2 kubenswrapper[4762]: E1014 13:19:58.277407 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f26cf13b1c8c4f1b57c0ac506ef256a4" containerName="kube-scheduler" Oct 14 13:19:58.277651 master-2 kubenswrapper[4762]: I1014 13:19:58.277531 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="f26cf13b1c8c4f1b57c0ac506ef256a4" containerName="kube-scheduler" Oct 14 13:19:58.277651 master-2 kubenswrapper[4762]: E1014 13:19:58.277553 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f26cf13b1c8c4f1b57c0ac506ef256a4" containerName="kube-scheduler-recovery-controller" Oct 14 13:19:58.277651 master-2 kubenswrapper[4762]: I1014 13:19:58.277564 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="f26cf13b1c8c4f1b57c0ac506ef256a4" containerName="kube-scheduler-recovery-controller" Oct 14 13:19:58.277651 master-2 kubenswrapper[4762]: E1014 13:19:58.277579 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d69ccf3-0dde-4f0a-acb0-beb8112b2650" containerName="installer" Oct 14 13:19:58.277651 master-2 kubenswrapper[4762]: I1014 13:19:58.277591 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d69ccf3-0dde-4f0a-acb0-beb8112b2650" containerName="installer" Oct 14 13:19:58.277651 master-2 kubenswrapper[4762]: E1014 13:19:58.277615 4762 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="f26cf13b1c8c4f1b57c0ac506ef256a4" containerName="wait-for-host-port" Oct 14 13:19:58.277651 master-2 kubenswrapper[4762]: I1014 13:19:58.277627 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="f26cf13b1c8c4f1b57c0ac506ef256a4" containerName="wait-for-host-port" Oct 14 13:19:58.278565 master-2 kubenswrapper[4762]: I1014 13:19:58.277771 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="f26cf13b1c8c4f1b57c0ac506ef256a4" containerName="kube-scheduler-recovery-controller" Oct 14 13:19:58.278565 master-2 kubenswrapper[4762]: I1014 13:19:58.277804 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="f26cf13b1c8c4f1b57c0ac506ef256a4" containerName="kube-scheduler-cert-syncer" Oct 14 13:19:58.278565 master-2 kubenswrapper[4762]: I1014 13:19:58.277819 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="f26cf13b1c8c4f1b57c0ac506ef256a4" containerName="kube-scheduler" Oct 14 13:19:58.278565 master-2 kubenswrapper[4762]: I1014 13:19:58.277835 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d69ccf3-0dde-4f0a-acb0-beb8112b2650" containerName="installer" Oct 14 13:19:58.335463 master-2 kubenswrapper[4762]: I1014 13:19:58.335366 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/09a1584aa5985a5ff9600248bcf73e77-resource-dir\") pod \"openshift-kube-scheduler-master-2\" (UID: \"09a1584aa5985a5ff9600248bcf73e77\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-2" Oct 14 13:19:58.335647 master-2 kubenswrapper[4762]: I1014 13:19:58.335497 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/09a1584aa5985a5ff9600248bcf73e77-cert-dir\") pod \"openshift-kube-scheduler-master-2\" (UID: \"09a1584aa5985a5ff9600248bcf73e77\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-2" Oct 14 13:19:58.436618 master-2 kubenswrapper[4762]: I1014 13:19:58.436578 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/09a1584aa5985a5ff9600248bcf73e77-resource-dir\") pod \"openshift-kube-scheduler-master-2\" (UID: \"09a1584aa5985a5ff9600248bcf73e77\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-2" Oct 14 13:19:58.436796 master-2 kubenswrapper[4762]: I1014 13:19:58.436741 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/09a1584aa5985a5ff9600248bcf73e77-resource-dir\") pod \"openshift-kube-scheduler-master-2\" (UID: \"09a1584aa5985a5ff9600248bcf73e77\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-2" Oct 14 13:19:58.436927 master-2 kubenswrapper[4762]: I1014 13:19:58.436901 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/09a1584aa5985a5ff9600248bcf73e77-cert-dir\") pod \"openshift-kube-scheduler-master-2\" (UID: \"09a1584aa5985a5ff9600248bcf73e77\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-2" Oct 14 13:19:58.437038 master-2 kubenswrapper[4762]: I1014 13:19:58.436969 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/09a1584aa5985a5ff9600248bcf73e77-cert-dir\") pod \"openshift-kube-scheduler-master-2\" (UID: 
\"09a1584aa5985a5ff9600248bcf73e77\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-2" Oct 14 13:19:58.524338 master-2 kubenswrapper[4762]: I1014 13:19:58.524252 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-2_f26cf13b1c8c4f1b57c0ac506ef256a4/kube-scheduler-cert-syncer/0.log" Oct 14 13:19:58.525481 master-2 kubenswrapper[4762]: I1014 13:19:58.525423 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-2" Oct 14 13:19:58.531256 master-2 kubenswrapper[4762]: I1014 13:19:58.531103 4762 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-2" oldPodUID="f26cf13b1c8c4f1b57c0ac506ef256a4" podUID="09a1584aa5985a5ff9600248bcf73e77" Oct 14 13:19:58.639602 master-2 kubenswrapper[4762]: I1014 13:19:58.639529 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f26cf13b1c8c4f1b57c0ac506ef256a4-cert-dir\") pod \"f26cf13b1c8c4f1b57c0ac506ef256a4\" (UID: \"f26cf13b1c8c4f1b57c0ac506ef256a4\") " Oct 14 13:19:58.640035 master-2 kubenswrapper[4762]: I1014 13:19:58.639666 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f26cf13b1c8c4f1b57c0ac506ef256a4-resource-dir\") pod \"f26cf13b1c8c4f1b57c0ac506ef256a4\" (UID: \"f26cf13b1c8c4f1b57c0ac506ef256a4\") " Oct 14 13:19:58.640035 master-2 kubenswrapper[4762]: I1014 13:19:58.639657 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f26cf13b1c8c4f1b57c0ac506ef256a4-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f26cf13b1c8c4f1b57c0ac506ef256a4" (UID: "f26cf13b1c8c4f1b57c0ac506ef256a4"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:19:58.640035 master-2 kubenswrapper[4762]: I1014 13:19:58.639832 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f26cf13b1c8c4f1b57c0ac506ef256a4-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f26cf13b1c8c4f1b57c0ac506ef256a4" (UID: "f26cf13b1c8c4f1b57c0ac506ef256a4"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:19:58.640130 master-2 kubenswrapper[4762]: I1014 13:19:58.640076 4762 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f26cf13b1c8c4f1b57c0ac506ef256a4-cert-dir\") on node \"master-2\" DevicePath \"\"" Oct 14 13:19:58.640130 master-2 kubenswrapper[4762]: I1014 13:19:58.640103 4762 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f26cf13b1c8c4f1b57c0ac506ef256a4-resource-dir\") on node \"master-2\" DevicePath \"\"" Oct 14 13:19:58.847605 master-2 kubenswrapper[4762]: I1014 13:19:58.847539 4762 patch_prober.go:28] interesting pod/kube-controller-manager-guard-master-2 container/guard namespace/openshift-kube-controller-manager: Readiness probe status=failure output="Get \"https://192.168.34.12:10257/healthz\": dial tcp 192.168.34.12:10257: connect: connection refused" start-of-body= Oct 14 13:19:58.847928 master-2 kubenswrapper[4762]: I1014 13:19:58.847638 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-controller-manager/kube-controller-manager-guard-master-2" podUID="d43a34b7-69dd-43b0-8465-4e44cb687285" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.12:10257/healthz\": dial tcp 192.168.34.12:10257: connect: connection refused" Oct 14 13:19:59.076420 master-2 kubenswrapper[4762]: I1014 13:19:59.076368 4762 generic.go:334] "Generic (PLEG): container finished" podID="31803cc5-bd42-4bb2-8872-79acd1f79d5b" containerID="86954708c4083ce02b3287f821780b8c962df87887fa2a2204ed39142954e4f0" exitCode=0 Oct 14 13:19:59.076591 master-2 kubenswrapper[4762]: I1014 13:19:59.076421 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7b6784d654-l7lmp" event={"ID":"31803cc5-bd42-4bb2-8872-79acd1f79d5b","Type":"ContainerDied","Data":"86954708c4083ce02b3287f821780b8c962df87887fa2a2204ed39142954e4f0"} Oct 14 13:19:59.076591 master-2 kubenswrapper[4762]: I1014 13:19:59.076496 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7b6784d654-l7lmp" event={"ID":"31803cc5-bd42-4bb2-8872-79acd1f79d5b","Type":"ContainerDied","Data":"806dc5a17ae69fc645fc222de2751bffc775ea4eb434a450b1b469270f16f0e8"} Oct 14 13:19:59.076591 master-2 kubenswrapper[4762]: I1014 13:19:59.076520 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="806dc5a17ae69fc645fc222de2751bffc775ea4eb434a450b1b469270f16f0e8" Oct 14 13:19:59.078084 master-2 kubenswrapper[4762]: I1014 13:19:59.078049 4762 generic.go:334] "Generic (PLEG): container finished" podID="9cd92a0e-7d77-4c47-82ef-98aefb24c268" containerID="7cd64116c28c9e715c389053cb78bc23f54f32883375f9a31eacad651b6d8a63" exitCode=0 Oct 14 13:19:59.078213 master-2 kubenswrapper[4762]: I1014 13:19:59.078112 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-6-master-2" event={"ID":"9cd92a0e-7d77-4c47-82ef-98aefb24c268","Type":"ContainerDied","Data":"7cd64116c28c9e715c389053cb78bc23f54f32883375f9a31eacad651b6d8a63"} Oct 14 13:19:59.081329 master-2 kubenswrapper[4762]: I1014 13:19:59.081112 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-2_f26cf13b1c8c4f1b57c0ac506ef256a4/kube-scheduler-cert-syncer/0.log" Oct 14 13:19:59.081698 master-2 kubenswrapper[4762]: I1014 13:19:59.081657 4762 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7b6784d654-l7lmp" Oct 14 13:19:59.081817 master-2 kubenswrapper[4762]: I1014 13:19:59.081798 4762 generic.go:334] "Generic (PLEG): container finished" podID="f26cf13b1c8c4f1b57c0ac506ef256a4" containerID="2e6e16be70c882739cd9ce4e47fdaafa968150fcb22d8ff2377d2049b8b3beef" exitCode=0 Oct 14 13:19:59.081817 master-2 kubenswrapper[4762]: I1014 13:19:59.081816 4762 generic.go:334] "Generic (PLEG): container finished" podID="f26cf13b1c8c4f1b57c0ac506ef256a4" containerID="be6e28bd40aab1a8ebbfd27ebfe4eb591ba0b55eae92170c1b9f8a0a8a41306c" exitCode=2 Oct 14 13:19:59.081954 master-2 kubenswrapper[4762]: I1014 13:19:59.081827 4762 generic.go:334] "Generic (PLEG): container finished" podID="f26cf13b1c8c4f1b57c0ac506ef256a4" containerID="0da1f39434f36e2d44c9684fdf09ed6e485b933407a1476b3fb79bad430550c2" exitCode=0 Oct 14 13:19:59.081954 master-2 kubenswrapper[4762]: I1014 13:19:59.081858 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e592b1bf9f432b9d2baa935fba33de5c17bffc7cb259c1dcdc83e0eb734bd8a" Oct 14 13:19:59.081954 master-2 kubenswrapper[4762]: I1014 13:19:59.081923 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-2" Oct 14 13:19:59.146284 master-2 kubenswrapper[4762]: I1014 13:19:59.146141 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31803cc5-bd42-4bb2-8872-79acd1f79d5b-trusted-ca-bundle\") pod \"31803cc5-bd42-4bb2-8872-79acd1f79d5b\" (UID: \"31803cc5-bd42-4bb2-8872-79acd1f79d5b\") " Oct 14 13:19:59.146666 master-2 kubenswrapper[4762]: I1014 13:19:59.146385 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/31803cc5-bd42-4bb2-8872-79acd1f79d5b-serving-cert\") pod \"31803cc5-bd42-4bb2-8872-79acd1f79d5b\" (UID: \"31803cc5-bd42-4bb2-8872-79acd1f79d5b\") " Oct 14 13:19:59.146666 master-2 kubenswrapper[4762]: I1014 13:19:59.146447 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/31803cc5-bd42-4bb2-8872-79acd1f79d5b-etcd-client\") pod \"31803cc5-bd42-4bb2-8872-79acd1f79d5b\" (UID: \"31803cc5-bd42-4bb2-8872-79acd1f79d5b\") " Oct 14 13:19:59.146666 master-2 kubenswrapper[4762]: I1014 13:19:59.146566 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rvb5\" (UniqueName: \"kubernetes.io/projected/31803cc5-bd42-4bb2-8872-79acd1f79d5b-kube-api-access-2rvb5\") pod \"31803cc5-bd42-4bb2-8872-79acd1f79d5b\" (UID: \"31803cc5-bd42-4bb2-8872-79acd1f79d5b\") " Oct 14 13:19:59.146937 master-2 kubenswrapper[4762]: I1014 13:19:59.146665 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/31803cc5-bd42-4bb2-8872-79acd1f79d5b-encryption-config\") pod \"31803cc5-bd42-4bb2-8872-79acd1f79d5b\" (UID: \"31803cc5-bd42-4bb2-8872-79acd1f79d5b\") " Oct 14 13:19:59.146937 master-2 kubenswrapper[4762]: I1014 13:19:59.146725 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/31803cc5-bd42-4bb2-8872-79acd1f79d5b-etcd-serving-ca\") pod \"31803cc5-bd42-4bb2-8872-79acd1f79d5b\" (UID: \"31803cc5-bd42-4bb2-8872-79acd1f79d5b\") " Oct 14 
13:19:59.146937 master-2 kubenswrapper[4762]: I1014 13:19:59.146768 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/31803cc5-bd42-4bb2-8872-79acd1f79d5b-audit-dir\") pod \"31803cc5-bd42-4bb2-8872-79acd1f79d5b\" (UID: \"31803cc5-bd42-4bb2-8872-79acd1f79d5b\") " Oct 14 13:19:59.146937 master-2 kubenswrapper[4762]: I1014 13:19:59.146841 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/31803cc5-bd42-4bb2-8872-79acd1f79d5b-audit-policies\") pod \"31803cc5-bd42-4bb2-8872-79acd1f79d5b\" (UID: \"31803cc5-bd42-4bb2-8872-79acd1f79d5b\") " Oct 14 13:19:59.148881 master-2 kubenswrapper[4762]: I1014 13:19:59.147295 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/31803cc5-bd42-4bb2-8872-79acd1f79d5b-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "31803cc5-bd42-4bb2-8872-79acd1f79d5b" (UID: "31803cc5-bd42-4bb2-8872-79acd1f79d5b"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:19:59.148881 master-2 kubenswrapper[4762]: I1014 13:19:59.147810 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31803cc5-bd42-4bb2-8872-79acd1f79d5b-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "31803cc5-bd42-4bb2-8872-79acd1f79d5b" (UID: "31803cc5-bd42-4bb2-8872-79acd1f79d5b"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:19:59.148881 master-2 kubenswrapper[4762]: I1014 13:19:59.148770 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31803cc5-bd42-4bb2-8872-79acd1f79d5b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "31803cc5-bd42-4bb2-8872-79acd1f79d5b" (UID: "31803cc5-bd42-4bb2-8872-79acd1f79d5b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:19:59.148881 master-2 kubenswrapper[4762]: I1014 13:19:59.148810 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31803cc5-bd42-4bb2-8872-79acd1f79d5b-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "31803cc5-bd42-4bb2-8872-79acd1f79d5b" (UID: "31803cc5-bd42-4bb2-8872-79acd1f79d5b"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:19:59.152295 master-2 kubenswrapper[4762]: I1014 13:19:59.152238 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31803cc5-bd42-4bb2-8872-79acd1f79d5b-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "31803cc5-bd42-4bb2-8872-79acd1f79d5b" (UID: "31803cc5-bd42-4bb2-8872-79acd1f79d5b"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:19:59.154532 master-2 kubenswrapper[4762]: I1014 13:19:59.154442 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31803cc5-bd42-4bb2-8872-79acd1f79d5b-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "31803cc5-bd42-4bb2-8872-79acd1f79d5b" (UID: "31803cc5-bd42-4bb2-8872-79acd1f79d5b"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:19:59.156567 master-2 kubenswrapper[4762]: I1014 13:19:59.156483 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31803cc5-bd42-4bb2-8872-79acd1f79d5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "31803cc5-bd42-4bb2-8872-79acd1f79d5b" (UID: "31803cc5-bd42-4bb2-8872-79acd1f79d5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:19:59.156567 master-2 kubenswrapper[4762]: I1014 13:19:59.156540 4762 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-2" oldPodUID="f26cf13b1c8c4f1b57c0ac506ef256a4" podUID="09a1584aa5985a5ff9600248bcf73e77" Oct 14 13:19:59.166270 master-2 kubenswrapper[4762]: I1014 13:19:59.159700 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31803cc5-bd42-4bb2-8872-79acd1f79d5b-kube-api-access-2rvb5" (OuterVolumeSpecName: "kube-api-access-2rvb5") pod "31803cc5-bd42-4bb2-8872-79acd1f79d5b" (UID: "31803cc5-bd42-4bb2-8872-79acd1f79d5b"). InnerVolumeSpecName "kube-api-access-2rvb5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:19:59.248304 master-2 kubenswrapper[4762]: I1014 13:19:59.248238 4762 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/31803cc5-bd42-4bb2-8872-79acd1f79d5b-encryption-config\") on node \"master-2\" DevicePath \"\"" Oct 14 13:19:59.248304 master-2 kubenswrapper[4762]: I1014 13:19:59.248283 4762 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/31803cc5-bd42-4bb2-8872-79acd1f79d5b-etcd-serving-ca\") on node \"master-2\" DevicePath \"\"" Oct 14 13:19:59.248304 master-2 kubenswrapper[4762]: I1014 13:19:59.248293 4762 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/31803cc5-bd42-4bb2-8872-79acd1f79d5b-audit-dir\") on node \"master-2\" DevicePath \"\"" Oct 14 13:19:59.248304 master-2 kubenswrapper[4762]: I1014 13:19:59.248306 4762 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/31803cc5-bd42-4bb2-8872-79acd1f79d5b-audit-policies\") on node \"master-2\" DevicePath \"\"" Oct 14 13:19:59.248304 master-2 kubenswrapper[4762]: I1014 13:19:59.248315 4762 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31803cc5-bd42-4bb2-8872-79acd1f79d5b-trusted-ca-bundle\") on node \"master-2\" DevicePath \"\"" Oct 14 13:19:59.248304 master-2 kubenswrapper[4762]: I1014 13:19:59.248327 4762 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/31803cc5-bd42-4bb2-8872-79acd1f79d5b-serving-cert\") on node \"master-2\" DevicePath \"\"" Oct 14 13:19:59.248304 master-2 kubenswrapper[4762]: I1014 13:19:59.248336 4762 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/31803cc5-bd42-4bb2-8872-79acd1f79d5b-etcd-client\") on node \"master-2\" DevicePath \"\"" Oct 14 13:19:59.248890 master-2 kubenswrapper[4762]: I1014 13:19:59.248346 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rvb5\" (UniqueName: \"kubernetes.io/projected/31803cc5-bd42-4bb2-8872-79acd1f79d5b-kube-api-access-2rvb5\") on node \"master-2\" 
DevicePath \"\"" Oct 14 13:19:59.560915 master-2 kubenswrapper[4762]: I1014 13:19:59.560850 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f26cf13b1c8c4f1b57c0ac506ef256a4" path="/var/lib/kubelet/pods/f26cf13b1c8c4f1b57c0ac506ef256a4/volumes" Oct 14 13:20:00.089667 master-2 kubenswrapper[4762]: I1014 13:20:00.088701 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7b6784d654-l7lmp" Oct 14 13:20:00.125374 master-2 kubenswrapper[4762]: I1014 13:20:00.125303 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-oauth-apiserver/apiserver-7b6784d654-l7lmp"] Oct 14 13:20:00.135402 master-2 kubenswrapper[4762]: I1014 13:20:00.135351 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-oauth-apiserver/apiserver-7b6784d654-l7lmp"] Oct 14 13:20:00.438591 master-2 kubenswrapper[4762]: I1014 13:20:00.438551 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-6-master-2" Oct 14 13:20:00.464965 master-2 kubenswrapper[4762]: I1014 13:20:00.464915 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9cd92a0e-7d77-4c47-82ef-98aefb24c268-var-lock\") pod \"9cd92a0e-7d77-4c47-82ef-98aefb24c268\" (UID: \"9cd92a0e-7d77-4c47-82ef-98aefb24c268\") " Oct 14 13:20:00.465129 master-2 kubenswrapper[4762]: I1014 13:20:00.464983 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9cd92a0e-7d77-4c47-82ef-98aefb24c268-kube-api-access\") pod \"9cd92a0e-7d77-4c47-82ef-98aefb24c268\" (UID: \"9cd92a0e-7d77-4c47-82ef-98aefb24c268\") " Oct 14 13:20:00.465129 master-2 kubenswrapper[4762]: I1014 13:20:00.465016 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9cd92a0e-7d77-4c47-82ef-98aefb24c268-kubelet-dir\") pod \"9cd92a0e-7d77-4c47-82ef-98aefb24c268\" (UID: \"9cd92a0e-7d77-4c47-82ef-98aefb24c268\") " Oct 14 13:20:00.465129 master-2 kubenswrapper[4762]: I1014 13:20:00.465028 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9cd92a0e-7d77-4c47-82ef-98aefb24c268-var-lock" (OuterVolumeSpecName: "var-lock") pod "9cd92a0e-7d77-4c47-82ef-98aefb24c268" (UID: "9cd92a0e-7d77-4c47-82ef-98aefb24c268"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:20:00.465337 master-2 kubenswrapper[4762]: I1014 13:20:00.465292 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9cd92a0e-7d77-4c47-82ef-98aefb24c268-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "9cd92a0e-7d77-4c47-82ef-98aefb24c268" (UID: "9cd92a0e-7d77-4c47-82ef-98aefb24c268"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:20:00.465408 master-2 kubenswrapper[4762]: I1014 13:20:00.465317 4762 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9cd92a0e-7d77-4c47-82ef-98aefb24c268-var-lock\") on node \"master-2\" DevicePath \"\"" Oct 14 13:20:00.469944 master-2 kubenswrapper[4762]: I1014 13:20:00.469594 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cd92a0e-7d77-4c47-82ef-98aefb24c268-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "9cd92a0e-7d77-4c47-82ef-98aefb24c268" (UID: "9cd92a0e-7d77-4c47-82ef-98aefb24c268"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:20:00.566742 master-2 kubenswrapper[4762]: I1014 13:20:00.566674 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9cd92a0e-7d77-4c47-82ef-98aefb24c268-kube-api-access\") on node \"master-2\" DevicePath \"\"" Oct 14 13:20:00.566742 master-2 kubenswrapper[4762]: I1014 13:20:00.566735 4762 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9cd92a0e-7d77-4c47-82ef-98aefb24c268-kubelet-dir\") on node \"master-2\" DevicePath \"\"" Oct 14 13:20:01.095729 master-2 kubenswrapper[4762]: I1014 13:20:01.095658 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-6-master-2" event={"ID":"9cd92a0e-7d77-4c47-82ef-98aefb24c268","Type":"ContainerDied","Data":"b0010cb31080f32b1393e6d2e6dc353fc2d22dcb440b7b7f5328e279b63ff618"} Oct 14 13:20:01.095729 master-2 kubenswrapper[4762]: I1014 13:20:01.095726 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0010cb31080f32b1393e6d2e6dc353fc2d22dcb440b7b7f5328e279b63ff618" Oct 14 13:20:01.096535 master-2 kubenswrapper[4762]: I1014 13:20:01.095767 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-6-master-2" Oct 14 13:20:01.542348 master-2 kubenswrapper[4762]: I1014 13:20:01.542287 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-apiserver/apiserver-595d5f74d8-ttb94"] Oct 14 13:20:01.542620 master-2 kubenswrapper[4762]: I1014 13:20:01.542582 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" podUID="32e55f97-d971-46dd-b6b2-cdab1dc766df" containerName="openshift-apiserver" containerID="cri-o://eb59a26421ed95409972305df3c5daa73b20d2bede001f3c9ed71c3c125f3dc5" gracePeriod=120 Oct 14 13:20:01.542724 master-2 kubenswrapper[4762]: I1014 13:20:01.542671 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" podUID="32e55f97-d971-46dd-b6b2-cdab1dc766df" containerName="openshift-apiserver-check-endpoints" containerID="cri-o://475c7fe94c7689d429a499bdcf69b6cc227826fdedac1115b9592805f384a109" gracePeriod=120 Oct 14 13:20:01.555461 master-2 kubenswrapper[4762]: I1014 13:20:01.555390 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31803cc5-bd42-4bb2-8872-79acd1f79d5b" path="/var/lib/kubelet/pods/31803cc5-bd42-4bb2-8872-79acd1f79d5b/volumes" Oct 14 13:20:01.605566 master-2 kubenswrapper[4762]: I1014 13:20:01.605491 4762 patch_prober.go:28] interesting pod/openshift-kube-scheduler-guard-master-2 container/guard namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.34.12:10259/healthz\": dial tcp 192.168.34.12:10259: connect: connection refused" start-of-body= Oct 14 13:20:01.605787 master-2 kubenswrapper[4762]: I1014 13:20:01.605587 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-guard-master-2" podUID="da145675-d789-46f4-8036-694602b5efd6" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.12:10259/healthz\": dial tcp 192.168.34.12:10259: connect: connection refused" Oct 14 13:20:02.085568 master-2 kubenswrapper[4762]: I1014 13:20:02.085499 4762 patch_prober.go:28] interesting pod/apiserver-595d5f74d8-ttb94 container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:20:02.085568 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:20:02.085568 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:20:02.085568 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:20:02.085568 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:20:02.085568 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:20:02.085568 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:20:02.085568 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:20:02.085568 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:20:02.085568 master-2 kubenswrapper[4762]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 14 13:20:02.085568 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok Oct 14 13:20:02.085568 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Oct 14 13:20:02.085568 master-2 kubenswrapper[4762]: 
[+]poststarthook/project.openshift.io-projectcache ok Oct 14 13:20:02.085568 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 14 13:20:02.085568 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startinformers ok Oct 14 13:20:02.085568 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 14 13:20:02.085568 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:20:02.085568 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:20:02.085568 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:20:02.086306 master-2 kubenswrapper[4762]: I1014 13:20:02.085569 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" podUID="32e55f97-d971-46dd-b6b2-cdab1dc766df" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:20:02.102647 master-2 kubenswrapper[4762]: I1014 13:20:02.102594 4762 generic.go:334] "Generic (PLEG): container finished" podID="32e55f97-d971-46dd-b6b2-cdab1dc766df" containerID="475c7fe94c7689d429a499bdcf69b6cc227826fdedac1115b9592805f384a109" exitCode=0 Oct 14 13:20:02.102647 master-2 kubenswrapper[4762]: I1014 13:20:02.102647 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" event={"ID":"32e55f97-d971-46dd-b6b2-cdab1dc766df","Type":"ContainerDied","Data":"475c7fe94c7689d429a499bdcf69b6cc227826fdedac1115b9592805f384a109"} Oct 14 13:20:03.624596 master-2 kubenswrapper[4762]: I1014 13:20:03.624454 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" Oct 14 13:20:03.625280 master-2 kubenswrapper[4762]: I1014 13:20:03.624639 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" Oct 14 13:20:03.625280 master-2 kubenswrapper[4762]: I1014 13:20:03.624657 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" Oct 14 13:20:03.625280 master-2 kubenswrapper[4762]: I1014 13:20:03.624670 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" Oct 14 13:20:03.625280 master-2 kubenswrapper[4762]: I1014 13:20:03.624768 4762 patch_prober.go:28] interesting pod/kube-controller-manager-master-2 container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.34.12:10257/healthz\": dial tcp 192.168.34.12:10257: connect: connection refused" start-of-body= Oct 14 13:20:03.625280 master-2 kubenswrapper[4762]: I1014 13:20:03.624809 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" podUID="d9e75646502e68dc8cb077ea618d4d9d" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.34.12:10257/healthz\": dial tcp 192.168.34.12:10257: connect: connection refused" Oct 14 13:20:03.628686 master-2 kubenswrapper[4762]: I1014 13:20:03.628649 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" Oct 14 13:20:03.847574 master-2 kubenswrapper[4762]: 
I1014 13:20:03.847473 4762 patch_prober.go:28] interesting pod/kube-controller-manager-guard-master-2 container/guard namespace/openshift-kube-controller-manager: Readiness probe status=failure output="Get \"https://192.168.34.12:10257/healthz\": dial tcp 192.168.34.12:10257: connect: connection refused" start-of-body= Oct 14 13:20:03.847967 master-2 kubenswrapper[4762]: I1014 13:20:03.847569 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-controller-manager/kube-controller-manager-guard-master-2" podUID="d43a34b7-69dd-43b0-8465-4e44cb687285" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.12:10257/healthz\": dial tcp 192.168.34.12:10257: connect: connection refused" Oct 14 13:20:04.119522 master-2 kubenswrapper[4762]: I1014 13:20:04.119474 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" Oct 14 13:20:06.607057 master-2 kubenswrapper[4762]: I1014 13:20:06.606926 4762 patch_prober.go:28] interesting pod/openshift-kube-scheduler-guard-master-2 container/guard namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.34.12:10259/healthz\": dial tcp 192.168.34.12:10259: connect: connection refused" start-of-body= Oct 14 13:20:06.607909 master-2 kubenswrapper[4762]: I1014 13:20:06.607066 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-guard-master-2" podUID="da145675-d789-46f4-8036-694602b5efd6" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.12:10259/healthz\": dial tcp 192.168.34.12:10259: connect: connection refused" Oct 14 13:20:07.084470 master-2 kubenswrapper[4762]: I1014 13:20:07.084424 4762 patch_prober.go:28] interesting pod/apiserver-595d5f74d8-ttb94 container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:20:07.084470 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:20:07.084470 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:20:07.084470 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:20:07.084470 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:20:07.084470 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:20:07.084470 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:20:07.084470 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:20:07.084470 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:20:07.084470 master-2 kubenswrapper[4762]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 14 13:20:07.084470 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok Oct 14 13:20:07.084470 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Oct 14 13:20:07.084470 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectcache ok Oct 14 13:20:07.084470 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 14 13:20:07.084470 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startinformers ok Oct 14 13:20:07.084470 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 14 
13:20:07.084470 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:20:07.084470 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:20:07.084470 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:20:07.084470 master-2 kubenswrapper[4762]: I1014 13:20:07.084475 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" podUID="32e55f97-d971-46dd-b6b2-cdab1dc766df" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:20:08.854728 master-2 kubenswrapper[4762]: I1014 13:20:08.854667 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-guard-master-2" Oct 14 13:20:11.604720 master-2 kubenswrapper[4762]: I1014 13:20:11.604620 4762 patch_prober.go:28] interesting pod/openshift-kube-scheduler-guard-master-2 container/guard namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.34.12:10259/healthz\": dial tcp 192.168.34.12:10259: connect: connection refused" start-of-body= Oct 14 13:20:11.605911 master-2 kubenswrapper[4762]: I1014 13:20:11.604727 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-guard-master-2" podUID="da145675-d789-46f4-8036-694602b5efd6" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.12:10259/healthz\": dial tcp 192.168.34.12:10259: connect: connection refused" Oct 14 13:20:11.605911 master-2 kubenswrapper[4762]: I1014 13:20:11.604817 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-guard-master-2" Oct 14 13:20:12.085507 master-2 kubenswrapper[4762]: I1014 13:20:12.085395 4762 patch_prober.go:28] interesting pod/apiserver-595d5f74d8-ttb94 container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:20:12.085507 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:20:12.085507 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:20:12.085507 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:20:12.085507 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:20:12.085507 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:20:12.085507 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:20:12.085507 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:20:12.085507 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:20:12.085507 master-2 kubenswrapper[4762]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 14 13:20:12.085507 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok Oct 14 13:20:12.085507 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Oct 14 13:20:12.085507 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectcache ok Oct 14 13:20:12.085507 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 14 13:20:12.085507 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startinformers ok Oct 14 13:20:12.085507 master-2 
kubenswrapper[4762]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 14 13:20:12.085507 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:20:12.085507 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:20:12.085507 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:20:12.085507 master-2 kubenswrapper[4762]: I1014 13:20:12.085497 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" podUID="32e55f97-d971-46dd-b6b2-cdab1dc766df" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:20:12.087717 master-2 kubenswrapper[4762]: I1014 13:20:12.085604 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" Oct 14 13:20:12.162052 master-2 kubenswrapper[4762]: I1014 13:20:12.161957 4762 patch_prober.go:28] interesting pod/openshift-kube-scheduler-guard-master-2 container/guard namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.34.12:10259/healthz\": dial tcp 192.168.34.12:10259: connect: connection refused" start-of-body= Oct 14 13:20:12.162324 master-2 kubenswrapper[4762]: I1014 13:20:12.162075 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-guard-master-2" podUID="da145675-d789-46f4-8036-694602b5efd6" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.12:10259/healthz\": dial tcp 192.168.34.12:10259: connect: connection refused" Oct 14 13:20:12.277825 master-2 kubenswrapper[4762]: I1014 13:20:12.277762 4762 scope.go:117] "RemoveContainer" containerID="d7e365eb8e01212deb310e30a49986c32ca5d3a702e94995aab2751ca5e8f908" Oct 14 13:20:12.296182 master-2 kubenswrapper[4762]: I1014 13:20:12.296125 4762 scope.go:117] "RemoveContainer" containerID="0da1f39434f36e2d44c9684fdf09ed6e485b933407a1476b3fb79bad430550c2" Oct 14 13:20:12.315844 master-2 kubenswrapper[4762]: I1014 13:20:12.315761 4762 scope.go:117] "RemoveContainer" containerID="67e3eecae682a65c0dea3a2495e130d1fb9f92e0a4de76a1793c299e38cffbf0" Oct 14 13:20:12.329198 master-2 kubenswrapper[4762]: I1014 13:20:12.329141 4762 scope.go:117] "RemoveContainer" containerID="e45f2b094e6806403cfa8da2bf527b04ff4b8ae6e1a18580c31fcc2301b38ee9" Oct 14 13:20:12.343184 master-2 kubenswrapper[4762]: I1014 13:20:12.343138 4762 scope.go:117] "RemoveContainer" containerID="2e6e16be70c882739cd9ce4e47fdaafa968150fcb22d8ff2377d2049b8b3beef" Oct 14 13:20:12.358689 master-2 kubenswrapper[4762]: I1014 13:20:12.358646 4762 scope.go:117] "RemoveContainer" containerID="be6e28bd40aab1a8ebbfd27ebfe4eb591ba0b55eae92170c1b9f8a0a8a41306c" Oct 14 13:20:12.372049 master-2 kubenswrapper[4762]: I1014 13:20:12.371962 4762 scope.go:117] "RemoveContainer" containerID="8fcbe4e7b616f9e023c4c6ab447171662a89a0f5f8d78ea5663544b61a533bff" Oct 14 13:20:12.548147 master-2 kubenswrapper[4762]: I1014 13:20:12.548077 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-2" Oct 14 13:20:12.562860 master-2 kubenswrapper[4762]: I1014 13:20:12.562804 4762 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-2" podUID="ac561e76-94f5-4f99-a415-034f423e34a9" Oct 14 13:20:12.562860 master-2 kubenswrapper[4762]: I1014 13:20:12.562843 4762 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-2" podUID="ac561e76-94f5-4f99-a415-034f423e34a9" Oct 14 13:20:12.595600 master-2 kubenswrapper[4762]: I1014 13:20:12.595396 4762 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-2" Oct 14 13:20:12.596411 master-2 kubenswrapper[4762]: I1014 13:20:12.588546 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-2"] Oct 14 13:20:12.596538 master-2 kubenswrapper[4762]: I1014 13:20:12.596460 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-2"] Oct 14 13:20:12.619863 master-2 kubenswrapper[4762]: I1014 13:20:12.619787 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-2" Oct 14 13:20:12.622712 master-2 kubenswrapper[4762]: I1014 13:20:12.622648 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-2"] Oct 14 13:20:12.648426 master-2 kubenswrapper[4762]: W1014 13:20:12.648288 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09a1584aa5985a5ff9600248bcf73e77.slice/crio-7f5d8b85848e48418477bcfb3b3faa6f92c82317dde0ec7b073576a9ab0bc8a9 WatchSource:0}: Error finding container 7f5d8b85848e48418477bcfb3b3faa6f92c82317dde0ec7b073576a9ab0bc8a9: Status 404 returned error can't find the container with id 7f5d8b85848e48418477bcfb3b3faa6f92c82317dde0ec7b073576a9ab0bc8a9 Oct 14 13:20:13.171748 master-2 kubenswrapper[4762]: I1014 13:20:13.171636 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-2" event={"ID":"09a1584aa5985a5ff9600248bcf73e77","Type":"ContainerStarted","Data":"b2bd7d7095069929e3638fc601c364183d32967e9d6625e7f2f7e213ec3a6681"} Oct 14 13:20:13.171956 master-2 kubenswrapper[4762]: I1014 13:20:13.171749 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-2" event={"ID":"09a1584aa5985a5ff9600248bcf73e77","Type":"ContainerStarted","Data":"7f5d8b85848e48418477bcfb3b3faa6f92c82317dde0ec7b073576a9ab0bc8a9"} Oct 14 13:20:13.627778 master-2 kubenswrapper[4762]: I1014 13:20:13.627661 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" Oct 14 13:20:13.630956 master-2 kubenswrapper[4762]: I1014 13:20:13.630907 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" Oct 14 13:20:16.604634 master-2 kubenswrapper[4762]: I1014 13:20:16.604570 4762 patch_prober.go:28] interesting pod/openshift-kube-scheduler-guard-master-2 container/guard namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.34.12:10259/healthz\": dial tcp 
192.168.34.12:10259: connect: connection refused" start-of-body= Oct 14 13:20:16.604634 master-2 kubenswrapper[4762]: I1014 13:20:16.604631 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-guard-master-2" podUID="da145675-d789-46f4-8036-694602b5efd6" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.12:10259/healthz\": dial tcp 192.168.34.12:10259: connect: connection refused" Oct 14 13:20:17.087059 master-2 kubenswrapper[4762]: I1014 13:20:17.086966 4762 patch_prober.go:28] interesting pod/apiserver-595d5f74d8-ttb94 container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:20:17.087059 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:20:17.087059 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:20:17.087059 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:20:17.087059 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:20:17.087059 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:20:17.087059 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:20:17.087059 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:20:17.087059 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:20:17.087059 master-2 kubenswrapper[4762]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 14 13:20:17.087059 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok Oct 14 13:20:17.087059 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Oct 14 13:20:17.087059 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectcache ok Oct 14 13:20:17.087059 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 14 13:20:17.087059 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startinformers ok Oct 14 13:20:17.087059 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 14 13:20:17.087059 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:20:17.087059 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:20:17.087059 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:20:17.088352 master-2 kubenswrapper[4762]: I1014 13:20:17.087059 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" podUID="32e55f97-d971-46dd-b6b2-cdab1dc766df" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:20:21.604837 master-2 kubenswrapper[4762]: I1014 13:20:21.604748 4762 patch_prober.go:28] interesting pod/openshift-kube-scheduler-guard-master-2 container/guard namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.34.12:10259/healthz\": dial tcp 192.168.34.12:10259: connect: connection refused" start-of-body= Oct 14 13:20:21.604837 master-2 kubenswrapper[4762]: I1014 13:20:21.604808 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-guard-master-2" podUID="da145675-d789-46f4-8036-694602b5efd6" containerName="guard" probeResult="failure" 
output="Get \"https://192.168.34.12:10259/healthz\": dial tcp 192.168.34.12:10259: connect: connection refused" Oct 14 13:20:22.085765 master-2 kubenswrapper[4762]: I1014 13:20:22.085694 4762 patch_prober.go:28] interesting pod/apiserver-595d5f74d8-ttb94 container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:20:22.085765 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:20:22.085765 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:20:22.085765 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:20:22.085765 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:20:22.085765 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:20:22.085765 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:20:22.085765 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:20:22.085765 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:20:22.085765 master-2 kubenswrapper[4762]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 14 13:20:22.085765 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok Oct 14 13:20:22.085765 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Oct 14 13:20:22.085765 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectcache ok Oct 14 13:20:22.085765 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 14 13:20:22.085765 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startinformers ok Oct 14 13:20:22.085765 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 14 13:20:22.085765 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:20:22.085765 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:20:22.085765 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:20:22.086747 master-2 kubenswrapper[4762]: I1014 13:20:22.085779 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" podUID="32e55f97-d971-46dd-b6b2-cdab1dc766df" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:20:26.605725 master-2 kubenswrapper[4762]: I1014 13:20:26.605644 4762 patch_prober.go:28] interesting pod/openshift-kube-scheduler-guard-master-2 container/guard namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.34.12:10259/healthz\": dial tcp 192.168.34.12:10259: connect: connection refused" start-of-body= Oct 14 13:20:26.606633 master-2 kubenswrapper[4762]: I1014 13:20:26.605726 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-guard-master-2" podUID="da145675-d789-46f4-8036-694602b5efd6" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.12:10259/healthz\": dial tcp 192.168.34.12:10259: connect: connection refused" Oct 14 13:20:27.084334 master-2 kubenswrapper[4762]: I1014 13:20:27.084289 4762 patch_prober.go:28] interesting pod/apiserver-595d5f74d8-ttb94 container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe 
failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:20:27.084334 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:20:27.084334 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:20:27.084334 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:20:27.084334 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:20:27.084334 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:20:27.084334 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:20:27.084334 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:20:27.084334 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:20:27.084334 master-2 kubenswrapper[4762]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 14 13:20:27.084334 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok Oct 14 13:20:27.084334 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Oct 14 13:20:27.084334 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectcache ok Oct 14 13:20:27.084334 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 14 13:20:27.084334 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startinformers ok Oct 14 13:20:27.084334 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 14 13:20:27.084334 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:20:27.084334 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:20:27.084334 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:20:27.085265 master-2 kubenswrapper[4762]: I1014 13:20:27.085235 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" podUID="32e55f97-d971-46dd-b6b2-cdab1dc766df" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:20:31.605808 master-2 kubenswrapper[4762]: I1014 13:20:31.605725 4762 patch_prober.go:28] interesting pod/openshift-kube-scheduler-guard-master-2 container/guard namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.34.12:10259/healthz\": dial tcp 192.168.34.12:10259: connect: connection refused" start-of-body= Oct 14 13:20:31.605808 master-2 kubenswrapper[4762]: I1014 13:20:31.605798 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-guard-master-2" podUID="da145675-d789-46f4-8036-694602b5efd6" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.12:10259/healthz\": dial tcp 192.168.34.12:10259: connect: connection refused" Oct 14 13:20:32.087595 master-2 kubenswrapper[4762]: I1014 13:20:32.087481 4762 patch_prober.go:28] interesting pod/apiserver-595d5f74d8-ttb94 container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:20:32.087595 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:20:32.087595 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:20:32.087595 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:20:32.087595 master-2 kubenswrapper[4762]: 
[+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:20:32.087595 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:20:32.087595 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:20:32.087595 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:20:32.087595 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:20:32.087595 master-2 kubenswrapper[4762]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 14 13:20:32.087595 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok Oct 14 13:20:32.087595 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Oct 14 13:20:32.087595 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectcache ok Oct 14 13:20:32.087595 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 14 13:20:32.087595 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startinformers ok Oct 14 13:20:32.087595 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 14 13:20:32.087595 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:20:32.087595 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:20:32.087595 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:20:32.088940 master-2 kubenswrapper[4762]: I1014 13:20:32.087597 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" podUID="32e55f97-d971-46dd-b6b2-cdab1dc766df" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:20:36.605050 master-2 kubenswrapper[4762]: I1014 13:20:36.604934 4762 patch_prober.go:28] interesting pod/openshift-kube-scheduler-guard-master-2 container/guard namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.34.12:10259/healthz\": dial tcp 192.168.34.12:10259: connect: connection refused" start-of-body= Oct 14 13:20:36.605050 master-2 kubenswrapper[4762]: I1014 13:20:36.605022 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-guard-master-2" podUID="da145675-d789-46f4-8036-694602b5efd6" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.12:10259/healthz\": dial tcp 192.168.34.12:10259: connect: connection refused" Oct 14 13:20:37.085340 master-2 kubenswrapper[4762]: I1014 13:20:37.085285 4762 patch_prober.go:28] interesting pod/apiserver-595d5f74d8-ttb94 container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:20:37.085340 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:20:37.085340 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:20:37.085340 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:20:37.085340 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:20:37.085340 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:20:37.085340 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:20:37.085340 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:20:37.085340 master-2 
kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:20:37.085340 master-2 kubenswrapper[4762]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 14 13:20:37.085340 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok Oct 14 13:20:37.085340 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Oct 14 13:20:37.085340 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectcache ok Oct 14 13:20:37.085340 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 14 13:20:37.085340 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startinformers ok Oct 14 13:20:37.085340 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 14 13:20:37.085340 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:20:37.085340 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:20:37.085340 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:20:37.086889 master-2 kubenswrapper[4762]: I1014 13:20:37.086426 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" podUID="32e55f97-d971-46dd-b6b2-cdab1dc766df" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:20:41.605310 master-2 kubenswrapper[4762]: I1014 13:20:41.605045 4762 patch_prober.go:28] interesting pod/openshift-kube-scheduler-guard-master-2 container/guard namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.34.12:10259/healthz\": dial tcp 192.168.34.12:10259: connect: connection refused" start-of-body= Oct 14 13:20:41.605310 master-2 kubenswrapper[4762]: I1014 13:20:41.605128 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-guard-master-2" podUID="da145675-d789-46f4-8036-694602b5efd6" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.12:10259/healthz\": dial tcp 192.168.34.12:10259: connect: connection refused" Oct 14 13:20:42.088050 master-2 kubenswrapper[4762]: I1014 13:20:42.087953 4762 patch_prober.go:28] interesting pod/apiserver-595d5f74d8-ttb94 container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:20:42.088050 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:20:42.088050 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:20:42.088050 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:20:42.088050 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:20:42.088050 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:20:42.088050 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:20:42.088050 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:20:42.088050 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:20:42.088050 master-2 kubenswrapper[4762]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 14 13:20:42.088050 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok Oct 14 13:20:42.088050 master-2 kubenswrapper[4762]: 
[+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Oct 14 13:20:42.088050 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectcache ok Oct 14 13:20:42.088050 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 14 13:20:42.088050 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startinformers ok Oct 14 13:20:42.088050 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 14 13:20:42.088050 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:20:42.088050 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:20:42.088050 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:20:42.088050 master-2 kubenswrapper[4762]: I1014 13:20:42.088043 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" podUID="32e55f97-d971-46dd-b6b2-cdab1dc766df" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:20:43.380760 master-2 kubenswrapper[4762]: I1014 13:20:43.380680 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-2_09a1584aa5985a5ff9600248bcf73e77/wait-for-host-port/0.log" Oct 14 13:20:43.380760 master-2 kubenswrapper[4762]: I1014 13:20:43.380745 4762 generic.go:334] "Generic (PLEG): container finished" podID="09a1584aa5985a5ff9600248bcf73e77" containerID="b2bd7d7095069929e3638fc601c364183d32967e9d6625e7f2f7e213ec3a6681" exitCode=124 Oct 14 13:20:43.381841 master-2 kubenswrapper[4762]: I1014 13:20:43.380782 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-2" event={"ID":"09a1584aa5985a5ff9600248bcf73e77","Type":"ContainerDied","Data":"b2bd7d7095069929e3638fc601c364183d32967e9d6625e7f2f7e213ec3a6681"} Oct 14 13:20:44.388782 master-2 kubenswrapper[4762]: I1014 13:20:44.388715 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-2_09a1584aa5985a5ff9600248bcf73e77/wait-for-host-port/0.log" Oct 14 13:20:44.388782 master-2 kubenswrapper[4762]: I1014 13:20:44.388781 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-2" event={"ID":"09a1584aa5985a5ff9600248bcf73e77","Type":"ContainerStarted","Data":"75787b0868819b53b1e7732247decd01003dfe7c8dcfdf405433d25fb5866fa1"} Oct 14 13:20:46.604981 master-2 kubenswrapper[4762]: I1014 13:20:46.604885 4762 patch_prober.go:28] interesting pod/openshift-kube-scheduler-guard-master-2 container/guard namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.34.12:10259/healthz\": dial tcp 192.168.34.12:10259: connect: connection refused" start-of-body= Oct 14 13:20:46.604981 master-2 kubenswrapper[4762]: I1014 13:20:46.604949 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-guard-master-2" podUID="da145675-d789-46f4-8036-694602b5efd6" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.12:10259/healthz\": dial tcp 192.168.34.12:10259: connect: connection refused" Oct 14 13:20:47.087101 master-2 kubenswrapper[4762]: I1014 13:20:47.087032 4762 patch_prober.go:28] interesting pod/apiserver-595d5f74d8-ttb94 container/openshift-apiserver namespace/openshift-apiserver: 
Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:20:47.087101 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:20:47.087101 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:20:47.087101 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:20:47.087101 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:20:47.087101 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:20:47.087101 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:20:47.087101 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:20:47.087101 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:20:47.087101 master-2 kubenswrapper[4762]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 14 13:20:47.087101 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok Oct 14 13:20:47.087101 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Oct 14 13:20:47.087101 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectcache ok Oct 14 13:20:47.087101 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 14 13:20:47.087101 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startinformers ok Oct 14 13:20:47.087101 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 14 13:20:47.087101 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:20:47.087101 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:20:47.087101 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:20:47.088181 master-2 kubenswrapper[4762]: I1014 13:20:47.087104 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" podUID="32e55f97-d971-46dd-b6b2-cdab1dc766df" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:20:51.605911 master-2 kubenswrapper[4762]: I1014 13:20:51.605833 4762 patch_prober.go:28] interesting pod/openshift-kube-scheduler-guard-master-2 container/guard namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.34.12:10259/healthz\": dial tcp 192.168.34.12:10259: connect: connection refused" start-of-body= Oct 14 13:20:51.606675 master-2 kubenswrapper[4762]: I1014 13:20:51.605962 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-guard-master-2" podUID="da145675-d789-46f4-8036-694602b5efd6" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.12:10259/healthz\": dial tcp 192.168.34.12:10259: connect: connection refused" Oct 14 13:20:52.080728 master-2 kubenswrapper[4762]: I1014 13:20:52.080615 4762 patch_prober.go:28] interesting pod/apiserver-595d5f74d8-ttb94 container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.129.0.52:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.129.0.52:8443: connect: connection refused" start-of-body= Oct 14 13:20:52.080728 master-2 kubenswrapper[4762]: I1014 13:20:52.080709 4762 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" podUID="32e55f97-d971-46dd-b6b2-cdab1dc766df" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.129.0.52:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.129.0.52:8443: connect: connection refused" Oct 14 13:20:56.605054 master-2 kubenswrapper[4762]: I1014 13:20:56.604935 4762 patch_prober.go:28] interesting pod/openshift-kube-scheduler-guard-master-2 container/guard namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.34.12:10259/healthz\": dial tcp 192.168.34.12:10259: connect: connection refused" start-of-body= Oct 14 13:20:56.605054 master-2 kubenswrapper[4762]: I1014 13:20:56.605022 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-guard-master-2" podUID="da145675-d789-46f4-8036-694602b5efd6" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.12:10259/healthz\": dial tcp 192.168.34.12:10259: connect: connection refused" Oct 14 13:20:57.081202 master-2 kubenswrapper[4762]: I1014 13:20:57.081040 4762 patch_prober.go:28] interesting pod/apiserver-595d5f74d8-ttb94 container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.129.0.52:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.129.0.52:8443: connect: connection refused" start-of-body= Oct 14 13:20:57.081691 master-2 kubenswrapper[4762]: I1014 13:20:57.081134 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" podUID="32e55f97-d971-46dd-b6b2-cdab1dc766df" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.129.0.52:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.129.0.52:8443: connect: connection refused" Oct 14 13:20:59.509903 master-2 kubenswrapper[4762]: I1014 13:20:59.509865 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-2_09a1584aa5985a5ff9600248bcf73e77/wait-for-host-port/0.log" Oct 14 13:20:59.510900 master-2 kubenswrapper[4762]: I1014 13:20:59.510715 4762 generic.go:334] "Generic (PLEG): container finished" podID="09a1584aa5985a5ff9600248bcf73e77" containerID="75787b0868819b53b1e7732247decd01003dfe7c8dcfdf405433d25fb5866fa1" exitCode=0 Oct 14 13:20:59.510900 master-2 kubenswrapper[4762]: I1014 13:20:59.510759 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-2" event={"ID":"09a1584aa5985a5ff9600248bcf73e77","Type":"ContainerDied","Data":"75787b0868819b53b1e7732247decd01003dfe7c8dcfdf405433d25fb5866fa1"} Oct 14 13:20:59.510900 master-2 kubenswrapper[4762]: I1014 13:20:59.510799 4762 scope.go:117] "RemoveContainer" containerID="b2bd7d7095069929e3638fc601c364183d32967e9d6625e7f2f7e213ec3a6681" Oct 14 13:20:59.511394 master-2 kubenswrapper[4762]: I1014 13:20:59.511334 4762 scope.go:117] "RemoveContainer" containerID="b2bd7d7095069929e3638fc601c364183d32967e9d6625e7f2f7e213ec3a6681" Oct 14 13:20:59.547869 master-2 kubenswrapper[4762]: E1014 13:20:59.547821 4762 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_wait-for-host-port_openshift-kube-scheduler-master-2_openshift-kube-scheduler_09a1584aa5985a5ff9600248bcf73e77_0 in pod sandbox 7f5d8b85848e48418477bcfb3b3faa6f92c82317dde0ec7b073576a9ab0bc8a9 from index: no such 
id: 'b2bd7d7095069929e3638fc601c364183d32967e9d6625e7f2f7e213ec3a6681'" containerID="b2bd7d7095069929e3638fc601c364183d32967e9d6625e7f2f7e213ec3a6681" Oct 14 13:20:59.548080 master-2 kubenswrapper[4762]: E1014 13:20:59.548057 4762 kuberuntime_container.go:896] "Unhandled Error" err="failed to remove pod init container \"wait-for-host-port\": rpc error: code = Unknown desc = failed to delete container k8s_wait-for-host-port_openshift-kube-scheduler-master-2_openshift-kube-scheduler_09a1584aa5985a5ff9600248bcf73e77_0 in pod sandbox 7f5d8b85848e48418477bcfb3b3faa6f92c82317dde0ec7b073576a9ab0bc8a9 from index: no such id: 'b2bd7d7095069929e3638fc601c364183d32967e9d6625e7f2f7e213ec3a6681'; Skipping pod \"openshift-kube-scheduler-master-2_openshift-kube-scheduler(09a1584aa5985a5ff9600248bcf73e77)\"" logger="UnhandledError" Oct 14 13:21:00.521789 master-2 kubenswrapper[4762]: I1014 13:21:00.520648 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-2" event={"ID":"09a1584aa5985a5ff9600248bcf73e77","Type":"ContainerStarted","Data":"953ed57daadd54c72bbd6ce9b7f28d82ec2adce5d9b648952a2bbe32e22e6f8f"} Oct 14 13:21:00.521789 master-2 kubenswrapper[4762]: I1014 13:21:00.520702 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-2" event={"ID":"09a1584aa5985a5ff9600248bcf73e77","Type":"ContainerStarted","Data":"14d1cc22359b0bd37f84631a2302bee34e34b4f03f5a618681d65d1f513e0c60"} Oct 14 13:21:00.521789 master-2 kubenswrapper[4762]: I1014 13:21:00.520719 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-2" event={"ID":"09a1584aa5985a5ff9600248bcf73e77","Type":"ContainerStarted","Data":"c688475b54a62d8cf2d7af1fbea482257afe9ff1956babc3b4c0e1815f6c0d66"} Oct 14 13:21:00.521789 master-2 kubenswrapper[4762]: I1014 13:21:00.521739 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-2" Oct 14 13:21:01.611767 master-2 kubenswrapper[4762]: I1014 13:21:01.611694 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-guard-master-2" Oct 14 13:21:01.641378 master-2 kubenswrapper[4762]: I1014 13:21:01.641272 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-2" podStartSLOduration=49.641243266000004 podStartE2EDuration="49.641243266s" podCreationTimestamp="2025-10-14 13:20:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:21:00.553868459 +0000 UTC m=+889.798027668" watchObservedRunningTime="2025-10-14 13:21:01.641243266 +0000 UTC m=+890.885402465" Oct 14 13:21:02.081027 master-2 kubenswrapper[4762]: I1014 13:21:02.080928 4762 patch_prober.go:28] interesting pod/apiserver-595d5f74d8-ttb94 container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.129.0.52:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.129.0.52:8443: connect: connection refused" start-of-body= Oct 14 13:21:02.081348 master-2 kubenswrapper[4762]: I1014 13:21:02.081029 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" podUID="32e55f97-d971-46dd-b6b2-cdab1dc766df" containerName="openshift-apiserver" 
probeResult="failure" output="Get \"https://10.129.0.52:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.129.0.52:8443: connect: connection refused" Oct 14 13:21:07.081116 master-2 kubenswrapper[4762]: I1014 13:21:07.081021 4762 patch_prober.go:28] interesting pod/apiserver-595d5f74d8-ttb94 container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.129.0.52:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.129.0.52:8443: connect: connection refused" start-of-body= Oct 14 13:21:07.082194 master-2 kubenswrapper[4762]: I1014 13:21:07.081123 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" podUID="32e55f97-d971-46dd-b6b2-cdab1dc766df" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.129.0.52:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.129.0.52:8443: connect: connection refused" Oct 14 13:21:12.080638 master-2 kubenswrapper[4762]: I1014 13:21:12.080538 4762 patch_prober.go:28] interesting pod/apiserver-595d5f74d8-ttb94 container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.129.0.52:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.129.0.52:8443: connect: connection refused" start-of-body= Oct 14 13:21:12.081186 master-2 kubenswrapper[4762]: I1014 13:21:12.080644 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" podUID="32e55f97-d971-46dd-b6b2-cdab1dc766df" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.129.0.52:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.129.0.52:8443: connect: connection refused" Oct 14 13:21:12.406818 master-2 kubenswrapper[4762]: I1014 13:21:12.406679 4762 scope.go:117] "RemoveContainer" containerID="86954708c4083ce02b3287f821780b8c962df87887fa2a2204ed39142954e4f0" Oct 14 13:21:12.428955 master-2 kubenswrapper[4762]: I1014 13:21:12.428890 4762 scope.go:117] "RemoveContainer" containerID="17d44eb9784edc26e702ce1deec1a2332094dca3f442de74ff4d90b44b112b27" Oct 14 13:21:17.080874 master-2 kubenswrapper[4762]: I1014 13:21:17.080715 4762 patch_prober.go:28] interesting pod/apiserver-595d5f74d8-ttb94 container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.129.0.52:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.129.0.52:8443: connect: connection refused" start-of-body= Oct 14 13:21:17.085941 master-2 kubenswrapper[4762]: I1014 13:21:17.080875 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" podUID="32e55f97-d971-46dd-b6b2-cdab1dc766df" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.129.0.52:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.129.0.52:8443: connect: connection refused" Oct 14 13:21:22.080779 master-2 kubenswrapper[4762]: I1014 13:21:22.080689 4762 patch_prober.go:28] interesting pod/apiserver-595d5f74d8-ttb94 container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.129.0.52:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.129.0.52:8443: connect: connection refused" start-of-body= Oct 14 13:21:22.081516 master-2 kubenswrapper[4762]: I1014 13:21:22.080783 4762 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" podUID="32e55f97-d971-46dd-b6b2-cdab1dc766df" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.129.0.52:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.129.0.52:8443: connect: connection refused" Oct 14 13:21:27.080727 master-2 kubenswrapper[4762]: I1014 13:21:27.080598 4762 patch_prober.go:28] interesting pod/apiserver-595d5f74d8-ttb94 container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.129.0.52:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.129.0.52:8443: connect: connection refused" start-of-body= Oct 14 13:21:27.081290 master-2 kubenswrapper[4762]: I1014 13:21:27.080743 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" podUID="32e55f97-d971-46dd-b6b2-cdab1dc766df" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.129.0.52:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.129.0.52:8443: connect: connection refused" Oct 14 13:21:32.081685 master-2 kubenswrapper[4762]: I1014 13:21:32.081582 4762 patch_prober.go:28] interesting pod/apiserver-595d5f74d8-ttb94 container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.129.0.52:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.129.0.52:8443: connect: connection refused" start-of-body= Oct 14 13:21:32.081685 master-2 kubenswrapper[4762]: I1014 13:21:32.081670 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" podUID="32e55f97-d971-46dd-b6b2-cdab1dc766df" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.129.0.52:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.129.0.52:8443: connect: connection refused" Oct 14 13:21:37.080626 master-2 kubenswrapper[4762]: I1014 13:21:37.080454 4762 patch_prober.go:28] interesting pod/apiserver-595d5f74d8-ttb94 container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.129.0.52:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.129.0.52:8443: connect: connection refused" start-of-body= Oct 14 13:21:37.080626 master-2 kubenswrapper[4762]: I1014 13:21:37.080575 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" podUID="32e55f97-d971-46dd-b6b2-cdab1dc766df" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.129.0.52:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.129.0.52:8443: connect: connection refused" Oct 14 13:21:42.081127 master-2 kubenswrapper[4762]: I1014 13:21:42.081039 4762 patch_prober.go:28] interesting pod/apiserver-595d5f74d8-ttb94 container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.129.0.52:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.129.0.52:8443: connect: connection refused" start-of-body= Oct 14 13:21:42.082117 master-2 kubenswrapper[4762]: I1014 13:21:42.081122 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" podUID="32e55f97-d971-46dd-b6b2-cdab1dc766df" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.129.0.52:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.129.0.52:8443: 
connect: connection refused" Oct 14 13:21:46.291439 master-2 kubenswrapper[4762]: I1014 13:21:46.291399 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-84c8b8d745-p4css"] Oct 14 13:21:46.292180 master-2 kubenswrapper[4762]: E1014 13:21:46.292142 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cd92a0e-7d77-4c47-82ef-98aefb24c268" containerName="installer" Oct 14 13:21:46.292255 master-2 kubenswrapper[4762]: I1014 13:21:46.292244 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cd92a0e-7d77-4c47-82ef-98aefb24c268" containerName="installer" Oct 14 13:21:46.292347 master-2 kubenswrapper[4762]: E1014 13:21:46.292337 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31803cc5-bd42-4bb2-8872-79acd1f79d5b" containerName="oauth-apiserver" Oct 14 13:21:46.292437 master-2 kubenswrapper[4762]: I1014 13:21:46.292398 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="31803cc5-bd42-4bb2-8872-79acd1f79d5b" containerName="oauth-apiserver" Oct 14 13:21:46.292547 master-2 kubenswrapper[4762]: E1014 13:21:46.292537 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31803cc5-bd42-4bb2-8872-79acd1f79d5b" containerName="fix-audit-permissions" Oct 14 13:21:46.292603 master-2 kubenswrapper[4762]: I1014 13:21:46.292594 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="31803cc5-bd42-4bb2-8872-79acd1f79d5b" containerName="fix-audit-permissions" Oct 14 13:21:46.292726 master-2 kubenswrapper[4762]: I1014 13:21:46.292715 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="31803cc5-bd42-4bb2-8872-79acd1f79d5b" containerName="oauth-apiserver" Oct 14 13:21:46.292792 master-2 kubenswrapper[4762]: I1014 13:21:46.292783 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cd92a0e-7d77-4c47-82ef-98aefb24c268" containerName="installer" Oct 14 13:21:46.293476 master-2 kubenswrapper[4762]: I1014 13:21:46.293457 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-84c8b8d745-p4css" Oct 14 13:21:46.296800 master-2 kubenswrapper[4762]: I1014 13:21:46.296368 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-8gpjk" Oct 14 13:21:46.297510 master-2 kubenswrapper[4762]: I1014 13:21:46.297285 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 14 13:21:46.297635 master-2 kubenswrapper[4762]: I1014 13:21:46.297513 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 14 13:21:46.297676 master-2 kubenswrapper[4762]: I1014 13:21:46.297623 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 14 13:21:46.297753 master-2 kubenswrapper[4762]: I1014 13:21:46.297646 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 14 13:21:46.297753 master-2 kubenswrapper[4762]: I1014 13:21:46.297722 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 14 13:21:46.297824 master-2 kubenswrapper[4762]: I1014 13:21:46.297623 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 14 13:21:46.297824 master-2 kubenswrapper[4762]: I1014 13:21:46.297624 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 14 13:21:46.298928 master-2 kubenswrapper[4762]: I1014 13:21:46.298227 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 14 13:21:46.307215 master-2 kubenswrapper[4762]: I1014 13:21:46.307111 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df155e80-7f1a-4919-b7a9-5df5cbb92c27-serving-cert\") pod \"apiserver-84c8b8d745-p4css\" (UID: \"df155e80-7f1a-4919-b7a9-5df5cbb92c27\") " pod="openshift-oauth-apiserver/apiserver-84c8b8d745-p4css" Oct 14 13:21:46.307215 master-2 kubenswrapper[4762]: I1014 13:21:46.307181 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/df155e80-7f1a-4919-b7a9-5df5cbb92c27-encryption-config\") pod \"apiserver-84c8b8d745-p4css\" (UID: \"df155e80-7f1a-4919-b7a9-5df5cbb92c27\") " pod="openshift-oauth-apiserver/apiserver-84c8b8d745-p4css" Oct 14 13:21:46.307215 master-2 kubenswrapper[4762]: I1014 13:21:46.307206 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df155e80-7f1a-4919-b7a9-5df5cbb92c27-trusted-ca-bundle\") pod \"apiserver-84c8b8d745-p4css\" (UID: \"df155e80-7f1a-4919-b7a9-5df5cbb92c27\") " pod="openshift-oauth-apiserver/apiserver-84c8b8d745-p4css" Oct 14 13:21:46.307630 master-2 kubenswrapper[4762]: I1014 13:21:46.307234 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/df155e80-7f1a-4919-b7a9-5df5cbb92c27-etcd-client\") pod \"apiserver-84c8b8d745-p4css\" (UID: \"df155e80-7f1a-4919-b7a9-5df5cbb92c27\") " pod="openshift-oauth-apiserver/apiserver-84c8b8d745-p4css" Oct 14 13:21:46.307630 
master-2 kubenswrapper[4762]: I1014 13:21:46.307256 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/df155e80-7f1a-4919-b7a9-5df5cbb92c27-audit-dir\") pod \"apiserver-84c8b8d745-p4css\" (UID: \"df155e80-7f1a-4919-b7a9-5df5cbb92c27\") " pod="openshift-oauth-apiserver/apiserver-84c8b8d745-p4css" Oct 14 13:21:46.307630 master-2 kubenswrapper[4762]: I1014 13:21:46.307275 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7bzj\" (UniqueName: \"kubernetes.io/projected/df155e80-7f1a-4919-b7a9-5df5cbb92c27-kube-api-access-s7bzj\") pod \"apiserver-84c8b8d745-p4css\" (UID: \"df155e80-7f1a-4919-b7a9-5df5cbb92c27\") " pod="openshift-oauth-apiserver/apiserver-84c8b8d745-p4css" Oct 14 13:21:46.307630 master-2 kubenswrapper[4762]: I1014 13:21:46.307307 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/df155e80-7f1a-4919-b7a9-5df5cbb92c27-etcd-serving-ca\") pod \"apiserver-84c8b8d745-p4css\" (UID: \"df155e80-7f1a-4919-b7a9-5df5cbb92c27\") " pod="openshift-oauth-apiserver/apiserver-84c8b8d745-p4css" Oct 14 13:21:46.307630 master-2 kubenswrapper[4762]: I1014 13:21:46.307328 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/df155e80-7f1a-4919-b7a9-5df5cbb92c27-audit-policies\") pod \"apiserver-84c8b8d745-p4css\" (UID: \"df155e80-7f1a-4919-b7a9-5df5cbb92c27\") " pod="openshift-oauth-apiserver/apiserver-84c8b8d745-p4css" Oct 14 13:21:46.315431 master-2 kubenswrapper[4762]: I1014 13:21:46.315384 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-84c8b8d745-p4css"] Oct 14 13:21:46.340212 master-2 kubenswrapper[4762]: I1014 13:21:46.340110 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-66975b7c4d-j962d"] Oct 14 13:21:46.341388 master-2 kubenswrapper[4762]: I1014 13:21:46.341338 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-66975b7c4d-j962d" Oct 14 13:21:46.344174 master-2 kubenswrapper[4762]: I1014 13:21:46.344109 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76f4d8cd68-t98ml"] Oct 14 13:21:46.344996 master-2 kubenswrapper[4762]: I1014 13:21:46.344977 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 14 13:21:46.345209 master-2 kubenswrapper[4762]: I1014 13:21:46.345152 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-2zbrt" Oct 14 13:21:46.345271 master-2 kubenswrapper[4762]: I1014 13:21:46.345211 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76f4d8cd68-t98ml" Oct 14 13:21:46.345271 master-2 kubenswrapper[4762]: I1014 13:21:46.345228 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 14 13:21:46.345395 master-2 kubenswrapper[4762]: I1014 13:21:46.345344 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 14 13:21:46.346018 master-2 kubenswrapper[4762]: I1014 13:21:46.346002 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 14 13:21:46.346219 master-2 kubenswrapper[4762]: I1014 13:21:46.346183 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 14 13:21:46.348696 master-2 kubenswrapper[4762]: I1014 13:21:46.348630 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 14 13:21:46.349084 master-2 kubenswrapper[4762]: I1014 13:21:46.349045 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 14 13:21:46.349470 master-2 kubenswrapper[4762]: I1014 13:21:46.349408 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 14 13:21:46.349794 master-2 kubenswrapper[4762]: I1014 13:21:46.349776 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-t6l59" Oct 14 13:21:46.349962 master-2 kubenswrapper[4762]: I1014 13:21:46.349935 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 14 13:21:46.350048 master-2 kubenswrapper[4762]: I1014 13:21:46.349807 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 14 13:21:46.357320 master-2 kubenswrapper[4762]: I1014 13:21:46.357294 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 14 13:21:46.374687 master-2 kubenswrapper[4762]: I1014 13:21:46.374655 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-66975b7c4d-j962d"] Oct 14 13:21:46.383271 master-2 kubenswrapper[4762]: I1014 13:21:46.383199 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76f4d8cd68-t98ml"] Oct 14 13:21:46.408263 master-2 kubenswrapper[4762]: I1014 13:21:46.408133 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/df155e80-7f1a-4919-b7a9-5df5cbb92c27-etcd-serving-ca\") pod \"apiserver-84c8b8d745-p4css\" (UID: \"df155e80-7f1a-4919-b7a9-5df5cbb92c27\") " pod="openshift-oauth-apiserver/apiserver-84c8b8d745-p4css" Oct 14 13:21:46.408263 master-2 kubenswrapper[4762]: I1014 13:21:46.408221 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/df155e80-7f1a-4919-b7a9-5df5cbb92c27-audit-policies\") pod \"apiserver-84c8b8d745-p4css\" (UID: \"df155e80-7f1a-4919-b7a9-5df5cbb92c27\") " pod="openshift-oauth-apiserver/apiserver-84c8b8d745-p4css" Oct 14 
13:21:46.408263 master-2 kubenswrapper[4762]: I1014 13:21:46.408265 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df155e80-7f1a-4919-b7a9-5df5cbb92c27-serving-cert\") pod \"apiserver-84c8b8d745-p4css\" (UID: \"df155e80-7f1a-4919-b7a9-5df5cbb92c27\") " pod="openshift-oauth-apiserver/apiserver-84c8b8d745-p4css" Oct 14 13:21:46.408558 master-2 kubenswrapper[4762]: I1014 13:21:46.408305 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/df155e80-7f1a-4919-b7a9-5df5cbb92c27-encryption-config\") pod \"apiserver-84c8b8d745-p4css\" (UID: \"df155e80-7f1a-4919-b7a9-5df5cbb92c27\") " pod="openshift-oauth-apiserver/apiserver-84c8b8d745-p4css" Oct 14 13:21:46.408558 master-2 kubenswrapper[4762]: I1014 13:21:46.408330 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df155e80-7f1a-4919-b7a9-5df5cbb92c27-trusted-ca-bundle\") pod \"apiserver-84c8b8d745-p4css\" (UID: \"df155e80-7f1a-4919-b7a9-5df5cbb92c27\") " pod="openshift-oauth-apiserver/apiserver-84c8b8d745-p4css" Oct 14 13:21:46.408558 master-2 kubenswrapper[4762]: I1014 13:21:46.408374 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/df155e80-7f1a-4919-b7a9-5df5cbb92c27-etcd-client\") pod \"apiserver-84c8b8d745-p4css\" (UID: \"df155e80-7f1a-4919-b7a9-5df5cbb92c27\") " pod="openshift-oauth-apiserver/apiserver-84c8b8d745-p4css" Oct 14 13:21:46.408558 master-2 kubenswrapper[4762]: I1014 13:21:46.408401 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/df155e80-7f1a-4919-b7a9-5df5cbb92c27-audit-dir\") pod \"apiserver-84c8b8d745-p4css\" (UID: \"df155e80-7f1a-4919-b7a9-5df5cbb92c27\") " pod="openshift-oauth-apiserver/apiserver-84c8b8d745-p4css" Oct 14 13:21:46.408558 master-2 kubenswrapper[4762]: I1014 13:21:46.408432 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7bzj\" (UniqueName: \"kubernetes.io/projected/df155e80-7f1a-4919-b7a9-5df5cbb92c27-kube-api-access-s7bzj\") pod \"apiserver-84c8b8d745-p4css\" (UID: \"df155e80-7f1a-4919-b7a9-5df5cbb92c27\") " pod="openshift-oauth-apiserver/apiserver-84c8b8d745-p4css" Oct 14 13:21:46.408913 master-2 kubenswrapper[4762]: I1014 13:21:46.408582 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/df155e80-7f1a-4919-b7a9-5df5cbb92c27-audit-dir\") pod \"apiserver-84c8b8d745-p4css\" (UID: \"df155e80-7f1a-4919-b7a9-5df5cbb92c27\") " pod="openshift-oauth-apiserver/apiserver-84c8b8d745-p4css" Oct 14 13:21:46.409226 master-2 kubenswrapper[4762]: I1014 13:21:46.409184 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df155e80-7f1a-4919-b7a9-5df5cbb92c27-trusted-ca-bundle\") pod \"apiserver-84c8b8d745-p4css\" (UID: \"df155e80-7f1a-4919-b7a9-5df5cbb92c27\") " pod="openshift-oauth-apiserver/apiserver-84c8b8d745-p4css" Oct 14 13:21:46.411921 master-2 kubenswrapper[4762]: I1014 13:21:46.410079 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/df155e80-7f1a-4919-b7a9-5df5cbb92c27-audit-policies\") pod 
\"apiserver-84c8b8d745-p4css\" (UID: \"df155e80-7f1a-4919-b7a9-5df5cbb92c27\") " pod="openshift-oauth-apiserver/apiserver-84c8b8d745-p4css" Oct 14 13:21:46.411921 master-2 kubenswrapper[4762]: I1014 13:21:46.410631 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/df155e80-7f1a-4919-b7a9-5df5cbb92c27-etcd-serving-ca\") pod \"apiserver-84c8b8d745-p4css\" (UID: \"df155e80-7f1a-4919-b7a9-5df5cbb92c27\") " pod="openshift-oauth-apiserver/apiserver-84c8b8d745-p4css" Oct 14 13:21:46.411921 master-2 kubenswrapper[4762]: I1014 13:21:46.411803 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df155e80-7f1a-4919-b7a9-5df5cbb92c27-serving-cert\") pod \"apiserver-84c8b8d745-p4css\" (UID: \"df155e80-7f1a-4919-b7a9-5df5cbb92c27\") " pod="openshift-oauth-apiserver/apiserver-84c8b8d745-p4css" Oct 14 13:21:46.412292 master-2 kubenswrapper[4762]: I1014 13:21:46.412240 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/df155e80-7f1a-4919-b7a9-5df5cbb92c27-encryption-config\") pod \"apiserver-84c8b8d745-p4css\" (UID: \"df155e80-7f1a-4919-b7a9-5df5cbb92c27\") " pod="openshift-oauth-apiserver/apiserver-84c8b8d745-p4css" Oct 14 13:21:46.414475 master-2 kubenswrapper[4762]: I1014 13:21:46.414435 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/df155e80-7f1a-4919-b7a9-5df5cbb92c27-etcd-client\") pod \"apiserver-84c8b8d745-p4css\" (UID: \"df155e80-7f1a-4919-b7a9-5df5cbb92c27\") " pod="openshift-oauth-apiserver/apiserver-84c8b8d745-p4css" Oct 14 13:21:46.426919 master-2 kubenswrapper[4762]: I1014 13:21:46.424466 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-76c4979bdc-gds6w"] Oct 14 13:21:46.426919 master-2 kubenswrapper[4762]: I1014 13:21:46.425135 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-76c4979bdc-gds6w" Oct 14 13:21:46.428264 master-2 kubenswrapper[4762]: I1014 13:21:46.428226 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle" Oct 14 13:21:46.428496 master-2 kubenswrapper[4762]: I1014 13:21:46.428425 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-3mlibjliosje2" Oct 14 13:21:46.428566 master-2 kubenswrapper[4762]: I1014 13:21:46.428544 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs" Oct 14 13:21:46.428644 master-2 kubenswrapper[4762]: I1014 13:21:46.428564 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls" Oct 14 13:21:46.428700 master-2 kubenswrapper[4762]: I1014 13:21:46.428655 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles" Oct 14 13:21:46.429351 master-2 kubenswrapper[4762]: I1014 13:21:46.429312 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-55df5b4c9d-k6sz4"] Oct 14 13:21:46.430204 master-2 kubenswrapper[4762]: I1014 13:21:46.430180 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-55df5b4c9d-k6sz4" Oct 14 13:21:46.431541 master-2 kubenswrapper[4762]: I1014 13:21:46.431262 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-bg9tg" Oct 14 13:21:46.433716 master-2 kubenswrapper[4762]: I1014 13:21:46.433035 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 14 13:21:46.433716 master-2 kubenswrapper[4762]: I1014 13:21:46.433139 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 14 13:21:46.436170 master-2 kubenswrapper[4762]: I1014 13:21:46.436120 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 14 13:21:46.436295 master-2 kubenswrapper[4762]: I1014 13:21:46.436196 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-5vqgl" Oct 14 13:21:46.436295 master-2 kubenswrapper[4762]: I1014 13:21:46.436254 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 14 13:21:46.436295 master-2 kubenswrapper[4762]: I1014 13:21:46.436265 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 14 13:21:46.436565 master-2 kubenswrapper[4762]: I1014 13:21:46.436200 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 14 13:21:46.436565 master-2 kubenswrapper[4762]: I1014 13:21:46.436329 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 14 13:21:46.438355 master-2 kubenswrapper[4762]: I1014 13:21:46.436925 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 14 13:21:46.438355 master-2 kubenswrapper[4762]: I1014 13:21:46.437006 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 14 13:21:46.438355 master-2 kubenswrapper[4762]: I1014 13:21:46.437236 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 14 13:21:46.443700 master-2 kubenswrapper[4762]: I1014 13:21:46.439573 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 14 13:21:46.458459 master-2 kubenswrapper[4762]: I1014 13:21:46.454271 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-6768b5f5f9-6l8p6"] Oct 14 13:21:46.458459 master-2 kubenswrapper[4762]: I1014 13:21:46.455106 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-6768b5f5f9-6l8p6" Oct 14 13:21:46.461863 master-2 kubenswrapper[4762]: I1014 13:21:46.461181 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Oct 14 13:21:46.461863 master-2 kubenswrapper[4762]: I1014 13:21:46.461765 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-76c4979bdc-gds6w"] Oct 14 13:21:46.462952 master-2 kubenswrapper[4762]: I1014 13:21:46.462860 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-hb2t6" Oct 14 13:21:46.462952 master-2 kubenswrapper[4762]: I1014 13:21:46.462927 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Oct 14 13:21:46.463258 master-2 kubenswrapper[4762]: I1014 13:21:46.463037 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Oct 14 13:21:46.463258 master-2 kubenswrapper[4762]: I1014 13:21:46.462860 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Oct 14 13:21:46.464511 master-2 kubenswrapper[4762]: I1014 13:21:46.464492 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 14 13:21:46.465451 master-2 kubenswrapper[4762]: I1014 13:21:46.465421 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7bzj\" (UniqueName: \"kubernetes.io/projected/df155e80-7f1a-4919-b7a9-5df5cbb92c27-kube-api-access-s7bzj\") pod \"apiserver-84c8b8d745-p4css\" (UID: \"df155e80-7f1a-4919-b7a9-5df5cbb92c27\") " pod="openshift-oauth-apiserver/apiserver-84c8b8d745-p4css" Oct 14 13:21:46.465538 master-2 kubenswrapper[4762]: I1014 13:21:46.465482 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 14 13:21:46.466021 master-2 kubenswrapper[4762]: I1014 13:21:46.465988 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-55df5b4c9d-k6sz4"] Oct 14 13:21:46.470247 master-2 kubenswrapper[4762]: I1014 13:21:46.469871 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Oct 14 13:21:46.472404 master-2 kubenswrapper[4762]: I1014 13:21:46.472371 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-75bcf9f5fd-5f2qh"] Oct 14 13:21:46.473084 master-2 kubenswrapper[4762]: I1014 13:21:46.473049 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-75bcf9f5fd-5f2qh" Oct 14 13:21:46.476961 master-2 kubenswrapper[4762]: I1014 13:21:46.475319 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-qhnj6" Oct 14 13:21:46.476961 master-2 kubenswrapper[4762]: I1014 13:21:46.475329 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert" Oct 14 13:21:46.488221 master-2 kubenswrapper[4762]: I1014 13:21:46.486246 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-6768b5f5f9-6l8p6"] Oct 14 13:21:46.501409 master-2 kubenswrapper[4762]: I1014 13:21:46.501347 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-6bc7c56dc6-n46rr"] Oct 14 13:21:46.502352 master-2 kubenswrapper[4762]: I1014 13:21:46.502320 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-6bc7c56dc6-n46rr" Oct 14 13:21:46.504577 master-2 kubenswrapper[4762]: I1014 13:21:46.504542 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85df6bdd68-2dd2d"] Oct 14 13:21:46.504786 master-2 kubenswrapper[4762]: I1014 13:21:46.504749 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-l545q" Oct 14 13:21:46.506518 master-2 kubenswrapper[4762]: I1014 13:21:46.506467 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85df6bdd68-2dd2d" Oct 14 13:21:46.508603 master-2 kubenswrapper[4762]: I1014 13:21:46.508575 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-8fg56"] Oct 14 13:21:46.509302 master-2 kubenswrapper[4762]: I1014 13:21:46.509285 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-8fg56" Oct 14 13:21:46.514595 master-2 kubenswrapper[4762]: I1014 13:21:46.510307 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/75e2cdd2-4fc7-42c9-999c-5f9e50010bc7-client-ca\") pod \"route-controller-manager-76f4d8cd68-t98ml\" (UID: \"75e2cdd2-4fc7-42c9-999c-5f9e50010bc7\") " pod="openshift-route-controller-manager/route-controller-manager-76f4d8cd68-t98ml" Oct 14 13:21:46.514595 master-2 kubenswrapper[4762]: I1014 13:21:46.510386 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xk5d\" (UniqueName: \"kubernetes.io/projected/75e2cdd2-4fc7-42c9-999c-5f9e50010bc7-kube-api-access-5xk5d\") pod \"route-controller-manager-76f4d8cd68-t98ml\" (UID: \"75e2cdd2-4fc7-42c9-999c-5f9e50010bc7\") " pod="openshift-route-controller-manager/route-controller-manager-76f4d8cd68-t98ml" Oct 14 13:21:46.514595 master-2 kubenswrapper[4762]: I1014 13:21:46.510442 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffjvk\" (UniqueName: \"kubernetes.io/projected/058b0ff2-1e70-4446-a498-f94548dfb60f-kube-api-access-ffjvk\") pod \"controller-manager-66975b7c4d-j962d\" (UID: \"058b0ff2-1e70-4446-a498-f94548dfb60f\") " pod="openshift-controller-manager/controller-manager-66975b7c4d-j962d" Oct 14 13:21:46.514595 master-2 kubenswrapper[4762]: I1014 13:21:46.510472 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/058b0ff2-1e70-4446-a498-f94548dfb60f-config\") pod \"controller-manager-66975b7c4d-j962d\" (UID: \"058b0ff2-1e70-4446-a498-f94548dfb60f\") " pod="openshift-controller-manager/controller-manager-66975b7c4d-j962d" Oct 14 13:21:46.514595 master-2 kubenswrapper[4762]: I1014 13:21:46.510505 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75e2cdd2-4fc7-42c9-999c-5f9e50010bc7-config\") pod \"route-controller-manager-76f4d8cd68-t98ml\" (UID: \"75e2cdd2-4fc7-42c9-999c-5f9e50010bc7\") " pod="openshift-route-controller-manager/route-controller-manager-76f4d8cd68-t98ml" Oct 14 13:21:46.514595 master-2 kubenswrapper[4762]: I1014 13:21:46.510532 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/058b0ff2-1e70-4446-a498-f94548dfb60f-serving-cert\") pod \"controller-manager-66975b7c4d-j962d\" (UID: \"058b0ff2-1e70-4446-a498-f94548dfb60f\") " pod="openshift-controller-manager/controller-manager-66975b7c4d-j962d" Oct 14 13:21:46.514595 master-2 kubenswrapper[4762]: I1014 13:21:46.510553 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/058b0ff2-1e70-4446-a498-f94548dfb60f-proxy-ca-bundles\") pod \"controller-manager-66975b7c4d-j962d\" (UID: \"058b0ff2-1e70-4446-a498-f94548dfb60f\") " pod="openshift-controller-manager/controller-manager-66975b7c4d-j962d" Oct 14 13:21:46.514595 master-2 kubenswrapper[4762]: I1014 13:21:46.510568 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/058b0ff2-1e70-4446-a498-f94548dfb60f-client-ca\") pod \"controller-manager-66975b7c4d-j962d\" (UID: \"058b0ff2-1e70-4446-a498-f94548dfb60f\") " pod="openshift-controller-manager/controller-manager-66975b7c4d-j962d" Oct 14 13:21:46.514595 master-2 kubenswrapper[4762]: I1014 13:21:46.510605 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75e2cdd2-4fc7-42c9-999c-5f9e50010bc7-serving-cert\") pod \"route-controller-manager-76f4d8cd68-t98ml\" (UID: \"75e2cdd2-4fc7-42c9-999c-5f9e50010bc7\") " pod="openshift-route-controller-manager/route-controller-manager-76f4d8cd68-t98ml" Oct 14 13:21:46.514595 master-2 kubenswrapper[4762]: I1014 13:21:46.512282 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 14 13:21:46.514595 master-2 kubenswrapper[4762]: I1014 13:21:46.512464 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 14 13:21:46.514595 master-2 kubenswrapper[4762]: I1014 13:21:46.513408 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 14 13:21:46.514595 master-2 kubenswrapper[4762]: I1014 13:21:46.513697 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-svq88" Oct 14 13:21:46.514595 master-2 kubenswrapper[4762]: I1014 13:21:46.513410 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"default-dockercfg-68d7l" Oct 14 13:21:46.514595 master-2 kubenswrapper[4762]: I1014 13:21:46.514414 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 14 13:21:46.516047 master-2 kubenswrapper[4762]: I1014 13:21:46.515594 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 14 13:21:46.518910 master-2 kubenswrapper[4762]: I1014 13:21:46.518880 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-75bcf9f5fd-5f2qh"] Oct 14 13:21:46.521633 master-2 kubenswrapper[4762]: I1014 13:21:46.521589 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-85df6bdd68-2dd2d"] Oct 14 13:21:46.523813 master-2 kubenswrapper[4762]: I1014 13:21:46.523798 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-6bc7c56dc6-n46rr"] Oct 14 13:21:46.608043 master-2 kubenswrapper[4762]: I1014 13:21:46.607976 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-84c8b8d745-p4css" Oct 14 13:21:46.612256 master-2 kubenswrapper[4762]: I1014 13:21:46.612212 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffjvk\" (UniqueName: \"kubernetes.io/projected/058b0ff2-1e70-4446-a498-f94548dfb60f-kube-api-access-ffjvk\") pod \"controller-manager-66975b7c4d-j962d\" (UID: \"058b0ff2-1e70-4446-a498-f94548dfb60f\") " pod="openshift-controller-manager/controller-manager-66975b7c4d-j962d" Oct 14 13:21:46.612328 master-2 kubenswrapper[4762]: I1014 13:21:46.612267 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-audit-policies\") pod \"oauth-openshift-55df5b4c9d-k6sz4\" (UID: \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\") " pod="openshift-authentication/oauth-openshift-55df5b4c9d-k6sz4" Oct 14 13:21:46.612328 master-2 kubenswrapper[4762]: I1014 13:21:46.612293 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/058b0ff2-1e70-4446-a498-f94548dfb60f-config\") pod \"controller-manager-66975b7c4d-j962d\" (UID: \"058b0ff2-1e70-4446-a498-f94548dfb60f\") " pod="openshift-controller-manager/controller-manager-66975b7c4d-j962d" Oct 14 13:21:46.612400 master-2 kubenswrapper[4762]: I1014 13:21:46.612324 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-55df5b4c9d-k6sz4\" (UID: \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\") " pod="openshift-authentication/oauth-openshift-55df5b4c9d-k6sz4" Oct 14 13:21:46.612400 master-2 kubenswrapper[4762]: I1014 13:21:46.612349 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/269169cd-d1e1-47b2-926e-ef8c684424bb-monitoring-plugin-cert\") pod \"monitoring-plugin-75bcf9f5fd-5f2qh\" (UID: \"269169cd-d1e1-47b2-926e-ef8c684424bb\") " pod="openshift-monitoring/monitoring-plugin-75bcf9f5fd-5f2qh" Oct 14 13:21:46.612400 master-2 kubenswrapper[4762]: I1014 13:21:46.612372 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d29d094-ce27-46ec-a556-0129526c1103-trusted-ca\") pod \"console-operator-6768b5f5f9-6l8p6\" (UID: \"7d29d094-ce27-46ec-a556-0129526c1103\") " pod="openshift-console-operator/console-operator-6768b5f5f9-6l8p6" Oct 14 13:21:46.612400 master-2 kubenswrapper[4762]: I1014 13:21:46.612399 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75e2cdd2-4fc7-42c9-999c-5f9e50010bc7-config\") pod \"route-controller-manager-76f4d8cd68-t98ml\" (UID: \"75e2cdd2-4fc7-42c9-999c-5f9e50010bc7\") " pod="openshift-route-controller-manager/route-controller-manager-76f4d8cd68-t98ml" Oct 14 13:21:46.612552 master-2 kubenswrapper[4762]: I1014 13:21:46.612422 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d29d094-ce27-46ec-a556-0129526c1103-config\") pod \"console-operator-6768b5f5f9-6l8p6\" (UID: 
\"7d29d094-ce27-46ec-a556-0129526c1103\") " pod="openshift-console-operator/console-operator-6768b5f5f9-6l8p6" Oct 14 13:21:46.612552 master-2 kubenswrapper[4762]: I1014 13:21:46.612446 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b14dba7b-829d-48e2-a0bb-9eef2303a088-serviceca\") pod \"node-ca-8fg56\" (UID: \"b14dba7b-829d-48e2-a0bb-9eef2303a088\") " pod="openshift-image-registry/node-ca-8fg56" Oct 14 13:21:46.612552 master-2 kubenswrapper[4762]: I1014 13:21:46.612466 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/058b0ff2-1e70-4446-a498-f94548dfb60f-serving-cert\") pod \"controller-manager-66975b7c4d-j962d\" (UID: \"058b0ff2-1e70-4446-a498-f94548dfb60f\") " pod="openshift-controller-manager/controller-manager-66975b7c4d-j962d" Oct 14 13:21:46.612552 master-2 kubenswrapper[4762]: I1014 13:21:46.612490 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/058b0ff2-1e70-4446-a498-f94548dfb60f-proxy-ca-bundles\") pod \"controller-manager-66975b7c4d-j962d\" (UID: \"058b0ff2-1e70-4446-a498-f94548dfb60f\") " pod="openshift-controller-manager/controller-manager-66975b7c4d-j962d" Oct 14 13:21:46.612552 master-2 kubenswrapper[4762]: I1014 13:21:46.612513 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/058b0ff2-1e70-4446-a498-f94548dfb60f-client-ca\") pod \"controller-manager-66975b7c4d-j962d\" (UID: \"058b0ff2-1e70-4446-a498-f94548dfb60f\") " pod="openshift-controller-manager/controller-manager-66975b7c4d-j962d" Oct 14 13:21:46.612552 master-2 kubenswrapper[4762]: I1014 13:21:46.612536 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-55df5b4c9d-k6sz4\" (UID: \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\") " pod="openshift-authentication/oauth-openshift-55df5b4c9d-k6sz4" Oct 14 13:21:46.612815 master-2 kubenswrapper[4762]: I1014 13:21:46.612561 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-745m6\" (UniqueName: \"kubernetes.io/projected/78c69543-957a-4d52-b52f-08bc11cf993c-kube-api-access-745m6\") pod \"multus-admission-controller-6bc7c56dc6-n46rr\" (UID: \"78c69543-957a-4d52-b52f-08bc11cf993c\") " pod="openshift-multus/multus-admission-controller-6bc7c56dc6-n46rr" Oct 14 13:21:46.612815 master-2 kubenswrapper[4762]: I1014 13:21:46.612585 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/632a0df2-e17d-483d-8a41-914ac73e0782-client-ca-bundle\") pod \"metrics-server-76c4979bdc-gds6w\" (UID: \"632a0df2-e17d-483d-8a41-914ac73e0782\") " pod="openshift-monitoring/metrics-server-76c4979bdc-gds6w" Oct 14 13:21:46.612815 master-2 kubenswrapper[4762]: I1014 13:21:46.612613 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-v4-0-config-system-ocp-branding-template\") pod 
\"oauth-openshift-55df5b4c9d-k6sz4\" (UID: \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\") " pod="openshift-authentication/oauth-openshift-55df5b4c9d-k6sz4" Oct 14 13:21:46.612815 master-2 kubenswrapper[4762]: I1014 13:21:46.612637 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-v4-0-config-system-session\") pod \"oauth-openshift-55df5b4c9d-k6sz4\" (UID: \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\") " pod="openshift-authentication/oauth-openshift-55df5b4c9d-k6sz4" Oct 14 13:21:46.612815 master-2 kubenswrapper[4762]: I1014 13:21:46.612662 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-v4-0-config-system-router-certs\") pod \"oauth-openshift-55df5b4c9d-k6sz4\" (UID: \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\") " pod="openshift-authentication/oauth-openshift-55df5b4c9d-k6sz4" Oct 14 13:21:46.612815 master-2 kubenswrapper[4762]: I1014 13:21:46.612687 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/632a0df2-e17d-483d-8a41-914ac73e0782-secret-metrics-server-tls\") pod \"metrics-server-76c4979bdc-gds6w\" (UID: \"632a0df2-e17d-483d-8a41-914ac73e0782\") " pod="openshift-monitoring/metrics-server-76c4979bdc-gds6w" Oct 14 13:21:46.612815 master-2 kubenswrapper[4762]: I1014 13:21:46.612709 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/632a0df2-e17d-483d-8a41-914ac73e0782-secret-metrics-client-certs\") pod \"metrics-server-76c4979bdc-gds6w\" (UID: \"632a0df2-e17d-483d-8a41-914ac73e0782\") " pod="openshift-monitoring/metrics-server-76c4979bdc-gds6w" Oct 14 13:21:46.612815 master-2 kubenswrapper[4762]: I1014 13:21:46.612733 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-55df5b4c9d-k6sz4\" (UID: \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\") " pod="openshift-authentication/oauth-openshift-55df5b4c9d-k6sz4" Oct 14 13:21:46.612815 master-2 kubenswrapper[4762]: I1014 13:21:46.612755 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qqq6\" (UniqueName: \"kubernetes.io/projected/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-kube-api-access-6qqq6\") pod \"oauth-openshift-55df5b4c9d-k6sz4\" (UID: \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\") " pod="openshift-authentication/oauth-openshift-55df5b4c9d-k6sz4" Oct 14 13:21:46.612815 master-2 kubenswrapper[4762]: I1014 13:21:46.612775 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5rf5\" (UniqueName: \"kubernetes.io/projected/b14dba7b-829d-48e2-a0bb-9eef2303a088-kube-api-access-h5rf5\") pod \"node-ca-8fg56\" (UID: \"b14dba7b-829d-48e2-a0bb-9eef2303a088\") " pod="openshift-image-registry/node-ca-8fg56" Oct 14 13:21:46.612815 master-2 kubenswrapper[4762]: I1014 13:21:46.612797 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-th76q\" 
(UniqueName: \"kubernetes.io/projected/7d29d094-ce27-46ec-a556-0129526c1103-kube-api-access-th76q\") pod \"console-operator-6768b5f5f9-6l8p6\" (UID: \"7d29d094-ce27-46ec-a556-0129526c1103\") " pod="openshift-console-operator/console-operator-6768b5f5f9-6l8p6" Oct 14 13:21:46.612815 master-2 kubenswrapper[4762]: I1014 13:21:46.612818 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/632a0df2-e17d-483d-8a41-914ac73e0782-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-76c4979bdc-gds6w\" (UID: \"632a0df2-e17d-483d-8a41-914ac73e0782\") " pod="openshift-monitoring/metrics-server-76c4979bdc-gds6w" Oct 14 13:21:46.613137 master-2 kubenswrapper[4762]: I1014 13:21:46.612840 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/632a0df2-e17d-483d-8a41-914ac73e0782-audit-log\") pod \"metrics-server-76c4979bdc-gds6w\" (UID: \"632a0df2-e17d-483d-8a41-914ac73e0782\") " pod="openshift-monitoring/metrics-server-76c4979bdc-gds6w" Oct 14 13:21:46.613137 master-2 kubenswrapper[4762]: I1014 13:21:46.612868 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75e2cdd2-4fc7-42c9-999c-5f9e50010bc7-serving-cert\") pod \"route-controller-manager-76f4d8cd68-t98ml\" (UID: \"75e2cdd2-4fc7-42c9-999c-5f9e50010bc7\") " pod="openshift-route-controller-manager/route-controller-manager-76f4d8cd68-t98ml" Oct 14 13:21:46.613137 master-2 kubenswrapper[4762]: I1014 13:21:46.612908 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/75e2cdd2-4fc7-42c9-999c-5f9e50010bc7-client-ca\") pod \"route-controller-manager-76f4d8cd68-t98ml\" (UID: \"75e2cdd2-4fc7-42c9-999c-5f9e50010bc7\") " pod="openshift-route-controller-manager/route-controller-manager-76f4d8cd68-t98ml" Oct 14 13:21:46.613137 master-2 kubenswrapper[4762]: I1014 13:21:46.612936 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1be2e9d5-ef1f-4357-a5b5-88bc00663a0b-networking-console-plugin-cert\") pod \"networking-console-plugin-85df6bdd68-2dd2d\" (UID: \"1be2e9d5-ef1f-4357-a5b5-88bc00663a0b\") " pod="openshift-network-console/networking-console-plugin-85df6bdd68-2dd2d" Oct 14 13:21:46.613137 master-2 kubenswrapper[4762]: I1014 13:21:46.612962 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-55df5b4c9d-k6sz4\" (UID: \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\") " pod="openshift-authentication/oauth-openshift-55df5b4c9d-k6sz4" Oct 14 13:21:46.613137 master-2 kubenswrapper[4762]: I1014 13:21:46.612989 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xk5d\" (UniqueName: \"kubernetes.io/projected/75e2cdd2-4fc7-42c9-999c-5f9e50010bc7-kube-api-access-5xk5d\") pod \"route-controller-manager-76f4d8cd68-t98ml\" (UID: \"75e2cdd2-4fc7-42c9-999c-5f9e50010bc7\") " pod="openshift-route-controller-manager/route-controller-manager-76f4d8cd68-t98ml" Oct 14 13:21:46.613137 master-2 kubenswrapper[4762]: I1014 
13:21:46.613011 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d29d094-ce27-46ec-a556-0129526c1103-serving-cert\") pod \"console-operator-6768b5f5f9-6l8p6\" (UID: \"7d29d094-ce27-46ec-a556-0129526c1103\") " pod="openshift-console-operator/console-operator-6768b5f5f9-6l8p6" Oct 14 13:21:46.613137 master-2 kubenswrapper[4762]: I1014 13:21:46.613035 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/78c69543-957a-4d52-b52f-08bc11cf993c-webhook-certs\") pod \"multus-admission-controller-6bc7c56dc6-n46rr\" (UID: \"78c69543-957a-4d52-b52f-08bc11cf993c\") " pod="openshift-multus/multus-admission-controller-6bc7c56dc6-n46rr" Oct 14 13:21:46.613137 master-2 kubenswrapper[4762]: I1014 13:21:46.613059 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-v4-0-config-user-template-login\") pod \"oauth-openshift-55df5b4c9d-k6sz4\" (UID: \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\") " pod="openshift-authentication/oauth-openshift-55df5b4c9d-k6sz4" Oct 14 13:21:46.613137 master-2 kubenswrapper[4762]: I1014 13:21:46.613083 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2sc5\" (UniqueName: \"kubernetes.io/projected/632a0df2-e17d-483d-8a41-914ac73e0782-kube-api-access-j2sc5\") pod \"metrics-server-76c4979bdc-gds6w\" (UID: \"632a0df2-e17d-483d-8a41-914ac73e0782\") " pod="openshift-monitoring/metrics-server-76c4979bdc-gds6w" Oct 14 13:21:46.614700 master-2 kubenswrapper[4762]: I1014 13:21:46.613896 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/1be2e9d5-ef1f-4357-a5b5-88bc00663a0b-nginx-conf\") pod \"networking-console-plugin-85df6bdd68-2dd2d\" (UID: \"1be2e9d5-ef1f-4357-a5b5-88bc00663a0b\") " pod="openshift-network-console/networking-console-plugin-85df6bdd68-2dd2d" Oct 14 13:21:46.614700 master-2 kubenswrapper[4762]: I1014 13:21:46.614196 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/75e2cdd2-4fc7-42c9-999c-5f9e50010bc7-client-ca\") pod \"route-controller-manager-76f4d8cd68-t98ml\" (UID: \"75e2cdd2-4fc7-42c9-999c-5f9e50010bc7\") " pod="openshift-route-controller-manager/route-controller-manager-76f4d8cd68-t98ml" Oct 14 13:21:46.614700 master-2 kubenswrapper[4762]: I1014 13:21:46.614208 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-v4-0-config-user-template-error\") pod \"oauth-openshift-55df5b4c9d-k6sz4\" (UID: \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\") " pod="openshift-authentication/oauth-openshift-55df5b4c9d-k6sz4" Oct 14 13:21:46.614700 master-2 kubenswrapper[4762]: I1014 13:21:46.614479 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-audit-dir\") pod \"oauth-openshift-55df5b4c9d-k6sz4\" (UID: \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\") " 
pod="openshift-authentication/oauth-openshift-55df5b4c9d-k6sz4" Oct 14 13:21:46.614700 master-2 kubenswrapper[4762]: I1014 13:21:46.614512 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-v4-0-config-system-service-ca\") pod \"oauth-openshift-55df5b4c9d-k6sz4\" (UID: \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\") " pod="openshift-authentication/oauth-openshift-55df5b4c9d-k6sz4" Oct 14 13:21:46.614700 master-2 kubenswrapper[4762]: I1014 13:21:46.614537 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b14dba7b-829d-48e2-a0bb-9eef2303a088-host\") pod \"node-ca-8fg56\" (UID: \"b14dba7b-829d-48e2-a0bb-9eef2303a088\") " pod="openshift-image-registry/node-ca-8fg56" Oct 14 13:21:46.614700 master-2 kubenswrapper[4762]: I1014 13:21:46.614582 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/632a0df2-e17d-483d-8a41-914ac73e0782-metrics-server-audit-profiles\") pod \"metrics-server-76c4979bdc-gds6w\" (UID: \"632a0df2-e17d-483d-8a41-914ac73e0782\") " pod="openshift-monitoring/metrics-server-76c4979bdc-gds6w" Oct 14 13:21:46.614700 master-2 kubenswrapper[4762]: I1014 13:21:46.614618 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/058b0ff2-1e70-4446-a498-f94548dfb60f-client-ca\") pod \"controller-manager-66975b7c4d-j962d\" (UID: \"058b0ff2-1e70-4446-a498-f94548dfb60f\") " pod="openshift-controller-manager/controller-manager-66975b7c4d-j962d" Oct 14 13:21:46.614953 master-2 kubenswrapper[4762]: I1014 13:21:46.614743 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75e2cdd2-4fc7-42c9-999c-5f9e50010bc7-config\") pod \"route-controller-manager-76f4d8cd68-t98ml\" (UID: \"75e2cdd2-4fc7-42c9-999c-5f9e50010bc7\") " pod="openshift-route-controller-manager/route-controller-manager-76f4d8cd68-t98ml" Oct 14 13:21:46.615069 master-2 kubenswrapper[4762]: I1014 13:21:46.615034 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/058b0ff2-1e70-4446-a498-f94548dfb60f-config\") pod \"controller-manager-66975b7c4d-j962d\" (UID: \"058b0ff2-1e70-4446-a498-f94548dfb60f\") " pod="openshift-controller-manager/controller-manager-66975b7c4d-j962d" Oct 14 13:21:46.615681 master-2 kubenswrapper[4762]: I1014 13:21:46.615632 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/058b0ff2-1e70-4446-a498-f94548dfb60f-proxy-ca-bundles\") pod \"controller-manager-66975b7c4d-j962d\" (UID: \"058b0ff2-1e70-4446-a498-f94548dfb60f\") " pod="openshift-controller-manager/controller-manager-66975b7c4d-j962d" Oct 14 13:21:46.617539 master-2 kubenswrapper[4762]: I1014 13:21:46.617498 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/058b0ff2-1e70-4446-a498-f94548dfb60f-serving-cert\") pod \"controller-manager-66975b7c4d-j962d\" (UID: \"058b0ff2-1e70-4446-a498-f94548dfb60f\") " pod="openshift-controller-manager/controller-manager-66975b7c4d-j962d" Oct 14 13:21:46.619631 master-2 kubenswrapper[4762]: I1014 
13:21:46.619591 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75e2cdd2-4fc7-42c9-999c-5f9e50010bc7-serving-cert\") pod \"route-controller-manager-76f4d8cd68-t98ml\" (UID: \"75e2cdd2-4fc7-42c9-999c-5f9e50010bc7\") " pod="openshift-route-controller-manager/route-controller-manager-76f4d8cd68-t98ml" Oct 14 13:21:46.636083 master-2 kubenswrapper[4762]: I1014 13:21:46.636038 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffjvk\" (UniqueName: \"kubernetes.io/projected/058b0ff2-1e70-4446-a498-f94548dfb60f-kube-api-access-ffjvk\") pod \"controller-manager-66975b7c4d-j962d\" (UID: \"058b0ff2-1e70-4446-a498-f94548dfb60f\") " pod="openshift-controller-manager/controller-manager-66975b7c4d-j962d" Oct 14 13:21:46.641338 master-2 kubenswrapper[4762]: I1014 13:21:46.641290 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xk5d\" (UniqueName: \"kubernetes.io/projected/75e2cdd2-4fc7-42c9-999c-5f9e50010bc7-kube-api-access-5xk5d\") pod \"route-controller-manager-76f4d8cd68-t98ml\" (UID: \"75e2cdd2-4fc7-42c9-999c-5f9e50010bc7\") " pod="openshift-route-controller-manager/route-controller-manager-76f4d8cd68-t98ml" Oct 14 13:21:46.667564 master-2 kubenswrapper[4762]: I1014 13:21:46.667504 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-66975b7c4d-j962d" Oct 14 13:21:46.680370 master-2 kubenswrapper[4762]: I1014 13:21:46.680179 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76f4d8cd68-t98ml" Oct 14 13:21:46.716896 master-2 kubenswrapper[4762]: I1014 13:21:46.716475 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1be2e9d5-ef1f-4357-a5b5-88bc00663a0b-networking-console-plugin-cert\") pod \"networking-console-plugin-85df6bdd68-2dd2d\" (UID: \"1be2e9d5-ef1f-4357-a5b5-88bc00663a0b\") " pod="openshift-network-console/networking-console-plugin-85df6bdd68-2dd2d" Oct 14 13:21:46.716896 master-2 kubenswrapper[4762]: I1014 13:21:46.716543 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-55df5b4c9d-k6sz4\" (UID: \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\") " pod="openshift-authentication/oauth-openshift-55df5b4c9d-k6sz4" Oct 14 13:21:46.716896 master-2 kubenswrapper[4762]: I1014 13:21:46.716589 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d29d094-ce27-46ec-a556-0129526c1103-serving-cert\") pod \"console-operator-6768b5f5f9-6l8p6\" (UID: \"7d29d094-ce27-46ec-a556-0129526c1103\") " pod="openshift-console-operator/console-operator-6768b5f5f9-6l8p6" Oct 14 13:21:46.716896 master-2 kubenswrapper[4762]: I1014 13:21:46.716622 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/78c69543-957a-4d52-b52f-08bc11cf993c-webhook-certs\") pod \"multus-admission-controller-6bc7c56dc6-n46rr\" (UID: \"78c69543-957a-4d52-b52f-08bc11cf993c\") " pod="openshift-multus/multus-admission-controller-6bc7c56dc6-n46rr" Oct 14 13:21:46.716896 master-2 
kubenswrapper[4762]: I1014 13:21:46.716655 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-v4-0-config-user-template-login\") pod \"oauth-openshift-55df5b4c9d-k6sz4\" (UID: \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\") " pod="openshift-authentication/oauth-openshift-55df5b4c9d-k6sz4" Oct 14 13:21:46.716896 master-2 kubenswrapper[4762]: I1014 13:21:46.716689 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2sc5\" (UniqueName: \"kubernetes.io/projected/632a0df2-e17d-483d-8a41-914ac73e0782-kube-api-access-j2sc5\") pod \"metrics-server-76c4979bdc-gds6w\" (UID: \"632a0df2-e17d-483d-8a41-914ac73e0782\") " pod="openshift-monitoring/metrics-server-76c4979bdc-gds6w" Oct 14 13:21:46.716896 master-2 kubenswrapper[4762]: I1014 13:21:46.716723 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/1be2e9d5-ef1f-4357-a5b5-88bc00663a0b-nginx-conf\") pod \"networking-console-plugin-85df6bdd68-2dd2d\" (UID: \"1be2e9d5-ef1f-4357-a5b5-88bc00663a0b\") " pod="openshift-network-console/networking-console-plugin-85df6bdd68-2dd2d" Oct 14 13:21:46.716896 master-2 kubenswrapper[4762]: I1014 13:21:46.716755 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-v4-0-config-user-template-error\") pod \"oauth-openshift-55df5b4c9d-k6sz4\" (UID: \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\") " pod="openshift-authentication/oauth-openshift-55df5b4c9d-k6sz4" Oct 14 13:21:46.716896 master-2 kubenswrapper[4762]: I1014 13:21:46.716793 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-audit-dir\") pod \"oauth-openshift-55df5b4c9d-k6sz4\" (UID: \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\") " pod="openshift-authentication/oauth-openshift-55df5b4c9d-k6sz4" Oct 14 13:21:46.716896 master-2 kubenswrapper[4762]: I1014 13:21:46.716828 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-v4-0-config-system-service-ca\") pod \"oauth-openshift-55df5b4c9d-k6sz4\" (UID: \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\") " pod="openshift-authentication/oauth-openshift-55df5b4c9d-k6sz4" Oct 14 13:21:46.716896 master-2 kubenswrapper[4762]: I1014 13:21:46.716859 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b14dba7b-829d-48e2-a0bb-9eef2303a088-host\") pod \"node-ca-8fg56\" (UID: \"b14dba7b-829d-48e2-a0bb-9eef2303a088\") " pod="openshift-image-registry/node-ca-8fg56" Oct 14 13:21:46.716896 master-2 kubenswrapper[4762]: I1014 13:21:46.716891 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/632a0df2-e17d-483d-8a41-914ac73e0782-metrics-server-audit-profiles\") pod \"metrics-server-76c4979bdc-gds6w\" (UID: \"632a0df2-e17d-483d-8a41-914ac73e0782\") " pod="openshift-monitoring/metrics-server-76c4979bdc-gds6w" Oct 14 13:21:46.717363 master-2 kubenswrapper[4762]: I1014 13:21:46.716934 4762 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-audit-policies\") pod \"oauth-openshift-55df5b4c9d-k6sz4\" (UID: \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\") " pod="openshift-authentication/oauth-openshift-55df5b4c9d-k6sz4" Oct 14 13:21:46.717363 master-2 kubenswrapper[4762]: I1014 13:21:46.716969 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-55df5b4c9d-k6sz4\" (UID: \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\") " pod="openshift-authentication/oauth-openshift-55df5b4c9d-k6sz4" Oct 14 13:21:46.717363 master-2 kubenswrapper[4762]: I1014 13:21:46.717002 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/269169cd-d1e1-47b2-926e-ef8c684424bb-monitoring-plugin-cert\") pod \"monitoring-plugin-75bcf9f5fd-5f2qh\" (UID: \"269169cd-d1e1-47b2-926e-ef8c684424bb\") " pod="openshift-monitoring/monitoring-plugin-75bcf9f5fd-5f2qh" Oct 14 13:21:46.717363 master-2 kubenswrapper[4762]: I1014 13:21:46.717035 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d29d094-ce27-46ec-a556-0129526c1103-trusted-ca\") pod \"console-operator-6768b5f5f9-6l8p6\" (UID: \"7d29d094-ce27-46ec-a556-0129526c1103\") " pod="openshift-console-operator/console-operator-6768b5f5f9-6l8p6" Oct 14 13:21:46.717363 master-2 kubenswrapper[4762]: I1014 13:21:46.717074 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d29d094-ce27-46ec-a556-0129526c1103-config\") pod \"console-operator-6768b5f5f9-6l8p6\" (UID: \"7d29d094-ce27-46ec-a556-0129526c1103\") " pod="openshift-console-operator/console-operator-6768b5f5f9-6l8p6" Oct 14 13:21:46.717363 master-2 kubenswrapper[4762]: I1014 13:21:46.717104 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b14dba7b-829d-48e2-a0bb-9eef2303a088-serviceca\") pod \"node-ca-8fg56\" (UID: \"b14dba7b-829d-48e2-a0bb-9eef2303a088\") " pod="openshift-image-registry/node-ca-8fg56" Oct 14 13:21:46.717363 master-2 kubenswrapper[4762]: I1014 13:21:46.717140 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-55df5b4c9d-k6sz4\" (UID: \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\") " pod="openshift-authentication/oauth-openshift-55df5b4c9d-k6sz4" Oct 14 13:21:46.717363 master-2 kubenswrapper[4762]: I1014 13:21:46.717153 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-audit-dir\") pod \"oauth-openshift-55df5b4c9d-k6sz4\" (UID: \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\") " pod="openshift-authentication/oauth-openshift-55df5b4c9d-k6sz4" Oct 14 13:21:46.717363 master-2 kubenswrapper[4762]: I1014 13:21:46.717199 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-745m6\" (UniqueName: 
\"kubernetes.io/projected/78c69543-957a-4d52-b52f-08bc11cf993c-kube-api-access-745m6\") pod \"multus-admission-controller-6bc7c56dc6-n46rr\" (UID: \"78c69543-957a-4d52-b52f-08bc11cf993c\") " pod="openshift-multus/multus-admission-controller-6bc7c56dc6-n46rr" Oct 14 13:21:46.717363 master-2 kubenswrapper[4762]: I1014 13:21:46.717232 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/632a0df2-e17d-483d-8a41-914ac73e0782-client-ca-bundle\") pod \"metrics-server-76c4979bdc-gds6w\" (UID: \"632a0df2-e17d-483d-8a41-914ac73e0782\") " pod="openshift-monitoring/metrics-server-76c4979bdc-gds6w" Oct 14 13:21:46.717363 master-2 kubenswrapper[4762]: I1014 13:21:46.717270 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-55df5b4c9d-k6sz4\" (UID: \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\") " pod="openshift-authentication/oauth-openshift-55df5b4c9d-k6sz4" Oct 14 13:21:46.717363 master-2 kubenswrapper[4762]: I1014 13:21:46.717310 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-v4-0-config-system-router-certs\") pod \"oauth-openshift-55df5b4c9d-k6sz4\" (UID: \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\") " pod="openshift-authentication/oauth-openshift-55df5b4c9d-k6sz4" Oct 14 13:21:46.717363 master-2 kubenswrapper[4762]: I1014 13:21:46.717341 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-v4-0-config-system-session\") pod \"oauth-openshift-55df5b4c9d-k6sz4\" (UID: \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\") " pod="openshift-authentication/oauth-openshift-55df5b4c9d-k6sz4" Oct 14 13:21:46.717790 master-2 kubenswrapper[4762]: I1014 13:21:46.717373 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/632a0df2-e17d-483d-8a41-914ac73e0782-secret-metrics-server-tls\") pod \"metrics-server-76c4979bdc-gds6w\" (UID: \"632a0df2-e17d-483d-8a41-914ac73e0782\") " pod="openshift-monitoring/metrics-server-76c4979bdc-gds6w" Oct 14 13:21:46.717790 master-2 kubenswrapper[4762]: I1014 13:21:46.717404 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/632a0df2-e17d-483d-8a41-914ac73e0782-secret-metrics-client-certs\") pod \"metrics-server-76c4979bdc-gds6w\" (UID: \"632a0df2-e17d-483d-8a41-914ac73e0782\") " pod="openshift-monitoring/metrics-server-76c4979bdc-gds6w" Oct 14 13:21:46.717790 master-2 kubenswrapper[4762]: I1014 13:21:46.717438 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-55df5b4c9d-k6sz4\" (UID: \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\") " pod="openshift-authentication/oauth-openshift-55df5b4c9d-k6sz4" Oct 14 13:21:46.717790 master-2 kubenswrapper[4762]: I1014 13:21:46.717469 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-6qqq6\" (UniqueName: \"kubernetes.io/projected/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-kube-api-access-6qqq6\") pod \"oauth-openshift-55df5b4c9d-k6sz4\" (UID: \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\") " pod="openshift-authentication/oauth-openshift-55df5b4c9d-k6sz4" Oct 14 13:21:46.717790 master-2 kubenswrapper[4762]: I1014 13:21:46.717499 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5rf5\" (UniqueName: \"kubernetes.io/projected/b14dba7b-829d-48e2-a0bb-9eef2303a088-kube-api-access-h5rf5\") pod \"node-ca-8fg56\" (UID: \"b14dba7b-829d-48e2-a0bb-9eef2303a088\") " pod="openshift-image-registry/node-ca-8fg56" Oct 14 13:21:46.717790 master-2 kubenswrapper[4762]: I1014 13:21:46.717534 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-th76q\" (UniqueName: \"kubernetes.io/projected/7d29d094-ce27-46ec-a556-0129526c1103-kube-api-access-th76q\") pod \"console-operator-6768b5f5f9-6l8p6\" (UID: \"7d29d094-ce27-46ec-a556-0129526c1103\") " pod="openshift-console-operator/console-operator-6768b5f5f9-6l8p6" Oct 14 13:21:46.717790 master-2 kubenswrapper[4762]: I1014 13:21:46.717564 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/632a0df2-e17d-483d-8a41-914ac73e0782-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-76c4979bdc-gds6w\" (UID: \"632a0df2-e17d-483d-8a41-914ac73e0782\") " pod="openshift-monitoring/metrics-server-76c4979bdc-gds6w" Oct 14 13:21:46.717790 master-2 kubenswrapper[4762]: I1014 13:21:46.717596 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/632a0df2-e17d-483d-8a41-914ac73e0782-audit-log\") pod \"metrics-server-76c4979bdc-gds6w\" (UID: \"632a0df2-e17d-483d-8a41-914ac73e0782\") " pod="openshift-monitoring/metrics-server-76c4979bdc-gds6w" Oct 14 13:21:46.720007 master-2 kubenswrapper[4762]: I1014 13:21:46.718101 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b14dba7b-829d-48e2-a0bb-9eef2303a088-host\") pod \"node-ca-8fg56\" (UID: \"b14dba7b-829d-48e2-a0bb-9eef2303a088\") " pod="openshift-image-registry/node-ca-8fg56" Oct 14 13:21:46.720007 master-2 kubenswrapper[4762]: I1014 13:21:46.718551 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/632a0df2-e17d-483d-8a41-914ac73e0782-audit-log\") pod \"metrics-server-76c4979bdc-gds6w\" (UID: \"632a0df2-e17d-483d-8a41-914ac73e0782\") " pod="openshift-monitoring/metrics-server-76c4979bdc-gds6w" Oct 14 13:21:46.720007 master-2 kubenswrapper[4762]: I1014 13:21:46.718758 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-v4-0-config-system-service-ca\") pod \"oauth-openshift-55df5b4c9d-k6sz4\" (UID: \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\") " pod="openshift-authentication/oauth-openshift-55df5b4c9d-k6sz4" Oct 14 13:21:46.720007 master-2 kubenswrapper[4762]: I1014 13:21:46.719050 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/1be2e9d5-ef1f-4357-a5b5-88bc00663a0b-nginx-conf\") pod \"networking-console-plugin-85df6bdd68-2dd2d\" (UID: \"1be2e9d5-ef1f-4357-a5b5-88bc00663a0b\") " 
pod="openshift-network-console/networking-console-plugin-85df6bdd68-2dd2d" Oct 14 13:21:46.720007 master-2 kubenswrapper[4762]: I1014 13:21:46.719099 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-55df5b4c9d-k6sz4\" (UID: \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\") " pod="openshift-authentication/oauth-openshift-55df5b4c9d-k6sz4" Oct 14 13:21:46.720007 master-2 kubenswrapper[4762]: I1014 13:21:46.719511 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d29d094-ce27-46ec-a556-0129526c1103-config\") pod \"console-operator-6768b5f5f9-6l8p6\" (UID: \"7d29d094-ce27-46ec-a556-0129526c1103\") " pod="openshift-console-operator/console-operator-6768b5f5f9-6l8p6" Oct 14 13:21:46.720007 master-2 kubenswrapper[4762]: I1014 13:21:46.719546 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b14dba7b-829d-48e2-a0bb-9eef2303a088-serviceca\") pod \"node-ca-8fg56\" (UID: \"b14dba7b-829d-48e2-a0bb-9eef2303a088\") " pod="openshift-image-registry/node-ca-8fg56" Oct 14 13:21:46.720007 master-2 kubenswrapper[4762]: I1014 13:21:46.719635 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-audit-policies\") pod \"oauth-openshift-55df5b4c9d-k6sz4\" (UID: \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\") " pod="openshift-authentication/oauth-openshift-55df5b4c9d-k6sz4" Oct 14 13:21:46.720007 master-2 kubenswrapper[4762]: I1014 13:21:46.719915 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/632a0df2-e17d-483d-8a41-914ac73e0782-metrics-server-audit-profiles\") pod \"metrics-server-76c4979bdc-gds6w\" (UID: \"632a0df2-e17d-483d-8a41-914ac73e0782\") " pod="openshift-monitoring/metrics-server-76c4979bdc-gds6w" Oct 14 13:21:46.722217 master-2 kubenswrapper[4762]: I1014 13:21:46.720729 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d29d094-ce27-46ec-a556-0129526c1103-trusted-ca\") pod \"console-operator-6768b5f5f9-6l8p6\" (UID: \"7d29d094-ce27-46ec-a556-0129526c1103\") " pod="openshift-console-operator/console-operator-6768b5f5f9-6l8p6" Oct 14 13:21:46.722217 master-2 kubenswrapper[4762]: I1014 13:21:46.721092 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1be2e9d5-ef1f-4357-a5b5-88bc00663a0b-networking-console-plugin-cert\") pod \"networking-console-plugin-85df6bdd68-2dd2d\" (UID: \"1be2e9d5-ef1f-4357-a5b5-88bc00663a0b\") " pod="openshift-network-console/networking-console-plugin-85df6bdd68-2dd2d" Oct 14 13:21:46.722217 master-2 kubenswrapper[4762]: I1014 13:21:46.721213 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-55df5b4c9d-k6sz4\" (UID: \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\") " pod="openshift-authentication/oauth-openshift-55df5b4c9d-k6sz4" Oct 14 13:21:46.722217 master-2 kubenswrapper[4762]: I1014 
13:21:46.722001 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-v4-0-config-user-template-login\") pod \"oauth-openshift-55df5b4c9d-k6sz4\" (UID: \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\") " pod="openshift-authentication/oauth-openshift-55df5b4c9d-k6sz4" Oct 14 13:21:46.722841 master-2 kubenswrapper[4762]: I1014 13:21:46.722802 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/632a0df2-e17d-483d-8a41-914ac73e0782-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-76c4979bdc-gds6w\" (UID: \"632a0df2-e17d-483d-8a41-914ac73e0782\") " pod="openshift-monitoring/metrics-server-76c4979bdc-gds6w" Oct 14 13:21:46.724699 master-2 kubenswrapper[4762]: I1014 13:21:46.724669 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-v4-0-config-system-router-certs\") pod \"oauth-openshift-55df5b4c9d-k6sz4\" (UID: \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\") " pod="openshift-authentication/oauth-openshift-55df5b4c9d-k6sz4" Oct 14 13:21:46.724768 master-2 kubenswrapper[4762]: I1014 13:21:46.724683 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-55df5b4c9d-k6sz4\" (UID: \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\") " pod="openshift-authentication/oauth-openshift-55df5b4c9d-k6sz4" Oct 14 13:21:46.725187 master-2 kubenswrapper[4762]: I1014 13:21:46.725145 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-55df5b4c9d-k6sz4\" (UID: \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\") " pod="openshift-authentication/oauth-openshift-55df5b4c9d-k6sz4" Oct 14 13:21:46.725261 master-2 kubenswrapper[4762]: I1014 13:21:46.725222 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/632a0df2-e17d-483d-8a41-914ac73e0782-client-ca-bundle\") pod \"metrics-server-76c4979bdc-gds6w\" (UID: \"632a0df2-e17d-483d-8a41-914ac73e0782\") " pod="openshift-monitoring/metrics-server-76c4979bdc-gds6w" Oct 14 13:21:46.725367 master-2 kubenswrapper[4762]: I1014 13:21:46.725331 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-v4-0-config-system-session\") pod \"oauth-openshift-55df5b4c9d-k6sz4\" (UID: \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\") " pod="openshift-authentication/oauth-openshift-55df5b4c9d-k6sz4" Oct 14 13:21:46.725478 master-2 kubenswrapper[4762]: I1014 13:21:46.725450 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/78c69543-957a-4d52-b52f-08bc11cf993c-webhook-certs\") pod \"multus-admission-controller-6bc7c56dc6-n46rr\" (UID: \"78c69543-957a-4d52-b52f-08bc11cf993c\") " pod="openshift-multus/multus-admission-controller-6bc7c56dc6-n46rr" Oct 14 13:21:46.726828 master-2 kubenswrapper[4762]: 
I1014 13:21:46.726703 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/632a0df2-e17d-483d-8a41-914ac73e0782-secret-metrics-client-certs\") pod \"metrics-server-76c4979bdc-gds6w\" (UID: \"632a0df2-e17d-483d-8a41-914ac73e0782\") " pod="openshift-monitoring/metrics-server-76c4979bdc-gds6w" Oct 14 13:21:46.728018 master-2 kubenswrapper[4762]: I1014 13:21:46.727972 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-v4-0-config-user-template-error\") pod \"oauth-openshift-55df5b4c9d-k6sz4\" (UID: \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\") " pod="openshift-authentication/oauth-openshift-55df5b4c9d-k6sz4" Oct 14 13:21:46.728234 master-2 kubenswrapper[4762]: I1014 13:21:46.728204 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/632a0df2-e17d-483d-8a41-914ac73e0782-secret-metrics-server-tls\") pod \"metrics-server-76c4979bdc-gds6w\" (UID: \"632a0df2-e17d-483d-8a41-914ac73e0782\") " pod="openshift-monitoring/metrics-server-76c4979bdc-gds6w" Oct 14 13:21:46.728334 master-2 kubenswrapper[4762]: I1014 13:21:46.728294 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-55df5b4c9d-k6sz4\" (UID: \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\") " pod="openshift-authentication/oauth-openshift-55df5b4c9d-k6sz4" Oct 14 13:21:46.728792 master-2 kubenswrapper[4762]: I1014 13:21:46.728766 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d29d094-ce27-46ec-a556-0129526c1103-serving-cert\") pod \"console-operator-6768b5f5f9-6l8p6\" (UID: \"7d29d094-ce27-46ec-a556-0129526c1103\") " pod="openshift-console-operator/console-operator-6768b5f5f9-6l8p6" Oct 14 13:21:46.730032 master-2 kubenswrapper[4762]: I1014 13:21:46.729987 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/269169cd-d1e1-47b2-926e-ef8c684424bb-monitoring-plugin-cert\") pod \"monitoring-plugin-75bcf9f5fd-5f2qh\" (UID: \"269169cd-d1e1-47b2-926e-ef8c684424bb\") " pod="openshift-monitoring/monitoring-plugin-75bcf9f5fd-5f2qh" Oct 14 13:21:46.738118 master-2 kubenswrapper[4762]: I1014 13:21:46.738042 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-745m6\" (UniqueName: \"kubernetes.io/projected/78c69543-957a-4d52-b52f-08bc11cf993c-kube-api-access-745m6\") pod \"multus-admission-controller-6bc7c56dc6-n46rr\" (UID: \"78c69543-957a-4d52-b52f-08bc11cf993c\") " pod="openshift-multus/multus-admission-controller-6bc7c56dc6-n46rr" Oct 14 13:21:46.745443 master-2 kubenswrapper[4762]: I1014 13:21:46.745076 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2sc5\" (UniqueName: \"kubernetes.io/projected/632a0df2-e17d-483d-8a41-914ac73e0782-kube-api-access-j2sc5\") pod \"metrics-server-76c4979bdc-gds6w\" (UID: \"632a0df2-e17d-483d-8a41-914ac73e0782\") " pod="openshift-monitoring/metrics-server-76c4979bdc-gds6w" Oct 14 13:21:46.751419 master-2 kubenswrapper[4762]: I1014 13:21:46.751366 4762 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-6qqq6\" (UniqueName: \"kubernetes.io/projected/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-kube-api-access-6qqq6\") pod \"oauth-openshift-55df5b4c9d-k6sz4\" (UID: \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\") " pod="openshift-authentication/oauth-openshift-55df5b4c9d-k6sz4" Oct 14 13:21:46.752883 master-2 kubenswrapper[4762]: I1014 13:21:46.752843 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-th76q\" (UniqueName: \"kubernetes.io/projected/7d29d094-ce27-46ec-a556-0129526c1103-kube-api-access-th76q\") pod \"console-operator-6768b5f5f9-6l8p6\" (UID: \"7d29d094-ce27-46ec-a556-0129526c1103\") " pod="openshift-console-operator/console-operator-6768b5f5f9-6l8p6" Oct 14 13:21:46.754602 master-2 kubenswrapper[4762]: I1014 13:21:46.754563 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5rf5\" (UniqueName: \"kubernetes.io/projected/b14dba7b-829d-48e2-a0bb-9eef2303a088-kube-api-access-h5rf5\") pod \"node-ca-8fg56\" (UID: \"b14dba7b-829d-48e2-a0bb-9eef2303a088\") " pod="openshift-image-registry/node-ca-8fg56" Oct 14 13:21:46.800211 master-2 kubenswrapper[4762]: I1014 13:21:46.800135 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-76c4979bdc-gds6w" Oct 14 13:21:46.823525 master-2 kubenswrapper[4762]: I1014 13:21:46.823459 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-55df5b4c9d-k6sz4" Oct 14 13:21:46.828262 master-2 kubenswrapper[4762]: I1014 13:21:46.828224 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-6768b5f5f9-6l8p6" Oct 14 13:21:46.836618 master-2 kubenswrapper[4762]: I1014 13:21:46.832563 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-75bcf9f5fd-5f2qh" Oct 14 13:21:46.846369 master-2 kubenswrapper[4762]: I1014 13:21:46.846335 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-6bc7c56dc6-n46rr" Oct 14 13:21:46.869739 master-2 kubenswrapper[4762]: I1014 13:21:46.864096 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85df6bdd68-2dd2d" Oct 14 13:21:46.869739 master-2 kubenswrapper[4762]: I1014 13:21:46.869335 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-8fg56" Oct 14 13:21:46.903888 master-2 kubenswrapper[4762]: I1014 13:21:46.903510 4762 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 14 13:21:47.028107 master-2 kubenswrapper[4762]: I1014 13:21:47.028060 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-84c8b8d745-p4css"] Oct 14 13:21:47.034319 master-2 kubenswrapper[4762]: W1014 13:21:47.034270 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf155e80_7f1a_4919_b7a9_5df5cbb92c27.slice/crio-0d8df32e5d37443da680373c58a291c75e438acf1328cfa8131c4168ef686ed5 WatchSource:0}: Error finding container 0d8df32e5d37443da680373c58a291c75e438acf1328cfa8131c4168ef686ed5: Status 404 returned error can't find the container with id 0d8df32e5d37443da680373c58a291c75e438acf1328cfa8131c4168ef686ed5 Oct 14 13:21:47.080821 master-2 kubenswrapper[4762]: I1014 13:21:47.080751 4762 patch_prober.go:28] interesting pod/apiserver-595d5f74d8-ttb94 container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.129.0.52:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.129.0.52:8443: connect: connection refused" start-of-body= Oct 14 13:21:47.080886 master-2 kubenswrapper[4762]: I1014 13:21:47.080808 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" podUID="32e55f97-d971-46dd-b6b2-cdab1dc766df" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.129.0.52:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.129.0.52:8443: connect: connection refused" Oct 14 13:21:47.087880 master-2 kubenswrapper[4762]: I1014 13:21:47.087805 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-66975b7c4d-j962d"] Oct 14 13:21:47.100117 master-2 kubenswrapper[4762]: I1014 13:21:47.100071 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76f4d8cd68-t98ml"] Oct 14 13:21:47.102444 master-2 kubenswrapper[4762]: W1014 13:21:47.102405 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod058b0ff2_1e70_4446_a498_f94548dfb60f.slice/crio-93aeb62040e784adfc31664e25e9fc5f0b5e71bef30624b7ca84a50c711033d8 WatchSource:0}: Error finding container 93aeb62040e784adfc31664e25e9fc5f0b5e71bef30624b7ca84a50c711033d8: Status 404 returned error can't find the container with id 93aeb62040e784adfc31664e25e9fc5f0b5e71bef30624b7ca84a50c711033d8 Oct 14 13:21:47.195980 master-2 kubenswrapper[4762]: I1014 13:21:47.195940 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-76c4979bdc-gds6w"] Oct 14 13:21:47.227970 master-2 kubenswrapper[4762]: W1014 13:21:47.227925 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod632a0df2_e17d_483d_8a41_914ac73e0782.slice/crio-00d7ad813c6753c474c71e700f863ab9992ff12595f3fa3a73cfac3aa3564eb9 WatchSource:0}: Error finding container 00d7ad813c6753c474c71e700f863ab9992ff12595f3fa3a73cfac3aa3564eb9: Status 404 returned error can't find the container with id 00d7ad813c6753c474c71e700f863ab9992ff12595f3fa3a73cfac3aa3564eb9 Oct 14 13:21:47.272940 master-2 
kubenswrapper[4762]: I1014 13:21:47.272904 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-6768b5f5f9-6l8p6"] Oct 14 13:21:47.281562 master-2 kubenswrapper[4762]: I1014 13:21:47.281502 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-55df5b4c9d-k6sz4"] Oct 14 13:21:47.295361 master-2 kubenswrapper[4762]: W1014 13:21:47.295291 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfdfcfc2b_8cf3_41c7_a3ca_7482998d0e0e.slice/crio-406aef131c9b7b329fc3a9d41d454194f2f968d298f15a5fceaa019e9656f036 WatchSource:0}: Error finding container 406aef131c9b7b329fc3a9d41d454194f2f968d298f15a5fceaa019e9656f036: Status 404 returned error can't find the container with id 406aef131c9b7b329fc3a9d41d454194f2f968d298f15a5fceaa019e9656f036 Oct 14 13:21:47.366902 master-2 kubenswrapper[4762]: I1014 13:21:47.366705 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-75bcf9f5fd-5f2qh"] Oct 14 13:21:47.369851 master-2 kubenswrapper[4762]: I1014 13:21:47.368644 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-6bc7c56dc6-n46rr"] Oct 14 13:21:47.378477 master-2 kubenswrapper[4762]: I1014 13:21:47.378425 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-85df6bdd68-2dd2d"] Oct 14 13:21:47.814379 master-2 kubenswrapper[4762]: I1014 13:21:47.814319 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-6bc7c56dc6-n46rr" event={"ID":"78c69543-957a-4d52-b52f-08bc11cf993c","Type":"ContainerStarted","Data":"fb71e0732ac157e68b55e9bd57d126fd1ae0a974f5fb47798de414b3cbbbb7fa"} Oct 14 13:21:47.814379 master-2 kubenswrapper[4762]: I1014 13:21:47.814373 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-6bc7c56dc6-n46rr" event={"ID":"78c69543-957a-4d52-b52f-08bc11cf993c","Type":"ContainerStarted","Data":"2beb48f43cac3d821aa97719f7f1578a7996007efe595add76d694a6e2501c2e"} Oct 14 13:21:47.815263 master-2 kubenswrapper[4762]: I1014 13:21:47.815227 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85df6bdd68-2dd2d" event={"ID":"1be2e9d5-ef1f-4357-a5b5-88bc00663a0b","Type":"ContainerStarted","Data":"99ae88cd261ddb6e19457e8cfe807b135f8a0c462673a02b23217b652f8a8ae3"} Oct 14 13:21:47.816599 master-2 kubenswrapper[4762]: I1014 13:21:47.816566 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76f4d8cd68-t98ml" event={"ID":"75e2cdd2-4fc7-42c9-999c-5f9e50010bc7","Type":"ContainerStarted","Data":"31b56d0e37db7c135741716819f2bf077d732605950f8f5192dd60e5eecaccb9"} Oct 14 13:21:47.816663 master-2 kubenswrapper[4762]: I1014 13:21:47.816597 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76f4d8cd68-t98ml" event={"ID":"75e2cdd2-4fc7-42c9-999c-5f9e50010bc7","Type":"ContainerStarted","Data":"ce42342ee02c3e802cfa620e22cf0e6bff4cada65294cfaf4906619687daec04"} Oct 14 13:21:47.817518 master-2 kubenswrapper[4762]: I1014 13:21:47.817461 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-76f4d8cd68-t98ml" Oct 14 13:21:47.819911 
master-2 kubenswrapper[4762]: I1014 13:21:47.819880 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-8fg56" event={"ID":"b14dba7b-829d-48e2-a0bb-9eef2303a088","Type":"ContainerStarted","Data":"e547b372164eb29f5b0c93b1f3aadba7941ea1fc05ae1839207be7179d8ee1ed"} Oct 14 13:21:47.820955 master-2 kubenswrapper[4762]: I1014 13:21:47.820931 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-75bcf9f5fd-5f2qh" event={"ID":"269169cd-d1e1-47b2-926e-ef8c684424bb","Type":"ContainerStarted","Data":"663cc8b3872eedbd1fd17574aaa3083c7d3d3ae2a051c3e18bd3e05ad09cb290"} Oct 14 13:21:47.822323 master-2 kubenswrapper[4762]: I1014 13:21:47.822279 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-6768b5f5f9-6l8p6" event={"ID":"7d29d094-ce27-46ec-a556-0129526c1103","Type":"ContainerStarted","Data":"bf369c41f78565d96691208490b3c60fb24488ff95f6e221fe6434cd2b6e34d1"} Oct 14 13:21:47.822598 master-2 kubenswrapper[4762]: I1014 13:21:47.822560 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-76f4d8cd68-t98ml" Oct 14 13:21:47.823996 master-2 kubenswrapper[4762]: I1014 13:21:47.823955 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-76c4979bdc-gds6w" event={"ID":"632a0df2-e17d-483d-8a41-914ac73e0782","Type":"ContainerStarted","Data":"e3d713e3bb75165c5a849233cc39a5b6b0a2e3aadc92ffc75b78d41262dde409"} Oct 14 13:21:47.823996 master-2 kubenswrapper[4762]: I1014 13:21:47.823993 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-76c4979bdc-gds6w" event={"ID":"632a0df2-e17d-483d-8a41-914ac73e0782","Type":"ContainerStarted","Data":"00d7ad813c6753c474c71e700f863ab9992ff12595f3fa3a73cfac3aa3564eb9"} Oct 14 13:21:47.824139 master-2 kubenswrapper[4762]: I1014 13:21:47.824110 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-76c4979bdc-gds6w" Oct 14 13:21:47.825718 master-2 kubenswrapper[4762]: I1014 13:21:47.825680 4762 generic.go:334] "Generic (PLEG): container finished" podID="df155e80-7f1a-4919-b7a9-5df5cbb92c27" containerID="eb606b021f2b7421df740b33c8a58df29792e7ef7cffd19b438bd855e5061ed9" exitCode=0 Oct 14 13:21:47.826008 master-2 kubenswrapper[4762]: I1014 13:21:47.825858 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-84c8b8d745-p4css" event={"ID":"df155e80-7f1a-4919-b7a9-5df5cbb92c27","Type":"ContainerDied","Data":"eb606b021f2b7421df740b33c8a58df29792e7ef7cffd19b438bd855e5061ed9"} Oct 14 13:21:47.826008 master-2 kubenswrapper[4762]: I1014 13:21:47.825885 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-84c8b8d745-p4css" event={"ID":"df155e80-7f1a-4919-b7a9-5df5cbb92c27","Type":"ContainerStarted","Data":"0d8df32e5d37443da680373c58a291c75e438acf1328cfa8131c4168ef686ed5"} Oct 14 13:21:47.827054 master-2 kubenswrapper[4762]: I1014 13:21:47.827015 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-55df5b4c9d-k6sz4" event={"ID":"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e","Type":"ContainerStarted","Data":"406aef131c9b7b329fc3a9d41d454194f2f968d298f15a5fceaa019e9656f036"} Oct 14 13:21:47.828719 master-2 kubenswrapper[4762]: I1014 13:21:47.828678 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-66975b7c4d-j962d" event={"ID":"058b0ff2-1e70-4446-a498-f94548dfb60f","Type":"ContainerStarted","Data":"9bab3e536969570ee72e6bf05efc6ecd02915cea57317888a08acff985a4a3c1"} Oct 14 13:21:47.828719 master-2 kubenswrapper[4762]: I1014 13:21:47.828713 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66975b7c4d-j962d" event={"ID":"058b0ff2-1e70-4446-a498-f94548dfb60f","Type":"ContainerStarted","Data":"93aeb62040e784adfc31664e25e9fc5f0b5e71bef30624b7ca84a50c711033d8"} Oct 14 13:21:47.829081 master-2 kubenswrapper[4762]: I1014 13:21:47.829049 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-66975b7c4d-j962d" Oct 14 13:21:47.835564 master-2 kubenswrapper[4762]: I1014 13:21:47.835524 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-66975b7c4d-j962d" Oct 14 13:21:47.854122 master-2 kubenswrapper[4762]: I1014 13:21:47.854056 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-76f4d8cd68-t98ml" podStartSLOduration=100.8540128 podStartE2EDuration="1m40.8540128s" podCreationTimestamp="2025-10-14 13:20:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:21:47.845220771 +0000 UTC m=+937.089379940" watchObservedRunningTime="2025-10-14 13:21:47.8540128 +0000 UTC m=+937.098171969" Oct 14 13:21:47.908975 master-2 kubenswrapper[4762]: I1014 13:21:47.908850 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-66975b7c4d-j962d" podStartSLOduration=100.908831823 podStartE2EDuration="1m40.908831823s" podCreationTimestamp="2025-10-14 13:20:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:21:47.90718197 +0000 UTC m=+937.151341129" watchObservedRunningTime="2025-10-14 13:21:47.908831823 +0000 UTC m=+937.152990982" Oct 14 13:21:47.934666 master-2 kubenswrapper[4762]: I1014 13:21:47.932393 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-76c4979bdc-gds6w" podStartSLOduration=419.932374891 podStartE2EDuration="6m59.932374891s" podCreationTimestamp="2025-10-14 13:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:21:47.930395308 +0000 UTC m=+937.174554467" watchObservedRunningTime="2025-10-14 13:21:47.932374891 +0000 UTC m=+937.176534050" Oct 14 13:21:48.840434 master-2 kubenswrapper[4762]: I1014 13:21:48.840358 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-6bc7c56dc6-n46rr" event={"ID":"78c69543-957a-4d52-b52f-08bc11cf993c","Type":"ContainerStarted","Data":"92963b69bf553f3fad4800f5f5f7f274b90402d38a963091269e2a983ba74cd5"} Oct 14 13:21:48.844655 master-2 kubenswrapper[4762]: I1014 13:21:48.844614 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-84c8b8d745-p4css" event={"ID":"df155e80-7f1a-4919-b7a9-5df5cbb92c27","Type":"ContainerStarted","Data":"ba902ca859f0fcfe992aebe277dcc7b1ce0c63eee0caf6006314ea48d7bec6a3"} Oct 14 13:21:48.874592 
master-2 kubenswrapper[4762]: I1014 13:21:48.874501 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-6bc7c56dc6-n46rr" podStartSLOduration=136.874479443 podStartE2EDuration="2m16.874479443s" podCreationTimestamp="2025-10-14 13:19:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:21:48.86997646 +0000 UTC m=+938.114135619" watchObservedRunningTime="2025-10-14 13:21:48.874479443 +0000 UTC m=+938.118638602" Oct 14 13:21:48.900306 master-2 kubenswrapper[4762]: I1014 13:21:48.900229 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-84c8b8d745-p4css" podStartSLOduration=162.90021249 podStartE2EDuration="2m42.90021249s" podCreationTimestamp="2025-10-14 13:19:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:21:48.899577501 +0000 UTC m=+938.143736660" watchObservedRunningTime="2025-10-14 13:21:48.90021249 +0000 UTC m=+938.144371649" Oct 14 13:21:48.914423 master-2 kubenswrapper[4762]: I1014 13:21:48.913469 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/multus-admission-controller-7b6b7bb859-vrzvk"] Oct 14 13:21:48.914423 master-2 kubenswrapper[4762]: I1014 13:21:48.913733 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/multus-admission-controller-7b6b7bb859-vrzvk" podUID="e19cccdb-ac9b-4919-85d8-d7ae33d2d003" containerName="multus-admission-controller" containerID="cri-o://b4d790b1636493087694498f992b9121f3a37a50f6ab46979d6eb4f576d882ee" gracePeriod=30 Oct 14 13:21:48.914423 master-2 kubenswrapper[4762]: I1014 13:21:48.913853 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/multus-admission-controller-7b6b7bb859-vrzvk" podUID="e19cccdb-ac9b-4919-85d8-d7ae33d2d003" containerName="kube-rbac-proxy" containerID="cri-o://c8dfd15a32985bd086d3b821b1137ea95df04ba46fa985bc8180bf62869ea7f9" gracePeriod=30 Oct 14 13:21:49.851527 master-2 kubenswrapper[4762]: I1014 13:21:49.851470 4762 generic.go:334] "Generic (PLEG): container finished" podID="e19cccdb-ac9b-4919-85d8-d7ae33d2d003" containerID="c8dfd15a32985bd086d3b821b1137ea95df04ba46fa985bc8180bf62869ea7f9" exitCode=0 Oct 14 13:21:49.851995 master-2 kubenswrapper[4762]: I1014 13:21:49.851530 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-7b6b7bb859-vrzvk" event={"ID":"e19cccdb-ac9b-4919-85d8-d7ae33d2d003","Type":"ContainerDied","Data":"c8dfd15a32985bd086d3b821b1137ea95df04ba46fa985bc8180bf62869ea7f9"} Oct 14 13:21:50.863185 master-2 kubenswrapper[4762]: I1014 13:21:50.862505 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-6768b5f5f9-6l8p6" event={"ID":"7d29d094-ce27-46ec-a556-0129526c1103","Type":"ContainerStarted","Data":"5662ef281738e442926ab4e18cd04d1ce5220be8c2993be557b22945e7dfbd57"} Oct 14 13:21:50.863185 master-2 kubenswrapper[4762]: I1014 13:21:50.863031 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-6768b5f5f9-6l8p6" Oct 14 13:21:50.867207 master-2 kubenswrapper[4762]: I1014 13:21:50.865800 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
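The pod_startup_latency_tracker entries just above record, for each pod, its creation timestamp, the image-pull window, and the time the kubelet first observed it running. For these pods firstStartedPulling and lastFinishedPulling are the zero time, so podStartSLOduration and podStartE2EDuration coincide, and both equal watchObservedRunningTime minus podCreationTimestamp (13:21:47.8540128 minus 13:20:07 gives the 100.8540128 s reported for route-controller-manager). A minimal check of that arithmetic in Go; the timestamp layout string is my own choice, picked to match how the values are printed here.

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Values copied from the pod_startup_latency_tracker entry for
	// route-controller-manager-76f4d8cd68-t98ml above.
	const layout = "2006-01-02 15:04:05 -0700 MST" // fractional seconds are accepted when parsing

	created, err := time.Parse(layout, "2025-10-14 13:20:07 +0000 UTC")
	if err != nil {
		panic(err)
	}
	observed, err := time.Parse(layout, "2025-10-14 13:21:47.8540128 +0000 UTC")
	if err != nil {
		panic(err)
	}

	// With no image pull recorded, SLO duration and end-to-end duration coincide.
	fmt.Println(observed.Sub(created)) // 1m40.8540128s, i.e. the podStartSLOduration=100.8540128 above
}
```

Where a pull did happen the two figures diverge by the pull window: the console-operator entry further down reports podStartE2EDuration=3m8.896872038s against podStartSLOduration=185.730457083, the difference being its roughly 3.17 s pull between 13:21:47.278 and 13:21:50.444.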
pod="openshift-network-console/networking-console-plugin-85df6bdd68-2dd2d" event={"ID":"1be2e9d5-ef1f-4357-a5b5-88bc00663a0b","Type":"ContainerStarted","Data":"19882b0b0085d8cd8023565c94b72c475aeb795fe5fc7eec1dfeca9666785691"} Oct 14 13:21:50.868120 master-2 kubenswrapper[4762]: I1014 13:21:50.868071 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-8fg56" event={"ID":"b14dba7b-829d-48e2-a0bb-9eef2303a088","Type":"ContainerStarted","Data":"a6712211fa663f11cc06d2f09c27ec76fc938ea7a2a801ba362098b3d8615c32"} Oct 14 13:21:50.870707 master-2 kubenswrapper[4762]: I1014 13:21:50.870669 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-55df5b4c9d-k6sz4" event={"ID":"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e","Type":"ContainerStarted","Data":"63c5eff45b98f1926f070d3bd3c511c51402b0bf8e00b7fa3d2ed2cd7a12fede"} Oct 14 13:21:50.871257 master-2 kubenswrapper[4762]: I1014 13:21:50.871223 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-55df5b4c9d-k6sz4" Oct 14 13:21:50.896997 master-2 kubenswrapper[4762]: I1014 13:21:50.896894 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-6768b5f5f9-6l8p6" podStartSLOduration=185.730457083 podStartE2EDuration="3m8.896872038s" podCreationTimestamp="2025-10-14 13:18:42 +0000 UTC" firstStartedPulling="2025-10-14 13:21:47.278069226 +0000 UTC m=+936.522228385" lastFinishedPulling="2025-10-14 13:21:50.444484141 +0000 UTC m=+939.688643340" observedRunningTime="2025-10-14 13:21:50.892151008 +0000 UTC m=+940.136310197" watchObservedRunningTime="2025-10-14 13:21:50.896872038 +0000 UTC m=+940.141031207" Oct 14 13:21:50.917352 master-2 kubenswrapper[4762]: I1014 13:21:50.917264 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-8fg56" podStartSLOduration=106.396212541 podStartE2EDuration="1m49.917235956s" podCreationTimestamp="2025-10-14 13:20:01 +0000 UTC" firstStartedPulling="2025-10-14 13:21:46.90345653 +0000 UTC m=+936.147615699" lastFinishedPulling="2025-10-14 13:21:50.424479915 +0000 UTC m=+939.668639114" observedRunningTime="2025-10-14 13:21:50.914104646 +0000 UTC m=+940.158263825" watchObservedRunningTime="2025-10-14 13:21:50.917235956 +0000 UTC m=+940.161395115" Oct 14 13:21:51.006390 master-2 kubenswrapper[4762]: I1014 13:21:51.004244 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-85df6bdd68-2dd2d" podStartSLOduration=107.00247054 podStartE2EDuration="1m50.00421209s" podCreationTimestamp="2025-10-14 13:20:01 +0000 UTC" firstStartedPulling="2025-10-14 13:21:47.396144669 +0000 UTC m=+936.640303828" lastFinishedPulling="2025-10-14 13:21:50.397886199 +0000 UTC m=+939.642045378" observedRunningTime="2025-10-14 13:21:50.942528389 +0000 UTC m=+940.186687548" watchObservedRunningTime="2025-10-14 13:21:51.00421209 +0000 UTC m=+940.248371239" Oct 14 13:21:51.014182 master-2 kubenswrapper[4762]: I1014 13:21:51.010647 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-55df5b4c9d-k6sz4" podStartSLOduration=198.884676797 podStartE2EDuration="3m22.010613384s" podCreationTimestamp="2025-10-14 13:18:29 +0000 UTC" firstStartedPulling="2025-10-14 13:21:47.297968609 +0000 UTC m=+936.542127768" lastFinishedPulling="2025-10-14 13:21:50.423905186 +0000 UTC 
m=+939.668064355" observedRunningTime="2025-10-14 13:21:50.995595556 +0000 UTC m=+940.239754715" watchObservedRunningTime="2025-10-14 13:21:51.010613384 +0000 UTC m=+940.254772543" Oct 14 13:21:51.041221 master-2 kubenswrapper[4762]: I1014 13:21:51.041023 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-55df5b4c9d-k6sz4" Oct 14 13:21:51.175646 master-2 kubenswrapper[4762]: I1014 13:21:51.175461 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-6768b5f5f9-6l8p6" Oct 14 13:21:51.435605 master-2 kubenswrapper[4762]: I1014 13:21:51.435473 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-65bb9777fc-sd822"] Oct 14 13:21:51.436175 master-2 kubenswrapper[4762]: I1014 13:21:51.436128 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-65bb9777fc-sd822" Oct 14 13:21:51.438444 master-2 kubenswrapper[4762]: I1014 13:21:51.438401 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-tq8pv" Oct 14 13:21:51.438764 master-2 kubenswrapper[4762]: I1014 13:21:51.438730 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Oct 14 13:21:51.438906 master-2 kubenswrapper[4762]: I1014 13:21:51.438878 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Oct 14 13:21:51.459348 master-2 kubenswrapper[4762]: I1014 13:21:51.459291 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-65bb9777fc-sd822"] Oct 14 13:21:51.594929 master-2 kubenswrapper[4762]: I1014 13:21:51.594862 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78xbt\" (UniqueName: \"kubernetes.io/projected/09d92233-a8b3-458a-8c27-f62e982a9d90-kube-api-access-78xbt\") pod \"downloads-65bb9777fc-sd822\" (UID: \"09d92233-a8b3-458a-8c27-f62e982a9d90\") " pod="openshift-console/downloads-65bb9777fc-sd822" Oct 14 13:21:51.608269 master-2 kubenswrapper[4762]: I1014 13:21:51.608217 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-84c8b8d745-p4css" Oct 14 13:21:51.608269 master-2 kubenswrapper[4762]: I1014 13:21:51.608267 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-84c8b8d745-p4css" Oct 14 13:21:51.620947 master-2 kubenswrapper[4762]: I1014 13:21:51.620908 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-84c8b8d745-p4css" Oct 14 13:21:51.696391 master-2 kubenswrapper[4762]: I1014 13:21:51.696251 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78xbt\" (UniqueName: \"kubernetes.io/projected/09d92233-a8b3-458a-8c27-f62e982a9d90-kube-api-access-78xbt\") pod \"downloads-65bb9777fc-sd822\" (UID: \"09d92233-a8b3-458a-8c27-f62e982a9d90\") " pod="openshift-console/downloads-65bb9777fc-sd822" Oct 14 13:21:51.718289 master-2 kubenswrapper[4762]: I1014 13:21:51.718247 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78xbt\" (UniqueName: \"kubernetes.io/projected/09d92233-a8b3-458a-8c27-f62e982a9d90-kube-api-access-78xbt\") pod \"downloads-65bb9777fc-sd822\" (UID: 
\"09d92233-a8b3-458a-8c27-f62e982a9d90\") " pod="openshift-console/downloads-65bb9777fc-sd822" Oct 14 13:21:51.751253 master-2 kubenswrapper[4762]: I1014 13:21:51.750511 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-65bb9777fc-sd822" Oct 14 13:21:51.887122 master-2 kubenswrapper[4762]: I1014 13:21:51.887058 4762 generic.go:334] "Generic (PLEG): container finished" podID="32e55f97-d971-46dd-b6b2-cdab1dc766df" containerID="eb59a26421ed95409972305df3c5daa73b20d2bede001f3c9ed71c3c125f3dc5" exitCode=0 Oct 14 13:21:51.887567 master-2 kubenswrapper[4762]: I1014 13:21:51.887261 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" event={"ID":"32e55f97-d971-46dd-b6b2-cdab1dc766df","Type":"ContainerDied","Data":"eb59a26421ed95409972305df3c5daa73b20d2bede001f3c9ed71c3c125f3dc5"} Oct 14 13:21:51.896000 master-2 kubenswrapper[4762]: I1014 13:21:51.895940 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-84c8b8d745-p4css" Oct 14 13:21:51.991900 master-2 kubenswrapper[4762]: I1014 13:21:51.991856 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" Oct 14 13:21:52.052253 master-2 kubenswrapper[4762]: I1014 13:21:52.052199 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-5f68d4c887-s2fvb"] Oct 14 13:21:52.052436 master-2 kubenswrapper[4762]: E1014 13:21:52.052425 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32e55f97-d971-46dd-b6b2-cdab1dc766df" containerName="openshift-apiserver" Oct 14 13:21:52.052502 master-2 kubenswrapper[4762]: I1014 13:21:52.052437 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="32e55f97-d971-46dd-b6b2-cdab1dc766df" containerName="openshift-apiserver" Oct 14 13:21:52.052502 master-2 kubenswrapper[4762]: E1014 13:21:52.052447 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32e55f97-d971-46dd-b6b2-cdab1dc766df" containerName="openshift-apiserver-check-endpoints" Oct 14 13:21:52.052502 master-2 kubenswrapper[4762]: I1014 13:21:52.052456 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="32e55f97-d971-46dd-b6b2-cdab1dc766df" containerName="openshift-apiserver-check-endpoints" Oct 14 13:21:52.052502 master-2 kubenswrapper[4762]: E1014 13:21:52.052473 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32e55f97-d971-46dd-b6b2-cdab1dc766df" containerName="fix-audit-permissions" Oct 14 13:21:52.052502 master-2 kubenswrapper[4762]: I1014 13:21:52.052480 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="32e55f97-d971-46dd-b6b2-cdab1dc766df" containerName="fix-audit-permissions" Oct 14 13:21:52.052695 master-2 kubenswrapper[4762]: I1014 13:21:52.052616 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="32e55f97-d971-46dd-b6b2-cdab1dc766df" containerName="openshift-apiserver" Oct 14 13:21:52.052695 master-2 kubenswrapper[4762]: I1014 13:21:52.052636 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="32e55f97-d971-46dd-b6b2-cdab1dc766df" containerName="openshift-apiserver-check-endpoints" Oct 14 13:21:52.053769 master-2 kubenswrapper[4762]: I1014 13:21:52.053721 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" Oct 14 13:21:52.074830 master-2 kubenswrapper[4762]: I1014 13:21:52.074788 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-5f68d4c887-s2fvb"] Oct 14 13:21:52.101223 master-2 kubenswrapper[4762]: I1014 13:21:52.101177 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32e55f97-d971-46dd-b6b2-cdab1dc766df-serving-cert\") pod \"32e55f97-d971-46dd-b6b2-cdab1dc766df\" (UID: \"32e55f97-d971-46dd-b6b2-cdab1dc766df\") " Oct 14 13:21:52.101223 master-2 kubenswrapper[4762]: I1014 13:21:52.101224 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/32e55f97-d971-46dd-b6b2-cdab1dc766df-etcd-serving-ca\") pod \"32e55f97-d971-46dd-b6b2-cdab1dc766df\" (UID: \"32e55f97-d971-46dd-b6b2-cdab1dc766df\") " Oct 14 13:21:52.101464 master-2 kubenswrapper[4762]: I1014 13:21:52.101279 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/32e55f97-d971-46dd-b6b2-cdab1dc766df-audit-dir\") pod \"32e55f97-d971-46dd-b6b2-cdab1dc766df\" (UID: \"32e55f97-d971-46dd-b6b2-cdab1dc766df\") " Oct 14 13:21:52.101464 master-2 kubenswrapper[4762]: I1014 13:21:52.101300 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32e55f97-d971-46dd-b6b2-cdab1dc766df-config\") pod \"32e55f97-d971-46dd-b6b2-cdab1dc766df\" (UID: \"32e55f97-d971-46dd-b6b2-cdab1dc766df\") " Oct 14 13:21:52.101527 master-2 kubenswrapper[4762]: I1014 13:21:52.101455 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/32e55f97-d971-46dd-b6b2-cdab1dc766df-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "32e55f97-d971-46dd-b6b2-cdab1dc766df" (UID: "32e55f97-d971-46dd-b6b2-cdab1dc766df"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:21:52.101932 master-2 kubenswrapper[4762]: I1014 13:21:52.101885 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32e55f97-d971-46dd-b6b2-cdab1dc766df-config" (OuterVolumeSpecName: "config") pod "32e55f97-d971-46dd-b6b2-cdab1dc766df" (UID: "32e55f97-d971-46dd-b6b2-cdab1dc766df"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:21:52.101992 master-2 kubenswrapper[4762]: I1014 13:21:52.101976 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32e55f97-d971-46dd-b6b2-cdab1dc766df-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "32e55f97-d971-46dd-b6b2-cdab1dc766df" (UID: "32e55f97-d971-46dd-b6b2-cdab1dc766df"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:21:52.102130 master-2 kubenswrapper[4762]: I1014 13:21:52.102052 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/32e55f97-d971-46dd-b6b2-cdab1dc766df-node-pullsecrets\") pod \"32e55f97-d971-46dd-b6b2-cdab1dc766df\" (UID: \"32e55f97-d971-46dd-b6b2-cdab1dc766df\") " Oct 14 13:21:52.102130 master-2 kubenswrapper[4762]: I1014 13:21:52.102078 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/32e55f97-d971-46dd-b6b2-cdab1dc766df-audit\") pod \"32e55f97-d971-46dd-b6b2-cdab1dc766df\" (UID: \"32e55f97-d971-46dd-b6b2-cdab1dc766df\") " Oct 14 13:21:52.102130 master-2 kubenswrapper[4762]: I1014 13:21:52.102101 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/32e55f97-d971-46dd-b6b2-cdab1dc766df-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "32e55f97-d971-46dd-b6b2-cdab1dc766df" (UID: "32e55f97-d971-46dd-b6b2-cdab1dc766df"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:21:52.102130 master-2 kubenswrapper[4762]: I1014 13:21:52.102097 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/32e55f97-d971-46dd-b6b2-cdab1dc766df-etcd-client\") pod \"32e55f97-d971-46dd-b6b2-cdab1dc766df\" (UID: \"32e55f97-d971-46dd-b6b2-cdab1dc766df\") " Oct 14 13:21:52.102316 master-2 kubenswrapper[4762]: I1014 13:21:52.102172 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/32e55f97-d971-46dd-b6b2-cdab1dc766df-encryption-config\") pod \"32e55f97-d971-46dd-b6b2-cdab1dc766df\" (UID: \"32e55f97-d971-46dd-b6b2-cdab1dc766df\") " Oct 14 13:21:52.102316 master-2 kubenswrapper[4762]: I1014 13:21:52.102209 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/32e55f97-d971-46dd-b6b2-cdab1dc766df-image-import-ca\") pod \"32e55f97-d971-46dd-b6b2-cdab1dc766df\" (UID: \"32e55f97-d971-46dd-b6b2-cdab1dc766df\") " Oct 14 13:21:52.102316 master-2 kubenswrapper[4762]: I1014 13:21:52.102243 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdjv6\" (UniqueName: \"kubernetes.io/projected/32e55f97-d971-46dd-b6b2-cdab1dc766df-kube-api-access-kdjv6\") pod \"32e55f97-d971-46dd-b6b2-cdab1dc766df\" (UID: \"32e55f97-d971-46dd-b6b2-cdab1dc766df\") " Oct 14 13:21:52.102316 master-2 kubenswrapper[4762]: I1014 13:21:52.102279 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32e55f97-d971-46dd-b6b2-cdab1dc766df-trusted-ca-bundle\") pod \"32e55f97-d971-46dd-b6b2-cdab1dc766df\" (UID: \"32e55f97-d971-46dd-b6b2-cdab1dc766df\") " Oct 14 13:21:52.102841 master-2 kubenswrapper[4762]: I1014 13:21:52.102611 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32e55f97-d971-46dd-b6b2-cdab1dc766df-audit" (OuterVolumeSpecName: "audit") pod "32e55f97-d971-46dd-b6b2-cdab1dc766df" (UID: "32e55f97-d971-46dd-b6b2-cdab1dc766df"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:21:52.102841 master-2 kubenswrapper[4762]: I1014 13:21:52.102815 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32e55f97-d971-46dd-b6b2-cdab1dc766df-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "32e55f97-d971-46dd-b6b2-cdab1dc766df" (UID: "32e55f97-d971-46dd-b6b2-cdab1dc766df"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:21:52.103399 master-2 kubenswrapper[4762]: I1014 13:21:52.103041 4762 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/32e55f97-d971-46dd-b6b2-cdab1dc766df-etcd-serving-ca\") on node \"master-2\" DevicePath \"\"" Oct 14 13:21:52.103399 master-2 kubenswrapper[4762]: I1014 13:21:52.103061 4762 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/32e55f97-d971-46dd-b6b2-cdab1dc766df-audit-dir\") on node \"master-2\" DevicePath \"\"" Oct 14 13:21:52.103399 master-2 kubenswrapper[4762]: I1014 13:21:52.103089 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32e55f97-d971-46dd-b6b2-cdab1dc766df-config\") on node \"master-2\" DevicePath \"\"" Oct 14 13:21:52.103399 master-2 kubenswrapper[4762]: I1014 13:21:52.103115 4762 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/32e55f97-d971-46dd-b6b2-cdab1dc766df-node-pullsecrets\") on node \"master-2\" DevicePath \"\"" Oct 14 13:21:52.103399 master-2 kubenswrapper[4762]: I1014 13:21:52.103135 4762 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/32e55f97-d971-46dd-b6b2-cdab1dc766df-audit\") on node \"master-2\" DevicePath \"\"" Oct 14 13:21:52.103580 master-2 kubenswrapper[4762]: I1014 13:21:52.103482 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32e55f97-d971-46dd-b6b2-cdab1dc766df-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "32e55f97-d971-46dd-b6b2-cdab1dc766df" (UID: "32e55f97-d971-46dd-b6b2-cdab1dc766df"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:21:52.104522 master-2 kubenswrapper[4762]: I1014 13:21:52.104490 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32e55f97-d971-46dd-b6b2-cdab1dc766df-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "32e55f97-d971-46dd-b6b2-cdab1dc766df" (UID: "32e55f97-d971-46dd-b6b2-cdab1dc766df"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:21:52.105358 master-2 kubenswrapper[4762]: I1014 13:21:52.105298 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32e55f97-d971-46dd-b6b2-cdab1dc766df-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "32e55f97-d971-46dd-b6b2-cdab1dc766df" (UID: "32e55f97-d971-46dd-b6b2-cdab1dc766df"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:21:52.105460 master-2 kubenswrapper[4762]: I1014 13:21:52.105422 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32e55f97-d971-46dd-b6b2-cdab1dc766df-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "32e55f97-d971-46dd-b6b2-cdab1dc766df" (UID: "32e55f97-d971-46dd-b6b2-cdab1dc766df"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:21:52.106649 master-2 kubenswrapper[4762]: I1014 13:21:52.106618 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32e55f97-d971-46dd-b6b2-cdab1dc766df-kube-api-access-kdjv6" (OuterVolumeSpecName: "kube-api-access-kdjv6") pod "32e55f97-d971-46dd-b6b2-cdab1dc766df" (UID: "32e55f97-d971-46dd-b6b2-cdab1dc766df"). InnerVolumeSpecName "kube-api-access-kdjv6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:21:52.177340 master-2 kubenswrapper[4762]: I1014 13:21:52.177290 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-65bb9777fc-sd822"] Oct 14 13:21:52.204628 master-2 kubenswrapper[4762]: I1014 13:21:52.204577 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-trusted-ca-bundle\") pod \"apiserver-5f68d4c887-s2fvb\" (UID: \"66ea399c-a47a-41dd-91c2-cceaa9eca5bc\") " pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" Oct 14 13:21:52.204779 master-2 kubenswrapper[4762]: I1014 13:21:52.204654 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-serving-cert\") pod \"apiserver-5f68d4c887-s2fvb\" (UID: \"66ea399c-a47a-41dd-91c2-cceaa9eca5bc\") " pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" Oct 14 13:21:52.204779 master-2 kubenswrapper[4762]: I1014 13:21:52.204687 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-etcd-serving-ca\") pod \"apiserver-5f68d4c887-s2fvb\" (UID: \"66ea399c-a47a-41dd-91c2-cceaa9eca5bc\") " pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" Oct 14 13:21:52.204779 master-2 kubenswrapper[4762]: I1014 13:21:52.204712 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-config\") pod \"apiserver-5f68d4c887-s2fvb\" (UID: \"66ea399c-a47a-41dd-91c2-cceaa9eca5bc\") " pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" Oct 14 13:21:52.204779 master-2 kubenswrapper[4762]: I1014 13:21:52.204731 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-encryption-config\") pod \"apiserver-5f68d4c887-s2fvb\" (UID: \"66ea399c-a47a-41dd-91c2-cceaa9eca5bc\") " pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" Oct 14 13:21:52.205013 master-2 kubenswrapper[4762]: I1014 13:21:52.204942 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-etcd-client\") pod 
\"apiserver-5f68d4c887-s2fvb\" (UID: \"66ea399c-a47a-41dd-91c2-cceaa9eca5bc\") " pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" Oct 14 13:21:52.205076 master-2 kubenswrapper[4762]: I1014 13:21:52.205064 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-audit-dir\") pod \"apiserver-5f68d4c887-s2fvb\" (UID: \"66ea399c-a47a-41dd-91c2-cceaa9eca5bc\") " pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" Oct 14 13:21:52.205127 master-2 kubenswrapper[4762]: I1014 13:21:52.205102 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-node-pullsecrets\") pod \"apiserver-5f68d4c887-s2fvb\" (UID: \"66ea399c-a47a-41dd-91c2-cceaa9eca5bc\") " pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" Oct 14 13:21:52.205186 master-2 kubenswrapper[4762]: I1014 13:21:52.205140 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp4xc\" (UniqueName: \"kubernetes.io/projected/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-kube-api-access-fp4xc\") pod \"apiserver-5f68d4c887-s2fvb\" (UID: \"66ea399c-a47a-41dd-91c2-cceaa9eca5bc\") " pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" Oct 14 13:21:52.205245 master-2 kubenswrapper[4762]: I1014 13:21:52.205208 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-audit\") pod \"apiserver-5f68d4c887-s2fvb\" (UID: \"66ea399c-a47a-41dd-91c2-cceaa9eca5bc\") " pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" Oct 14 13:21:52.205294 master-2 kubenswrapper[4762]: I1014 13:21:52.205276 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-image-import-ca\") pod \"apiserver-5f68d4c887-s2fvb\" (UID: \"66ea399c-a47a-41dd-91c2-cceaa9eca5bc\") " pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" Oct 14 13:21:52.205419 master-2 kubenswrapper[4762]: I1014 13:21:52.205389 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdjv6\" (UniqueName: \"kubernetes.io/projected/32e55f97-d971-46dd-b6b2-cdab1dc766df-kube-api-access-kdjv6\") on node \"master-2\" DevicePath \"\"" Oct 14 13:21:52.205419 master-2 kubenswrapper[4762]: I1014 13:21:52.205410 4762 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32e55f97-d971-46dd-b6b2-cdab1dc766df-trusted-ca-bundle\") on node \"master-2\" DevicePath \"\"" Oct 14 13:21:52.205479 master-2 kubenswrapper[4762]: I1014 13:21:52.205424 4762 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32e55f97-d971-46dd-b6b2-cdab1dc766df-serving-cert\") on node \"master-2\" DevicePath \"\"" Oct 14 13:21:52.205479 master-2 kubenswrapper[4762]: I1014 13:21:52.205440 4762 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/32e55f97-d971-46dd-b6b2-cdab1dc766df-etcd-client\") on node \"master-2\" DevicePath \"\"" Oct 14 13:21:52.205479 master-2 kubenswrapper[4762]: I1014 13:21:52.205453 4762 reconciler_common.go:293] "Volume detached for volume 
\"encryption-config\" (UniqueName: \"kubernetes.io/secret/32e55f97-d971-46dd-b6b2-cdab1dc766df-encryption-config\") on node \"master-2\" DevicePath \"\"" Oct 14 13:21:52.205479 master-2 kubenswrapper[4762]: I1014 13:21:52.205463 4762 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/32e55f97-d971-46dd-b6b2-cdab1dc766df-image-import-ca\") on node \"master-2\" DevicePath \"\"" Oct 14 13:21:52.305972 master-2 kubenswrapper[4762]: I1014 13:21:52.305857 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-config\") pod \"apiserver-5f68d4c887-s2fvb\" (UID: \"66ea399c-a47a-41dd-91c2-cceaa9eca5bc\") " pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" Oct 14 13:21:52.305972 master-2 kubenswrapper[4762]: I1014 13:21:52.305915 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-encryption-config\") pod \"apiserver-5f68d4c887-s2fvb\" (UID: \"66ea399c-a47a-41dd-91c2-cceaa9eca5bc\") " pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" Oct 14 13:21:52.305972 master-2 kubenswrapper[4762]: I1014 13:21:52.305949 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-etcd-client\") pod \"apiserver-5f68d4c887-s2fvb\" (UID: \"66ea399c-a47a-41dd-91c2-cceaa9eca5bc\") " pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" Oct 14 13:21:52.305972 master-2 kubenswrapper[4762]: I1014 13:21:52.305975 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-audit-dir\") pod \"apiserver-5f68d4c887-s2fvb\" (UID: \"66ea399c-a47a-41dd-91c2-cceaa9eca5bc\") " pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" Oct 14 13:21:52.306347 master-2 kubenswrapper[4762]: I1014 13:21:52.306003 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-node-pullsecrets\") pod \"apiserver-5f68d4c887-s2fvb\" (UID: \"66ea399c-a47a-41dd-91c2-cceaa9eca5bc\") " pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" Oct 14 13:21:52.306347 master-2 kubenswrapper[4762]: I1014 13:21:52.306027 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp4xc\" (UniqueName: \"kubernetes.io/projected/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-kube-api-access-fp4xc\") pod \"apiserver-5f68d4c887-s2fvb\" (UID: \"66ea399c-a47a-41dd-91c2-cceaa9eca5bc\") " pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" Oct 14 13:21:52.306347 master-2 kubenswrapper[4762]: I1014 13:21:52.306043 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-audit\") pod \"apiserver-5f68d4c887-s2fvb\" (UID: \"66ea399c-a47a-41dd-91c2-cceaa9eca5bc\") " pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" Oct 14 13:21:52.306347 master-2 kubenswrapper[4762]: I1014 13:21:52.306065 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-image-import-ca\") pod 
\"apiserver-5f68d4c887-s2fvb\" (UID: \"66ea399c-a47a-41dd-91c2-cceaa9eca5bc\") " pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" Oct 14 13:21:52.306347 master-2 kubenswrapper[4762]: I1014 13:21:52.306088 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-trusted-ca-bundle\") pod \"apiserver-5f68d4c887-s2fvb\" (UID: \"66ea399c-a47a-41dd-91c2-cceaa9eca5bc\") " pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" Oct 14 13:21:52.306347 master-2 kubenswrapper[4762]: I1014 13:21:52.306111 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-serving-cert\") pod \"apiserver-5f68d4c887-s2fvb\" (UID: \"66ea399c-a47a-41dd-91c2-cceaa9eca5bc\") " pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" Oct 14 13:21:52.306347 master-2 kubenswrapper[4762]: I1014 13:21:52.306134 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-etcd-serving-ca\") pod \"apiserver-5f68d4c887-s2fvb\" (UID: \"66ea399c-a47a-41dd-91c2-cceaa9eca5bc\") " pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" Oct 14 13:21:52.306571 master-2 kubenswrapper[4762]: I1014 13:21:52.306356 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-node-pullsecrets\") pod \"apiserver-5f68d4c887-s2fvb\" (UID: \"66ea399c-a47a-41dd-91c2-cceaa9eca5bc\") " pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" Oct 14 13:21:52.306847 master-2 kubenswrapper[4762]: I1014 13:21:52.306777 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-audit-dir\") pod \"apiserver-5f68d4c887-s2fvb\" (UID: \"66ea399c-a47a-41dd-91c2-cceaa9eca5bc\") " pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" Oct 14 13:21:52.306910 master-2 kubenswrapper[4762]: I1014 13:21:52.306869 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-etcd-serving-ca\") pod \"apiserver-5f68d4c887-s2fvb\" (UID: \"66ea399c-a47a-41dd-91c2-cceaa9eca5bc\") " pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" Oct 14 13:21:52.307896 master-2 kubenswrapper[4762]: I1014 13:21:52.307838 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-audit\") pod \"apiserver-5f68d4c887-s2fvb\" (UID: \"66ea399c-a47a-41dd-91c2-cceaa9eca5bc\") " pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" Oct 14 13:21:52.307970 master-2 kubenswrapper[4762]: I1014 13:21:52.307919 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-config\") pod \"apiserver-5f68d4c887-s2fvb\" (UID: \"66ea399c-a47a-41dd-91c2-cceaa9eca5bc\") " pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" Oct 14 13:21:52.308282 master-2 kubenswrapper[4762]: I1014 13:21:52.308242 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-trusted-ca-bundle\") pod \"apiserver-5f68d4c887-s2fvb\" (UID: \"66ea399c-a47a-41dd-91c2-cceaa9eca5bc\") " pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" Oct 14 13:21:52.308602 master-2 kubenswrapper[4762]: I1014 13:21:52.308566 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-image-import-ca\") pod \"apiserver-5f68d4c887-s2fvb\" (UID: \"66ea399c-a47a-41dd-91c2-cceaa9eca5bc\") " pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" Oct 14 13:21:52.310568 master-2 kubenswrapper[4762]: I1014 13:21:52.310535 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-etcd-client\") pod \"apiserver-5f68d4c887-s2fvb\" (UID: \"66ea399c-a47a-41dd-91c2-cceaa9eca5bc\") " pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" Oct 14 13:21:52.310857 master-2 kubenswrapper[4762]: I1014 13:21:52.310812 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-serving-cert\") pod \"apiserver-5f68d4c887-s2fvb\" (UID: \"66ea399c-a47a-41dd-91c2-cceaa9eca5bc\") " pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" Oct 14 13:21:52.312093 master-2 kubenswrapper[4762]: I1014 13:21:52.312045 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-encryption-config\") pod \"apiserver-5f68d4c887-s2fvb\" (UID: \"66ea399c-a47a-41dd-91c2-cceaa9eca5bc\") " pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" Oct 14 13:21:52.337062 master-2 kubenswrapper[4762]: I1014 13:21:52.337024 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp4xc\" (UniqueName: \"kubernetes.io/projected/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-kube-api-access-fp4xc\") pod \"apiserver-5f68d4c887-s2fvb\" (UID: \"66ea399c-a47a-41dd-91c2-cceaa9eca5bc\") " pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" Oct 14 13:21:52.367191 master-2 kubenswrapper[4762]: I1014 13:21:52.366994 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" Oct 14 13:21:52.627477 master-2 kubenswrapper[4762]: I1014 13:21:52.627306 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-2" Oct 14 13:21:52.769236 master-2 kubenswrapper[4762]: I1014 13:21:52.769020 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-5f68d4c887-s2fvb"] Oct 14 13:21:52.898192 master-2 kubenswrapper[4762]: I1014 13:21:52.897884 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" event={"ID":"32e55f97-d971-46dd-b6b2-cdab1dc766df","Type":"ContainerDied","Data":"62be16fcf39ee2382a9e4c402c51b828b8370c4a3b58b7817b8a36cb87501988"} Oct 14 13:21:52.898192 master-2 kubenswrapper[4762]: I1014 13:21:52.897998 4762 scope.go:117] "RemoveContainer" containerID="475c7fe94c7689d429a499bdcf69b6cc227826fdedac1115b9592805f384a109" Oct 14 13:21:52.898192 master-2 kubenswrapper[4762]: I1014 13:21:52.898048 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-595d5f74d8-ttb94" Oct 14 13:21:52.901799 master-2 kubenswrapper[4762]: I1014 13:21:52.901377 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-65bb9777fc-sd822" event={"ID":"09d92233-a8b3-458a-8c27-f62e982a9d90","Type":"ContainerStarted","Data":"c35711fa1b0a4aec24a1a563da1a92153513e5ff75c54cf2ce69c0389d8bda48"} Oct 14 13:21:52.952372 master-2 kubenswrapper[4762]: I1014 13:21:52.952312 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-apiserver/apiserver-595d5f74d8-ttb94"] Oct 14 13:21:52.955630 master-2 kubenswrapper[4762]: I1014 13:21:52.955569 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-apiserver/apiserver-595d5f74d8-ttb94"] Oct 14 13:21:53.074490 master-2 kubenswrapper[4762]: I1014 13:21:53.074289 4762 scope.go:117] "RemoveContainer" containerID="eb59a26421ed95409972305df3c5daa73b20d2bede001f3c9ed71c3c125f3dc5" Oct 14 13:21:53.099553 master-2 kubenswrapper[4762]: I1014 13:21:53.099503 4762 scope.go:117] "RemoveContainer" containerID="38275d8be284d8541f9671a3e61d6f7cd701a29cf8a0e5b5642bff4e6f23d6c1" Oct 14 13:21:53.557871 master-2 kubenswrapper[4762]: I1014 13:21:53.557780 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32e55f97-d971-46dd-b6b2-cdab1dc766df" path="/var/lib/kubelet/pods/32e55f97-d971-46dd-b6b2-cdab1dc766df/volumes" Oct 14 13:21:53.908583 master-2 kubenswrapper[4762]: I1014 13:21:53.908442 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-75bcf9f5fd-5f2qh" event={"ID":"269169cd-d1e1-47b2-926e-ef8c684424bb","Type":"ContainerStarted","Data":"1b83879cb457eabaf4405689dab6c061954d49c49ee3398e7c7ea9193758d864"} Oct 14 13:21:53.909149 master-2 kubenswrapper[4762]: I1014 13:21:53.908806 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/monitoring-plugin-75bcf9f5fd-5f2qh" Oct 14 13:21:53.912306 master-2 kubenswrapper[4762]: I1014 13:21:53.911898 4762 generic.go:334] "Generic (PLEG): container finished" podID="66ea399c-a47a-41dd-91c2-cceaa9eca5bc" containerID="4957b14c39a7a14c5468856d5c404d05fdd96e3a24d504246d65efab9453120c" exitCode=0 Oct 14 13:21:53.912306 master-2 kubenswrapper[4762]: I1014 13:21:53.911932 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" event={"ID":"66ea399c-a47a-41dd-91c2-cceaa9eca5bc","Type":"ContainerDied","Data":"4957b14c39a7a14c5468856d5c404d05fdd96e3a24d504246d65efab9453120c"} Oct 14 13:21:53.912306 master-2 kubenswrapper[4762]: I1014 13:21:53.911951 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" event={"ID":"66ea399c-a47a-41dd-91c2-cceaa9eca5bc","Type":"ContainerStarted","Data":"48bf4594f83f895ff84f955852c223ad410d2d6529063ce059e62b67cab1e8b4"} Oct 14 13:21:53.913274 master-2 kubenswrapper[4762]: I1014 13:21:53.913241 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-75bcf9f5fd-5f2qh" Oct 14 13:21:53.933239 master-2 kubenswrapper[4762]: I1014 13:21:53.933147 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-75bcf9f5fd-5f2qh" podStartSLOduration=181.164873501 podStartE2EDuration="3m6.933129377s" podCreationTimestamp="2025-10-14 13:18:47 +0000 UTC" firstStartedPulling="2025-10-14 13:21:47.394672082 +0000 UTC m=+936.638831241" 
lastFinishedPulling="2025-10-14 13:21:53.162927968 +0000 UTC m=+942.407087117" observedRunningTime="2025-10-14 13:21:53.931197975 +0000 UTC m=+943.175357174" watchObservedRunningTime="2025-10-14 13:21:53.933129377 +0000 UTC m=+943.177288536" Oct 14 13:21:54.306877 master-2 kubenswrapper[4762]: I1014 13:21:54.306801 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-668956f9dd-llkhv"] Oct 14 13:21:54.310143 master-2 kubenswrapper[4762]: I1014 13:21:54.310110 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-668956f9dd-llkhv" Oct 14 13:21:54.315744 master-2 kubenswrapper[4762]: I1014 13:21:54.315696 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Oct 14 13:21:54.315946 master-2 kubenswrapper[4762]: I1014 13:21:54.315927 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-r2r7j" Oct 14 13:21:54.316125 master-2 kubenswrapper[4762]: I1014 13:21:54.316103 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Oct 14 13:21:54.320337 master-2 kubenswrapper[4762]: I1014 13:21:54.319054 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-668956f9dd-llkhv"] Oct 14 13:21:54.320337 master-2 kubenswrapper[4762]: I1014 13:21:54.319413 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 14 13:21:54.320337 master-2 kubenswrapper[4762]: I1014 13:21:54.319624 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Oct 14 13:21:54.323670 master-2 kubenswrapper[4762]: I1014 13:21:54.323589 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 14 13:21:54.434062 master-2 kubenswrapper[4762]: I1014 13:21:54.433608 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3-console-config\") pod \"console-668956f9dd-llkhv\" (UID: \"5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3\") " pod="openshift-console/console-668956f9dd-llkhv" Oct 14 13:21:54.434062 master-2 kubenswrapper[4762]: I1014 13:21:54.433673 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3-console-oauth-config\") pod \"console-668956f9dd-llkhv\" (UID: \"5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3\") " pod="openshift-console/console-668956f9dd-llkhv" Oct 14 13:21:54.434062 master-2 kubenswrapper[4762]: I1014 13:21:54.434024 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3-oauth-serving-cert\") pod \"console-668956f9dd-llkhv\" (UID: \"5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3\") " pod="openshift-console/console-668956f9dd-llkhv" Oct 14 13:21:54.434321 master-2 kubenswrapper[4762]: I1014 13:21:54.434203 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3-service-ca\") pod \"console-668956f9dd-llkhv\" (UID: \"5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3\") " 
pod="openshift-console/console-668956f9dd-llkhv" Oct 14 13:21:54.434321 master-2 kubenswrapper[4762]: I1014 13:21:54.434249 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mn89\" (UniqueName: \"kubernetes.io/projected/5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3-kube-api-access-6mn89\") pod \"console-668956f9dd-llkhv\" (UID: \"5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3\") " pod="openshift-console/console-668956f9dd-llkhv" Oct 14 13:21:54.434743 master-2 kubenswrapper[4762]: I1014 13:21:54.434485 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3-console-serving-cert\") pod \"console-668956f9dd-llkhv\" (UID: \"5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3\") " pod="openshift-console/console-668956f9dd-llkhv" Oct 14 13:21:54.539177 master-2 kubenswrapper[4762]: I1014 13:21:54.535939 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3-console-config\") pod \"console-668956f9dd-llkhv\" (UID: \"5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3\") " pod="openshift-console/console-668956f9dd-llkhv" Oct 14 13:21:54.539177 master-2 kubenswrapper[4762]: I1014 13:21:54.535990 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3-console-oauth-config\") pod \"console-668956f9dd-llkhv\" (UID: \"5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3\") " pod="openshift-console/console-668956f9dd-llkhv" Oct 14 13:21:54.539177 master-2 kubenswrapper[4762]: I1014 13:21:54.536019 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3-oauth-serving-cert\") pod \"console-668956f9dd-llkhv\" (UID: \"5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3\") " pod="openshift-console/console-668956f9dd-llkhv" Oct 14 13:21:54.539177 master-2 kubenswrapper[4762]: I1014 13:21:54.536045 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3-service-ca\") pod \"console-668956f9dd-llkhv\" (UID: \"5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3\") " pod="openshift-console/console-668956f9dd-llkhv" Oct 14 13:21:54.539177 master-2 kubenswrapper[4762]: I1014 13:21:54.536065 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mn89\" (UniqueName: \"kubernetes.io/projected/5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3-kube-api-access-6mn89\") pod \"console-668956f9dd-llkhv\" (UID: \"5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3\") " pod="openshift-console/console-668956f9dd-llkhv" Oct 14 13:21:54.539177 master-2 kubenswrapper[4762]: I1014 13:21:54.536088 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3-console-serving-cert\") pod \"console-668956f9dd-llkhv\" (UID: \"5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3\") " pod="openshift-console/console-668956f9dd-llkhv" Oct 14 13:21:54.539177 master-2 kubenswrapper[4762]: I1014 13:21:54.537091 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3-console-config\") pod \"console-668956f9dd-llkhv\" (UID: \"5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3\") " pod="openshift-console/console-668956f9dd-llkhv" Oct 14 13:21:54.539177 master-2 kubenswrapper[4762]: I1014 13:21:54.537583 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3-service-ca\") pod \"console-668956f9dd-llkhv\" (UID: \"5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3\") " pod="openshift-console/console-668956f9dd-llkhv" Oct 14 13:21:54.539177 master-2 kubenswrapper[4762]: I1014 13:21:54.538125 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3-oauth-serving-cert\") pod \"console-668956f9dd-llkhv\" (UID: \"5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3\") " pod="openshift-console/console-668956f9dd-llkhv" Oct 14 13:21:54.540374 master-2 kubenswrapper[4762]: I1014 13:21:54.539769 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3-console-oauth-config\") pod \"console-668956f9dd-llkhv\" (UID: \"5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3\") " pod="openshift-console/console-668956f9dd-llkhv" Oct 14 13:21:54.541009 master-2 kubenswrapper[4762]: I1014 13:21:54.540869 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3-console-serving-cert\") pod \"console-668956f9dd-llkhv\" (UID: \"5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3\") " pod="openshift-console/console-668956f9dd-llkhv" Oct 14 13:21:54.571200 master-2 kubenswrapper[4762]: I1014 13:21:54.570120 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mn89\" (UniqueName: \"kubernetes.io/projected/5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3-kube-api-access-6mn89\") pod \"console-668956f9dd-llkhv\" (UID: \"5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3\") " pod="openshift-console/console-668956f9dd-llkhv" Oct 14 13:21:54.631930 master-2 kubenswrapper[4762]: I1014 13:21:54.631836 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-668956f9dd-llkhv" Oct 14 13:21:54.923059 master-2 kubenswrapper[4762]: I1014 13:21:54.922987 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" event={"ID":"66ea399c-a47a-41dd-91c2-cceaa9eca5bc","Type":"ContainerStarted","Data":"d89425db738bd6749394bbca910f6e699c0f8f09d446d167a2fb2d243190660a"} Oct 14 13:21:54.923059 master-2 kubenswrapper[4762]: I1014 13:21:54.923048 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" event={"ID":"66ea399c-a47a-41dd-91c2-cceaa9eca5bc","Type":"ContainerStarted","Data":"e7c105b3a9512339a854f05952e24e3ae0a7fe244eabe4dbb4a9cda51aea016d"} Oct 14 13:21:54.959920 master-2 kubenswrapper[4762]: I1014 13:21:54.959818 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" podStartSLOduration=113.959790076 podStartE2EDuration="1m53.959790076s" podCreationTimestamp="2025-10-14 13:20:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:21:54.955410057 +0000 UTC m=+944.199569216" watchObservedRunningTime="2025-10-14 13:21:54.959790076 +0000 UTC m=+944.203949255" Oct 14 13:21:55.049397 master-2 kubenswrapper[4762]: I1014 13:21:55.049291 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-668956f9dd-llkhv"] Oct 14 13:21:55.979151 master-2 kubenswrapper[4762]: I1014 13:21:55.979016 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-668956f9dd-llkhv" event={"ID":"5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3","Type":"ContainerStarted","Data":"9c26b92861f7ceb2c1298738b1da256332fd54464d0a2e6cf4ba71ee617daf3e"} Oct 14 13:21:57.369048 master-2 kubenswrapper[4762]: I1014 13:21:57.367694 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" Oct 14 13:21:57.369048 master-2 kubenswrapper[4762]: I1014 13:21:57.367755 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" Oct 14 13:21:57.380257 master-2 kubenswrapper[4762]: I1014 13:21:57.379458 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" Oct 14 13:21:58.001791 master-2 kubenswrapper[4762]: I1014 13:21:58.001743 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" Oct 14 13:21:58.866092 master-2 kubenswrapper[4762]: I1014 13:21:58.866041 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Oct 14 13:21:58.873303 master-2 kubenswrapper[4762]: I1014 13:21:58.872405 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:21:58.877052 master-2 kubenswrapper[4762]: I1014 13:21:58.874672 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy" Oct 14 13:21:58.877052 master-2 kubenswrapper[4762]: I1014 13:21:58.875281 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric" Oct 14 13:21:58.877052 master-2 kubenswrapper[4762]: I1014 13:21:58.875701 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web" Oct 14 13:21:58.877052 master-2 kubenswrapper[4762]: I1014 13:21:58.875739 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config" Oct 14 13:21:58.877052 master-2 kubenswrapper[4762]: I1014 13:21:58.875742 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-dockercfg-z89cl" Oct 14 13:21:58.877052 master-2 kubenswrapper[4762]: I1014 13:21:58.876142 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls" Oct 14 13:21:58.877052 master-2 kubenswrapper[4762]: I1014 13:21:58.876428 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0" Oct 14 13:21:58.882790 master-2 kubenswrapper[4762]: I1014 13:21:58.881802 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated" Oct 14 13:21:58.897624 master-2 kubenswrapper[4762]: I1014 13:21:58.897541 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle" Oct 14 13:21:58.902140 master-2 kubenswrapper[4762]: I1014 13:21:58.902099 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Oct 14 13:21:58.972131 master-2 kubenswrapper[4762]: I1014 13:21:58.972080 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"f3a1ec2c-6bca-4616-8c89-060a6d1593e6\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:21:58.972444 master-2 kubenswrapper[4762]: I1014 13:21:58.972421 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"f3a1ec2c-6bca-4616-8c89-060a6d1593e6\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:21:58.972568 master-2 kubenswrapper[4762]: I1014 13:21:58.972550 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"f3a1ec2c-6bca-4616-8c89-060a6d1593e6\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:21:58.972690 master-2 kubenswrapper[4762]: I1014 13:21:58.972672 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" 
(UniqueName: \"kubernetes.io/secret/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"f3a1ec2c-6bca-4616-8c89-060a6d1593e6\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:21:58.972816 master-2 kubenswrapper[4762]: I1014 13:21:58.972798 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-tls-assets\") pod \"alertmanager-main-0\" (UID: \"f3a1ec2c-6bca-4616-8c89-060a6d1593e6\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:21:58.972937 master-2 kubenswrapper[4762]: I1014 13:21:58.972920 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-web-config\") pod \"alertmanager-main-0\" (UID: \"f3a1ec2c-6bca-4616-8c89-060a6d1593e6\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:21:58.973051 master-2 kubenswrapper[4762]: I1014 13:21:58.973034 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"f3a1ec2c-6bca-4616-8c89-060a6d1593e6\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:21:58.973213 master-2 kubenswrapper[4762]: I1014 13:21:58.973195 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69qvh\" (UniqueName: \"kubernetes.io/projected/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-kube-api-access-69qvh\") pod \"alertmanager-main-0\" (UID: \"f3a1ec2c-6bca-4616-8c89-060a6d1593e6\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:21:58.973398 master-2 kubenswrapper[4762]: I1014 13:21:58.973380 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-config-out\") pod \"alertmanager-main-0\" (UID: \"f3a1ec2c-6bca-4616-8c89-060a6d1593e6\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:21:58.973520 master-2 kubenswrapper[4762]: I1014 13:21:58.973502 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"f3a1ec2c-6bca-4616-8c89-060a6d1593e6\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:21:58.973632 master-2 kubenswrapper[4762]: I1014 13:21:58.973616 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-config-volume\") pod \"alertmanager-main-0\" (UID: \"f3a1ec2c-6bca-4616-8c89-060a6d1593e6\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:21:58.973741 master-2 kubenswrapper[4762]: I1014 13:21:58.973726 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: 
\"f3a1ec2c-6bca-4616-8c89-060a6d1593e6\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:21:59.075149 master-2 kubenswrapper[4762]: I1014 13:21:59.075086 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"f3a1ec2c-6bca-4616-8c89-060a6d1593e6\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:21:59.075149 master-2 kubenswrapper[4762]: I1014 13:21:59.075144 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"f3a1ec2c-6bca-4616-8c89-060a6d1593e6\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:21:59.075395 master-2 kubenswrapper[4762]: I1014 13:21:59.075194 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"f3a1ec2c-6bca-4616-8c89-060a6d1593e6\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:21:59.075395 master-2 kubenswrapper[4762]: I1014 13:21:59.075216 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"f3a1ec2c-6bca-4616-8c89-060a6d1593e6\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:21:59.075395 master-2 kubenswrapper[4762]: I1014 13:21:59.075258 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-tls-assets\") pod \"alertmanager-main-0\" (UID: \"f3a1ec2c-6bca-4616-8c89-060a6d1593e6\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:21:59.075395 master-2 kubenswrapper[4762]: I1014 13:21:59.075295 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-web-config\") pod \"alertmanager-main-0\" (UID: \"f3a1ec2c-6bca-4616-8c89-060a6d1593e6\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:21:59.075395 master-2 kubenswrapper[4762]: I1014 13:21:59.075315 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"f3a1ec2c-6bca-4616-8c89-060a6d1593e6\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:21:59.075395 master-2 kubenswrapper[4762]: I1014 13:21:59.075392 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69qvh\" (UniqueName: \"kubernetes.io/projected/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-kube-api-access-69qvh\") pod \"alertmanager-main-0\" (UID: \"f3a1ec2c-6bca-4616-8c89-060a6d1593e6\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:21:59.075574 master-2 kubenswrapper[4762]: I1014 13:21:59.075414 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-config-out\") pod \"alertmanager-main-0\" (UID: \"f3a1ec2c-6bca-4616-8c89-060a6d1593e6\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:21:59.075574 master-2 kubenswrapper[4762]: I1014 13:21:59.075463 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"f3a1ec2c-6bca-4616-8c89-060a6d1593e6\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:21:59.075574 master-2 kubenswrapper[4762]: I1014 13:21:59.075485 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-config-volume\") pod \"alertmanager-main-0\" (UID: \"f3a1ec2c-6bca-4616-8c89-060a6d1593e6\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:21:59.075574 master-2 kubenswrapper[4762]: I1014 13:21:59.075521 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"f3a1ec2c-6bca-4616-8c89-060a6d1593e6\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:21:59.075890 master-2 kubenswrapper[4762]: I1014 13:21:59.075845 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"f3a1ec2c-6bca-4616-8c89-060a6d1593e6\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:21:59.076725 master-2 kubenswrapper[4762]: I1014 13:21:59.076706 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"f3a1ec2c-6bca-4616-8c89-060a6d1593e6\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:21:59.078957 master-2 kubenswrapper[4762]: I1014 13:21:59.078912 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"f3a1ec2c-6bca-4616-8c89-060a6d1593e6\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:21:59.080235 master-2 kubenswrapper[4762]: I1014 13:21:59.080188 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"f3a1ec2c-6bca-4616-8c89-060a6d1593e6\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:21:59.080572 master-2 kubenswrapper[4762]: I1014 13:21:59.080346 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"f3a1ec2c-6bca-4616-8c89-060a6d1593e6\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 
13:21:59.080572 master-2 kubenswrapper[4762]: I1014 13:21:59.080535 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-config-out\") pod \"alertmanager-main-0\" (UID: \"f3a1ec2c-6bca-4616-8c89-060a6d1593e6\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:21:59.080664 master-2 kubenswrapper[4762]: I1014 13:21:59.080624 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"f3a1ec2c-6bca-4616-8c89-060a6d1593e6\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:21:59.080876 master-2 kubenswrapper[4762]: I1014 13:21:59.080840 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"f3a1ec2c-6bca-4616-8c89-060a6d1593e6\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:21:59.082460 master-2 kubenswrapper[4762]: I1014 13:21:59.082422 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-config-volume\") pod \"alertmanager-main-0\" (UID: \"f3a1ec2c-6bca-4616-8c89-060a6d1593e6\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:21:59.082845 master-2 kubenswrapper[4762]: I1014 13:21:59.082810 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-web-config\") pod \"alertmanager-main-0\" (UID: \"f3a1ec2c-6bca-4616-8c89-060a6d1593e6\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:21:59.083180 master-2 kubenswrapper[4762]: I1014 13:21:59.083122 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-tls-assets\") pod \"alertmanager-main-0\" (UID: \"f3a1ec2c-6bca-4616-8c89-060a6d1593e6\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:21:59.098092 master-2 kubenswrapper[4762]: I1014 13:21:59.098023 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69qvh\" (UniqueName: \"kubernetes.io/projected/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-kube-api-access-69qvh\") pod \"alertmanager-main-0\" (UID: \"f3a1ec2c-6bca-4616-8c89-060a6d1593e6\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:21:59.182582 master-2 kubenswrapper[4762]: I1014 13:21:59.182523 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-668956f9dd-llkhv"] Oct 14 13:21:59.202713 master-2 kubenswrapper[4762]: I1014 13:21:59.202661 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:21:59.670287 master-2 kubenswrapper[4762]: I1014 13:21:59.670249 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Oct 14 13:21:59.675412 master-2 kubenswrapper[4762]: W1014 13:21:59.675362 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3a1ec2c_6bca_4616_8c89_060a6d1593e6.slice/crio-90b247384c26b9abe62364552eb19ccbefce9f75b9a58d7f5d2f77d11d6ab6ca WatchSource:0}: Error finding container 90b247384c26b9abe62364552eb19ccbefce9f75b9a58d7f5d2f77d11d6ab6ca: Status 404 returned error can't find the container with id 90b247384c26b9abe62364552eb19ccbefce9f75b9a58d7f5d2f77d11d6ab6ca Oct 14 13:21:59.882216 master-2 kubenswrapper[4762]: I1014 13:21:59.882125 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-cc99494f6-kmmxc"] Oct 14 13:21:59.884827 master-2 kubenswrapper[4762]: I1014 13:21:59.884793 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-cc99494f6-kmmxc" Oct 14 13:21:59.887896 master-2 kubenswrapper[4762]: I1014 13:21:59.887847 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy" Oct 14 13:21:59.888067 master-2 kubenswrapper[4762]: I1014 13:21:59.887920 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-8otna1nr4bh0o" Oct 14 13:21:59.888141 master-2 kubenswrapper[4762]: I1014 13:21:59.888118 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics" Oct 14 13:21:59.888239 master-2 kubenswrapper[4762]: I1014 13:21:59.888130 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-dockercfg-zf6rs" Oct 14 13:21:59.888388 master-2 kubenswrapper[4762]: I1014 13:21:59.888356 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls" Oct 14 13:21:59.888388 master-2 kubenswrapper[4762]: I1014 13:21:59.888389 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web" Oct 14 13:21:59.888560 master-2 kubenswrapper[4762]: I1014 13:21:59.888531 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules" Oct 14 13:21:59.904039 master-2 kubenswrapper[4762]: I1014 13:21:59.902868 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-cc99494f6-kmmxc"] Oct 14 13:22:00.001859 master-2 kubenswrapper[4762]: I1014 13:22:00.001811 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/12625659-53e4-4ae4-837b-f5178cfe2681-metrics-client-ca\") pod \"thanos-querier-cc99494f6-kmmxc\" (UID: \"12625659-53e4-4ae4-837b-f5178cfe2681\") " pod="openshift-monitoring/thanos-querier-cc99494f6-kmmxc" Oct 14 13:22:00.002088 master-2 kubenswrapper[4762]: I1014 13:22:00.001882 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/12625659-53e4-4ae4-837b-f5178cfe2681-secret-thanos-querier-tls\") pod 
\"thanos-querier-cc99494f6-kmmxc\" (UID: \"12625659-53e4-4ae4-837b-f5178cfe2681\") " pod="openshift-monitoring/thanos-querier-cc99494f6-kmmxc" Oct 14 13:22:00.002088 master-2 kubenswrapper[4762]: I1014 13:22:00.001907 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/12625659-53e4-4ae4-837b-f5178cfe2681-secret-grpc-tls\") pod \"thanos-querier-cc99494f6-kmmxc\" (UID: \"12625659-53e4-4ae4-837b-f5178cfe2681\") " pod="openshift-monitoring/thanos-querier-cc99494f6-kmmxc" Oct 14 13:22:00.002088 master-2 kubenswrapper[4762]: I1014 13:22:00.002044 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpckq\" (UniqueName: \"kubernetes.io/projected/12625659-53e4-4ae4-837b-f5178cfe2681-kube-api-access-fpckq\") pod \"thanos-querier-cc99494f6-kmmxc\" (UID: \"12625659-53e4-4ae4-837b-f5178cfe2681\") " pod="openshift-monitoring/thanos-querier-cc99494f6-kmmxc" Oct 14 13:22:00.002249 master-2 kubenswrapper[4762]: I1014 13:22:00.002114 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/12625659-53e4-4ae4-837b-f5178cfe2681-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-cc99494f6-kmmxc\" (UID: \"12625659-53e4-4ae4-837b-f5178cfe2681\") " pod="openshift-monitoring/thanos-querier-cc99494f6-kmmxc" Oct 14 13:22:00.002249 master-2 kubenswrapper[4762]: I1014 13:22:00.002186 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/12625659-53e4-4ae4-837b-f5178cfe2681-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-cc99494f6-kmmxc\" (UID: \"12625659-53e4-4ae4-837b-f5178cfe2681\") " pod="openshift-monitoring/thanos-querier-cc99494f6-kmmxc" Oct 14 13:22:00.002333 master-2 kubenswrapper[4762]: I1014 13:22:00.002258 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/12625659-53e4-4ae4-837b-f5178cfe2681-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-cc99494f6-kmmxc\" (UID: \"12625659-53e4-4ae4-837b-f5178cfe2681\") " pod="openshift-monitoring/thanos-querier-cc99494f6-kmmxc" Oct 14 13:22:00.002333 master-2 kubenswrapper[4762]: I1014 13:22:00.002302 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/12625659-53e4-4ae4-837b-f5178cfe2681-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-cc99494f6-kmmxc\" (UID: \"12625659-53e4-4ae4-837b-f5178cfe2681\") " pod="openshift-monitoring/thanos-querier-cc99494f6-kmmxc" Oct 14 13:22:00.009237 master-2 kubenswrapper[4762]: I1014 13:22:00.009192 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-668956f9dd-llkhv" event={"ID":"5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3","Type":"ContainerStarted","Data":"7129000b66a3dca2c1c6b420d9199bde8dc73c1692662b1ef971cd750d195b34"} Oct 14 13:22:00.012190 master-2 kubenswrapper[4762]: I1014 13:22:00.012129 4762 generic.go:334] "Generic (PLEG): container finished" podID="f3a1ec2c-6bca-4616-8c89-060a6d1593e6" 
containerID="44694c42e3f2d02364f1f7f5a782414ce6e55bb7d25db37e11db100003478ae5" exitCode=0 Oct 14 13:22:00.012610 master-2 kubenswrapper[4762]: I1014 13:22:00.012203 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f3a1ec2c-6bca-4616-8c89-060a6d1593e6","Type":"ContainerDied","Data":"44694c42e3f2d02364f1f7f5a782414ce6e55bb7d25db37e11db100003478ae5"} Oct 14 13:22:00.012610 master-2 kubenswrapper[4762]: I1014 13:22:00.012238 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f3a1ec2c-6bca-4616-8c89-060a6d1593e6","Type":"ContainerStarted","Data":"90b247384c26b9abe62364552eb19ccbefce9f75b9a58d7f5d2f77d11d6ab6ca"} Oct 14 13:22:00.034090 master-2 kubenswrapper[4762]: I1014 13:22:00.033741 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-668956f9dd-llkhv" podStartSLOduration=1.836885061 podStartE2EDuration="6.033720195s" podCreationTimestamp="2025-10-14 13:21:54 +0000 UTC" firstStartedPulling="2025-10-14 13:21:55.056919152 +0000 UTC m=+944.301078311" lastFinishedPulling="2025-10-14 13:21:59.253754286 +0000 UTC m=+948.497913445" observedRunningTime="2025-10-14 13:22:00.032045622 +0000 UTC m=+949.276204791" watchObservedRunningTime="2025-10-14 13:22:00.033720195 +0000 UTC m=+949.277879354" Oct 14 13:22:00.103378 master-2 kubenswrapper[4762]: I1014 13:22:00.103320 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/12625659-53e4-4ae4-837b-f5178cfe2681-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-cc99494f6-kmmxc\" (UID: \"12625659-53e4-4ae4-837b-f5178cfe2681\") " pod="openshift-monitoring/thanos-querier-cc99494f6-kmmxc" Oct 14 13:22:00.103580 master-2 kubenswrapper[4762]: I1014 13:22:00.103385 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/12625659-53e4-4ae4-837b-f5178cfe2681-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-cc99494f6-kmmxc\" (UID: \"12625659-53e4-4ae4-837b-f5178cfe2681\") " pod="openshift-monitoring/thanos-querier-cc99494f6-kmmxc" Oct 14 13:22:00.103580 master-2 kubenswrapper[4762]: I1014 13:22:00.103483 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/12625659-53e4-4ae4-837b-f5178cfe2681-metrics-client-ca\") pod \"thanos-querier-cc99494f6-kmmxc\" (UID: \"12625659-53e4-4ae4-837b-f5178cfe2681\") " pod="openshift-monitoring/thanos-querier-cc99494f6-kmmxc" Oct 14 13:22:00.103580 master-2 kubenswrapper[4762]: I1014 13:22:00.103543 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/12625659-53e4-4ae4-837b-f5178cfe2681-secret-thanos-querier-tls\") pod \"thanos-querier-cc99494f6-kmmxc\" (UID: \"12625659-53e4-4ae4-837b-f5178cfe2681\") " pod="openshift-monitoring/thanos-querier-cc99494f6-kmmxc" Oct 14 13:22:00.103580 master-2 kubenswrapper[4762]: I1014 13:22:00.103565 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/12625659-53e4-4ae4-837b-f5178cfe2681-secret-grpc-tls\") pod \"thanos-querier-cc99494f6-kmmxc\" (UID: \"12625659-53e4-4ae4-837b-f5178cfe2681\") " 
pod="openshift-monitoring/thanos-querier-cc99494f6-kmmxc" Oct 14 13:22:00.103710 master-2 kubenswrapper[4762]: I1014 13:22:00.103592 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpckq\" (UniqueName: \"kubernetes.io/projected/12625659-53e4-4ae4-837b-f5178cfe2681-kube-api-access-fpckq\") pod \"thanos-querier-cc99494f6-kmmxc\" (UID: \"12625659-53e4-4ae4-837b-f5178cfe2681\") " pod="openshift-monitoring/thanos-querier-cc99494f6-kmmxc" Oct 14 13:22:00.103710 master-2 kubenswrapper[4762]: I1014 13:22:00.103618 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/12625659-53e4-4ae4-837b-f5178cfe2681-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-cc99494f6-kmmxc\" (UID: \"12625659-53e4-4ae4-837b-f5178cfe2681\") " pod="openshift-monitoring/thanos-querier-cc99494f6-kmmxc" Oct 14 13:22:00.103710 master-2 kubenswrapper[4762]: I1014 13:22:00.103662 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/12625659-53e4-4ae4-837b-f5178cfe2681-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-cc99494f6-kmmxc\" (UID: \"12625659-53e4-4ae4-837b-f5178cfe2681\") " pod="openshift-monitoring/thanos-querier-cc99494f6-kmmxc" Oct 14 13:22:00.106046 master-2 kubenswrapper[4762]: I1014 13:22:00.105587 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/12625659-53e4-4ae4-837b-f5178cfe2681-metrics-client-ca\") pod \"thanos-querier-cc99494f6-kmmxc\" (UID: \"12625659-53e4-4ae4-837b-f5178cfe2681\") " pod="openshift-monitoring/thanos-querier-cc99494f6-kmmxc" Oct 14 13:22:00.107512 master-2 kubenswrapper[4762]: I1014 13:22:00.107477 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/12625659-53e4-4ae4-837b-f5178cfe2681-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-cc99494f6-kmmxc\" (UID: \"12625659-53e4-4ae4-837b-f5178cfe2681\") " pod="openshift-monitoring/thanos-querier-cc99494f6-kmmxc" Oct 14 13:22:00.107918 master-2 kubenswrapper[4762]: I1014 13:22:00.107881 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/12625659-53e4-4ae4-837b-f5178cfe2681-secret-grpc-tls\") pod \"thanos-querier-cc99494f6-kmmxc\" (UID: \"12625659-53e4-4ae4-837b-f5178cfe2681\") " pod="openshift-monitoring/thanos-querier-cc99494f6-kmmxc" Oct 14 13:22:00.108410 master-2 kubenswrapper[4762]: I1014 13:22:00.108301 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/12625659-53e4-4ae4-837b-f5178cfe2681-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-cc99494f6-kmmxc\" (UID: \"12625659-53e4-4ae4-837b-f5178cfe2681\") " pod="openshift-monitoring/thanos-querier-cc99494f6-kmmxc" Oct 14 13:22:00.108893 master-2 kubenswrapper[4762]: I1014 13:22:00.108855 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/12625659-53e4-4ae4-837b-f5178cfe2681-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-cc99494f6-kmmxc\" (UID: 
\"12625659-53e4-4ae4-837b-f5178cfe2681\") " pod="openshift-monitoring/thanos-querier-cc99494f6-kmmxc" Oct 14 13:22:00.109310 master-2 kubenswrapper[4762]: I1014 13:22:00.109273 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/12625659-53e4-4ae4-837b-f5178cfe2681-secret-thanos-querier-tls\") pod \"thanos-querier-cc99494f6-kmmxc\" (UID: \"12625659-53e4-4ae4-837b-f5178cfe2681\") " pod="openshift-monitoring/thanos-querier-cc99494f6-kmmxc" Oct 14 13:22:00.109934 master-2 kubenswrapper[4762]: I1014 13:22:00.109902 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/12625659-53e4-4ae4-837b-f5178cfe2681-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-cc99494f6-kmmxc\" (UID: \"12625659-53e4-4ae4-837b-f5178cfe2681\") " pod="openshift-monitoring/thanos-querier-cc99494f6-kmmxc" Oct 14 13:22:00.126895 master-2 kubenswrapper[4762]: I1014 13:22:00.126679 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpckq\" (UniqueName: \"kubernetes.io/projected/12625659-53e4-4ae4-837b-f5178cfe2681-kube-api-access-fpckq\") pod \"thanos-querier-cc99494f6-kmmxc\" (UID: \"12625659-53e4-4ae4-837b-f5178cfe2681\") " pod="openshift-monitoring/thanos-querier-cc99494f6-kmmxc" Oct 14 13:22:00.236770 master-2 kubenswrapper[4762]: I1014 13:22:00.236664 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-cc99494f6-kmmxc" Oct 14 13:22:00.647099 master-2 kubenswrapper[4762]: I1014 13:22:00.647028 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-cc99494f6-kmmxc"] Oct 14 13:22:00.652828 master-2 kubenswrapper[4762]: W1014 13:22:00.652757 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12625659_53e4_4ae4_837b_f5178cfe2681.slice/crio-fd0da005fc6e23e16bc349c51541b66ed8a70d8161f8f8121a2676ed95ecd962 WatchSource:0}: Error finding container fd0da005fc6e23e16bc349c51541b66ed8a70d8161f8f8121a2676ed95ecd962: Status 404 returned error can't find the container with id fd0da005fc6e23e16bc349c51541b66ed8a70d8161f8f8121a2676ed95ecd962 Oct 14 13:22:01.027595 master-2 kubenswrapper[4762]: I1014 13:22:01.027538 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-cc99494f6-kmmxc" event={"ID":"12625659-53e4-4ae4-837b-f5178cfe2681","Type":"ContainerStarted","Data":"fd0da005fc6e23e16bc349c51541b66ed8a70d8161f8f8121a2676ed95ecd962"} Oct 14 13:22:01.281399 master-2 kubenswrapper[4762]: I1014 13:22:01.281259 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/revision-pruner-6-master-2"] Oct 14 13:22:01.282489 master-2 kubenswrapper[4762]: I1014 13:22:01.282442 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/revision-pruner-6-master-2" Oct 14 13:22:01.285615 master-2 kubenswrapper[4762]: I1014 13:22:01.285559 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler"/"installer-sa-dockercfg-bm6wx" Oct 14 13:22:01.285827 master-2 kubenswrapper[4762]: I1014 13:22:01.285782 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/revision-pruner-6-master-2"] Oct 14 13:22:01.424102 master-2 kubenswrapper[4762]: I1014 13:22:01.423941 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eddc1afe-8608-4390-89a1-2cf9b4e772ee-kubelet-dir\") pod \"revision-pruner-6-master-2\" (UID: \"eddc1afe-8608-4390-89a1-2cf9b4e772ee\") " pod="openshift-kube-scheduler/revision-pruner-6-master-2" Oct 14 13:22:01.424102 master-2 kubenswrapper[4762]: I1014 13:22:01.423992 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eddc1afe-8608-4390-89a1-2cf9b4e772ee-kube-api-access\") pod \"revision-pruner-6-master-2\" (UID: \"eddc1afe-8608-4390-89a1-2cf9b4e772ee\") " pod="openshift-kube-scheduler/revision-pruner-6-master-2" Oct 14 13:22:01.525100 master-2 kubenswrapper[4762]: I1014 13:22:01.525045 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eddc1afe-8608-4390-89a1-2cf9b4e772ee-kubelet-dir\") pod \"revision-pruner-6-master-2\" (UID: \"eddc1afe-8608-4390-89a1-2cf9b4e772ee\") " pod="openshift-kube-scheduler/revision-pruner-6-master-2" Oct 14 13:22:01.525100 master-2 kubenswrapper[4762]: I1014 13:22:01.525101 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eddc1afe-8608-4390-89a1-2cf9b4e772ee-kube-api-access\") pod \"revision-pruner-6-master-2\" (UID: \"eddc1afe-8608-4390-89a1-2cf9b4e772ee\") " pod="openshift-kube-scheduler/revision-pruner-6-master-2" Oct 14 13:22:01.525360 master-2 kubenswrapper[4762]: I1014 13:22:01.525234 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eddc1afe-8608-4390-89a1-2cf9b4e772ee-kubelet-dir\") pod \"revision-pruner-6-master-2\" (UID: \"eddc1afe-8608-4390-89a1-2cf9b4e772ee\") " pod="openshift-kube-scheduler/revision-pruner-6-master-2" Oct 14 13:22:01.546887 master-2 kubenswrapper[4762]: I1014 13:22:01.546774 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eddc1afe-8608-4390-89a1-2cf9b4e772ee-kube-api-access\") pod \"revision-pruner-6-master-2\" (UID: \"eddc1afe-8608-4390-89a1-2cf9b4e772ee\") " pod="openshift-kube-scheduler/revision-pruner-6-master-2" Oct 14 13:22:01.614307 master-2 kubenswrapper[4762]: I1014 13:22:01.614239 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/revision-pruner-6-master-2" Oct 14 13:22:02.337369 master-2 kubenswrapper[4762]: I1014 13:22:02.337313 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/revision-pruner-6-master-2"] Oct 14 13:22:03.047320 master-2 kubenswrapper[4762]: I1014 13:22:03.047263 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f3a1ec2c-6bca-4616-8c89-060a6d1593e6","Type":"ContainerStarted","Data":"9af3295f31acd2a582c426fd406751a3b6bd4feaff9d8d9bc819368479541066"} Oct 14 13:22:03.047320 master-2 kubenswrapper[4762]: I1014 13:22:03.047311 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f3a1ec2c-6bca-4616-8c89-060a6d1593e6","Type":"ContainerStarted","Data":"6c278f0bb3a105138295da1accc27fd02f8142c5b30433066f034a0761598653"} Oct 14 13:22:04.059808 master-2 kubenswrapper[4762]: I1014 13:22:04.059649 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f3a1ec2c-6bca-4616-8c89-060a6d1593e6","Type":"ContainerStarted","Data":"db6d3f963bf9a1ae70b91164570d0048d23e5436bd0c85da5957762466492b3c"} Oct 14 13:22:04.059808 master-2 kubenswrapper[4762]: I1014 13:22:04.059692 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f3a1ec2c-6bca-4616-8c89-060a6d1593e6","Type":"ContainerStarted","Data":"69571f2f9f9d8f5979fb43994b5a39a24ccfd93bb03da6f14031d94d7957bd10"} Oct 14 13:22:04.061485 master-2 kubenswrapper[4762]: I1014 13:22:04.061410 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/revision-pruner-6-master-2" event={"ID":"eddc1afe-8608-4390-89a1-2cf9b4e772ee","Type":"ContainerStarted","Data":"f1cec86cee3d6c41f35e42c4b86169486166829e3b5aed6017e8691cde08c9dc"} Oct 14 13:22:04.061485 master-2 kubenswrapper[4762]: I1014 13:22:04.061459 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/revision-pruner-6-master-2" event={"ID":"eddc1afe-8608-4390-89a1-2cf9b4e772ee","Type":"ContainerStarted","Data":"7f512a7298dc24097ca3a8943ec576c0d358368dbae66d16318abeacb8135c10"} Oct 14 13:22:04.633463 master-2 kubenswrapper[4762]: I1014 13:22:04.633312 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-668956f9dd-llkhv" Oct 14 13:22:04.734611 master-2 kubenswrapper[4762]: I1014 13:22:04.734484 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/revision-pruner-6-master-2" podStartSLOduration=3.734414683 podStartE2EDuration="3.734414683s" podCreationTimestamp="2025-10-14 13:22:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:22:04.73120636 +0000 UTC m=+953.975365529" watchObservedRunningTime="2025-10-14 13:22:04.734414683 +0000 UTC m=+953.978573842" Oct 14 13:22:05.070769 master-2 kubenswrapper[4762]: I1014 13:22:05.068022 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-cc99494f6-kmmxc" event={"ID":"12625659-53e4-4ae4-837b-f5178cfe2681","Type":"ContainerStarted","Data":"65ab73cff95c4d0c3bcb634851a61e62f35a669b1af8c5f450e7184ee9d6e675"} Oct 14 13:22:05.073426 master-2 kubenswrapper[4762]: I1014 13:22:05.072436 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f3a1ec2c-6bca-4616-8c89-060a6d1593e6","Type":"ContainerStarted","Data":"5869b13a06e6ed26789a9c602e58de4c8d5400e673f1c19a805743b82544399c"} Oct 14 13:22:05.077462 master-2 kubenswrapper[4762]: I1014 13:22:05.077418 4762 generic.go:334] "Generic (PLEG): container finished" podID="eddc1afe-8608-4390-89a1-2cf9b4e772ee" containerID="f1cec86cee3d6c41f35e42c4b86169486166829e3b5aed6017e8691cde08c9dc" exitCode=0 Oct 14 13:22:05.077462 master-2 kubenswrapper[4762]: I1014 13:22:05.077458 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/revision-pruner-6-master-2" event={"ID":"eddc1afe-8608-4390-89a1-2cf9b4e772ee","Type":"ContainerDied","Data":"f1cec86cee3d6c41f35e42c4b86169486166829e3b5aed6017e8691cde08c9dc"} Oct 14 13:22:05.377412 master-2 kubenswrapper[4762]: I1014 13:22:05.377331 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Oct 14 13:22:05.379603 master-2 kubenswrapper[4762]: I1014 13:22:05.379554 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:22:05.383624 master-2 kubenswrapper[4762]: I1014 13:22:05.383590 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web" Oct 14 13:22:05.386918 master-2 kubenswrapper[4762]: I1014 13:22:05.386775 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle" Oct 14 13:22:05.387515 master-2 kubenswrapper[4762]: I1014 13:22:05.387466 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s" Oct 14 13:22:05.387615 master-2 kubenswrapper[4762]: I1014 13:22:05.387520 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls" Oct 14 13:22:05.387679 master-2 kubenswrapper[4762]: I1014 13:22:05.387657 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-dockercfg-dzg65" Oct 14 13:22:05.388004 master-2 kubenswrapper[4762]: I1014 13:22:05.387980 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-8klgi7r2728qp" Oct 14 13:22:05.389484 master-2 kubenswrapper[4762]: I1014 13:22:05.389467 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy" Oct 14 13:22:05.392586 master-2 kubenswrapper[4762]: I1014 13:22:05.392403 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls" Oct 14 13:22:05.392586 master-2 kubenswrapper[4762]: I1014 13:22:05.392579 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file" Oct 14 13:22:05.392760 master-2 kubenswrapper[4762]: I1014 13:22:05.392740 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0" Oct 14 13:22:05.393293 master-2 kubenswrapper[4762]: I1014 13:22:05.393261 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config" Oct 14 13:22:05.395974 master-2 kubenswrapper[4762]: I1014 13:22:05.395935 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0" Oct 14 13:22:05.397731 
master-2 kubenswrapper[4762]: I1014 13:22:05.397704 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle" Oct 14 13:22:05.420823 master-2 kubenswrapper[4762]: I1014 13:22:05.420744 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Oct 14 13:22:05.475612 master-2 kubenswrapper[4762]: I1014 13:22:05.475560 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d766696a-7ad4-4921-b799-c65f51b60109-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:22:05.475612 master-2 kubenswrapper[4762]: I1014 13:22:05.475615 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9xgk\" (UniqueName: \"kubernetes.io/projected/d766696a-7ad4-4921-b799-c65f51b60109-kube-api-access-c9xgk\") pod \"prometheus-k8s-0\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:22:05.475912 master-2 kubenswrapper[4762]: I1014 13:22:05.475651 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/d766696a-7ad4-4921-b799-c65f51b60109-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:22:05.475912 master-2 kubenswrapper[4762]: I1014 13:22:05.475679 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d766696a-7ad4-4921-b799-c65f51b60109-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:22:05.475912 master-2 kubenswrapper[4762]: I1014 13:22:05.475704 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d766696a-7ad4-4921-b799-c65f51b60109-config\") pod \"prometheus-k8s-0\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:22:05.475912 master-2 kubenswrapper[4762]: I1014 13:22:05.475726 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/d766696a-7ad4-4921-b799-c65f51b60109-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:22:05.475912 master-2 kubenswrapper[4762]: I1014 13:22:05.475742 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d766696a-7ad4-4921-b799-c65f51b60109-config-out\") pod \"prometheus-k8s-0\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:22:05.475912 master-2 kubenswrapper[4762]: I1014 13:22:05.475758 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/d766696a-7ad4-4921-b799-c65f51b60109-secret-metrics-client-certs\") 
pod \"prometheus-k8s-0\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:22:05.475912 master-2 kubenswrapper[4762]: I1014 13:22:05.475780 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d766696a-7ad4-4921-b799-c65f51b60109-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:22:05.475912 master-2 kubenswrapper[4762]: I1014 13:22:05.475808 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d766696a-7ad4-4921-b799-c65f51b60109-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:22:05.475912 master-2 kubenswrapper[4762]: I1014 13:22:05.475832 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d766696a-7ad4-4921-b799-c65f51b60109-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:22:05.475912 master-2 kubenswrapper[4762]: I1014 13:22:05.475851 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d766696a-7ad4-4921-b799-c65f51b60109-web-config\") pod \"prometheus-k8s-0\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:22:05.475912 master-2 kubenswrapper[4762]: I1014 13:22:05.475911 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/d766696a-7ad4-4921-b799-c65f51b60109-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:22:05.476921 master-2 kubenswrapper[4762]: I1014 13:22:05.475937 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d766696a-7ad4-4921-b799-c65f51b60109-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:22:05.476921 master-2 kubenswrapper[4762]: I1014 13:22:05.475958 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d766696a-7ad4-4921-b799-c65f51b60109-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:22:05.476921 master-2 kubenswrapper[4762]: I1014 13:22:05.475975 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d766696a-7ad4-4921-b799-c65f51b60109-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " 
pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:22:05.476921 master-2 kubenswrapper[4762]: I1014 13:22:05.475998 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d766696a-7ad4-4921-b799-c65f51b60109-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:22:05.476921 master-2 kubenswrapper[4762]: I1014 13:22:05.476013 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d766696a-7ad4-4921-b799-c65f51b60109-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:22:05.576998 master-2 kubenswrapper[4762]: I1014 13:22:05.576913 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d766696a-7ad4-4921-b799-c65f51b60109-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:22:05.576998 master-2 kubenswrapper[4762]: I1014 13:22:05.576986 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9xgk\" (UniqueName: \"kubernetes.io/projected/d766696a-7ad4-4921-b799-c65f51b60109-kube-api-access-c9xgk\") pod \"prometheus-k8s-0\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:22:05.577350 master-2 kubenswrapper[4762]: I1014 13:22:05.577014 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/d766696a-7ad4-4921-b799-c65f51b60109-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:22:05.577350 master-2 kubenswrapper[4762]: I1014 13:22:05.577038 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d766696a-7ad4-4921-b799-c65f51b60109-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:22:05.577350 master-2 kubenswrapper[4762]: I1014 13:22:05.577062 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d766696a-7ad4-4921-b799-c65f51b60109-config\") pod \"prometheus-k8s-0\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:22:05.577350 master-2 kubenswrapper[4762]: I1014 13:22:05.577082 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/d766696a-7ad4-4921-b799-c65f51b60109-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:22:05.577350 master-2 kubenswrapper[4762]: I1014 13:22:05.577101 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d766696a-7ad4-4921-b799-c65f51b60109-config-out\") pod 
\"prometheus-k8s-0\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:22:05.577350 master-2 kubenswrapper[4762]: I1014 13:22:05.577119 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/d766696a-7ad4-4921-b799-c65f51b60109-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:22:05.577350 master-2 kubenswrapper[4762]: I1014 13:22:05.577139 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d766696a-7ad4-4921-b799-c65f51b60109-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:22:05.577350 master-2 kubenswrapper[4762]: I1014 13:22:05.577189 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d766696a-7ad4-4921-b799-c65f51b60109-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:22:05.577350 master-2 kubenswrapper[4762]: I1014 13:22:05.577218 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d766696a-7ad4-4921-b799-c65f51b60109-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:22:05.577350 master-2 kubenswrapper[4762]: I1014 13:22:05.577242 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d766696a-7ad4-4921-b799-c65f51b60109-web-config\") pod \"prometheus-k8s-0\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:22:05.577350 master-2 kubenswrapper[4762]: I1014 13:22:05.577302 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/d766696a-7ad4-4921-b799-c65f51b60109-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:22:05.577350 master-2 kubenswrapper[4762]: I1014 13:22:05.577328 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d766696a-7ad4-4921-b799-c65f51b60109-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:22:05.577350 master-2 kubenswrapper[4762]: I1014 13:22:05.577353 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d766696a-7ad4-4921-b799-c65f51b60109-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:22:05.580053 master-2 kubenswrapper[4762]: I1014 13:22:05.577375 4762 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d766696a-7ad4-4921-b799-c65f51b60109-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:22:05.580053 master-2 kubenswrapper[4762]: I1014 13:22:05.577402 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d766696a-7ad4-4921-b799-c65f51b60109-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:22:05.580053 master-2 kubenswrapper[4762]: I1014 13:22:05.577426 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d766696a-7ad4-4921-b799-c65f51b60109-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:22:05.580053 master-2 kubenswrapper[4762]: I1014 13:22:05.579090 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/d766696a-7ad4-4921-b799-c65f51b60109-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:22:05.580053 master-2 kubenswrapper[4762]: I1014 13:22:05.579420 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d766696a-7ad4-4921-b799-c65f51b60109-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:22:05.582431 master-2 kubenswrapper[4762]: I1014 13:22:05.581269 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d766696a-7ad4-4921-b799-c65f51b60109-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:22:05.582431 master-2 kubenswrapper[4762]: I1014 13:22:05.581499 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d766696a-7ad4-4921-b799-c65f51b60109-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:22:05.582431 master-2 kubenswrapper[4762]: I1014 13:22:05.581929 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d766696a-7ad4-4921-b799-c65f51b60109-config-out\") pod \"prometheus-k8s-0\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:22:05.582979 master-2 kubenswrapper[4762]: I1014 13:22:05.582578 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d766696a-7ad4-4921-b799-c65f51b60109-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:22:05.582979 master-2 kubenswrapper[4762]: 
I1014 13:22:05.582600 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d766696a-7ad4-4921-b799-c65f51b60109-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:22:05.588570 master-2 kubenswrapper[4762]: I1014 13:22:05.588270 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d766696a-7ad4-4921-b799-c65f51b60109-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:22:05.590597 master-2 kubenswrapper[4762]: I1014 13:22:05.589609 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d766696a-7ad4-4921-b799-c65f51b60109-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:22:05.593716 master-2 kubenswrapper[4762]: I1014 13:22:05.593680 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d766696a-7ad4-4921-b799-c65f51b60109-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:22:05.593853 master-2 kubenswrapper[4762]: I1014 13:22:05.593785 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d766696a-7ad4-4921-b799-c65f51b60109-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:22:05.593970 master-2 kubenswrapper[4762]: I1014 13:22:05.593945 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/d766696a-7ad4-4921-b799-c65f51b60109-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:22:05.594019 master-2 kubenswrapper[4762]: I1014 13:22:05.593967 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/d766696a-7ad4-4921-b799-c65f51b60109-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:22:05.594056 master-2 kubenswrapper[4762]: I1014 13:22:05.594022 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/d766696a-7ad4-4921-b799-c65f51b60109-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:22:05.595474 master-2 kubenswrapper[4762]: I1014 13:22:05.595435 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d766696a-7ad4-4921-b799-c65f51b60109-config\") pod \"prometheus-k8s-0\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " 
pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:22:05.595571 master-2 kubenswrapper[4762]: I1014 13:22:05.595442 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d766696a-7ad4-4921-b799-c65f51b60109-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:22:05.596567 master-2 kubenswrapper[4762]: I1014 13:22:05.596455 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d766696a-7ad4-4921-b799-c65f51b60109-web-config\") pod \"prometheus-k8s-0\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:22:05.600343 master-2 kubenswrapper[4762]: I1014 13:22:05.600249 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9xgk\" (UniqueName: \"kubernetes.io/projected/d766696a-7ad4-4921-b799-c65f51b60109-kube-api-access-c9xgk\") pod \"prometheus-k8s-0\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:22:05.696418 master-2 kubenswrapper[4762]: I1014 13:22:05.696284 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:22:06.085056 master-2 kubenswrapper[4762]: I1014 13:22:06.084996 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-cc99494f6-kmmxc" event={"ID":"12625659-53e4-4ae4-837b-f5178cfe2681","Type":"ContainerStarted","Data":"a15f96a3db433f2697c5f19aaebe804fd9963859838d8bfad53372f31f9e3f1a"} Oct 14 13:22:06.085511 master-2 kubenswrapper[4762]: I1014 13:22:06.085063 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-cc99494f6-kmmxc" event={"ID":"12625659-53e4-4ae4-837b-f5178cfe2681","Type":"ContainerStarted","Data":"8905e8ae0591cd757161ec28fa200f7f93f61f702396de51658a2f774b0cad60"} Oct 14 13:22:06.337944 master-2 kubenswrapper[4762]: I1014 13:22:06.337907 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Oct 14 13:22:06.341366 master-2 kubenswrapper[4762]: W1014 13:22:06.341309 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd766696a_7ad4_4921_b799_c65f51b60109.slice/crio-5ae154f5dd833b785fba37d75186a43bfdf1782fe12b291a92538dd025614985 WatchSource:0}: Error finding container 5ae154f5dd833b785fba37d75186a43bfdf1782fe12b291a92538dd025614985: Status 404 returned error can't find the container with id 5ae154f5dd833b785fba37d75186a43bfdf1782fe12b291a92538dd025614985 Oct 14 13:22:06.369904 master-2 kubenswrapper[4762]: I1014 13:22:06.369864 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/revision-pruner-6-master-2" Oct 14 13:22:06.491255 master-2 kubenswrapper[4762]: I1014 13:22:06.491198 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eddc1afe-8608-4390-89a1-2cf9b4e772ee-kube-api-access\") pod \"eddc1afe-8608-4390-89a1-2cf9b4e772ee\" (UID: \"eddc1afe-8608-4390-89a1-2cf9b4e772ee\") " Oct 14 13:22:06.491457 master-2 kubenswrapper[4762]: I1014 13:22:06.491268 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eddc1afe-8608-4390-89a1-2cf9b4e772ee-kubelet-dir\") pod \"eddc1afe-8608-4390-89a1-2cf9b4e772ee\" (UID: \"eddc1afe-8608-4390-89a1-2cf9b4e772ee\") " Oct 14 13:22:06.491627 master-2 kubenswrapper[4762]: I1014 13:22:06.491582 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eddc1afe-8608-4390-89a1-2cf9b4e772ee-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "eddc1afe-8608-4390-89a1-2cf9b4e772ee" (UID: "eddc1afe-8608-4390-89a1-2cf9b4e772ee"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:22:06.517028 master-2 kubenswrapper[4762]: I1014 13:22:06.516966 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eddc1afe-8608-4390-89a1-2cf9b4e772ee-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "eddc1afe-8608-4390-89a1-2cf9b4e772ee" (UID: "eddc1afe-8608-4390-89a1-2cf9b4e772ee"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:22:06.593183 master-2 kubenswrapper[4762]: I1014 13:22:06.593114 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eddc1afe-8608-4390-89a1-2cf9b4e772ee-kube-api-access\") on node \"master-2\" DevicePath \"\"" Oct 14 13:22:06.593183 master-2 kubenswrapper[4762]: I1014 13:22:06.593166 4762 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eddc1afe-8608-4390-89a1-2cf9b4e772ee-kubelet-dir\") on node \"master-2\" DevicePath \"\"" Oct 14 13:22:06.753173 master-2 kubenswrapper[4762]: I1014 13:22:06.752893 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-r4wbf"] Oct 14 13:22:06.754129 master-2 kubenswrapper[4762]: E1014 13:22:06.753188 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eddc1afe-8608-4390-89a1-2cf9b4e772ee" containerName="pruner" Oct 14 13:22:06.754129 master-2 kubenswrapper[4762]: I1014 13:22:06.753205 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="eddc1afe-8608-4390-89a1-2cf9b4e772ee" containerName="pruner" Oct 14 13:22:06.754129 master-2 kubenswrapper[4762]: I1014 13:22:06.753307 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="eddc1afe-8608-4390-89a1-2cf9b4e772ee" containerName="pruner" Oct 14 13:22:06.754340 master-2 kubenswrapper[4762]: I1014 13:22:06.754287 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-r4wbf" Oct 14 13:22:06.768842 master-2 kubenswrapper[4762]: I1014 13:22:06.768795 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r4wbf"] Oct 14 13:22:06.795758 master-2 kubenswrapper[4762]: I1014 13:22:06.795699 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb0ff895-b205-40a6-9415-3160ea26c19c-utilities\") pod \"certified-operators-r4wbf\" (UID: \"cb0ff895-b205-40a6-9415-3160ea26c19c\") " pod="openshift-marketplace/certified-operators-r4wbf" Oct 14 13:22:06.795758 master-2 kubenswrapper[4762]: I1014 13:22:06.795754 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb0ff895-b205-40a6-9415-3160ea26c19c-catalog-content\") pod \"certified-operators-r4wbf\" (UID: \"cb0ff895-b205-40a6-9415-3160ea26c19c\") " pod="openshift-marketplace/certified-operators-r4wbf" Oct 14 13:22:06.796006 master-2 kubenswrapper[4762]: I1014 13:22:06.795795 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvtmc\" (UniqueName: \"kubernetes.io/projected/cb0ff895-b205-40a6-9415-3160ea26c19c-kube-api-access-jvtmc\") pod \"certified-operators-r4wbf\" (UID: \"cb0ff895-b205-40a6-9415-3160ea26c19c\") " pod="openshift-marketplace/certified-operators-r4wbf" Oct 14 13:22:06.897918 master-2 kubenswrapper[4762]: I1014 13:22:06.897788 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvtmc\" (UniqueName: \"kubernetes.io/projected/cb0ff895-b205-40a6-9415-3160ea26c19c-kube-api-access-jvtmc\") pod \"certified-operators-r4wbf\" (UID: \"cb0ff895-b205-40a6-9415-3160ea26c19c\") " pod="openshift-marketplace/certified-operators-r4wbf" Oct 14 13:22:06.897918 master-2 kubenswrapper[4762]: I1014 13:22:06.897882 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb0ff895-b205-40a6-9415-3160ea26c19c-utilities\") pod \"certified-operators-r4wbf\" (UID: \"cb0ff895-b205-40a6-9415-3160ea26c19c\") " pod="openshift-marketplace/certified-operators-r4wbf" Oct 14 13:22:06.897918 master-2 kubenswrapper[4762]: I1014 13:22:06.897905 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb0ff895-b205-40a6-9415-3160ea26c19c-catalog-content\") pod \"certified-operators-r4wbf\" (UID: \"cb0ff895-b205-40a6-9415-3160ea26c19c\") " pod="openshift-marketplace/certified-operators-r4wbf" Oct 14 13:22:06.898375 master-2 kubenswrapper[4762]: I1014 13:22:06.898354 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb0ff895-b205-40a6-9415-3160ea26c19c-catalog-content\") pod \"certified-operators-r4wbf\" (UID: \"cb0ff895-b205-40a6-9415-3160ea26c19c\") " pod="openshift-marketplace/certified-operators-r4wbf" Oct 14 13:22:06.898454 master-2 kubenswrapper[4762]: I1014 13:22:06.898394 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb0ff895-b205-40a6-9415-3160ea26c19c-utilities\") pod \"certified-operators-r4wbf\" (UID: \"cb0ff895-b205-40a6-9415-3160ea26c19c\") " 
pod="openshift-marketplace/certified-operators-r4wbf" Oct 14 13:22:06.920957 master-2 kubenswrapper[4762]: I1014 13:22:06.920911 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvtmc\" (UniqueName: \"kubernetes.io/projected/cb0ff895-b205-40a6-9415-3160ea26c19c-kube-api-access-jvtmc\") pod \"certified-operators-r4wbf\" (UID: \"cb0ff895-b205-40a6-9415-3160ea26c19c\") " pod="openshift-marketplace/certified-operators-r4wbf" Oct 14 13:22:06.947046 master-2 kubenswrapper[4762]: I1014 13:22:06.946953 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-thpzb"] Oct 14 13:22:06.950030 master-2 kubenswrapper[4762]: I1014 13:22:06.948181 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-thpzb" Oct 14 13:22:06.968854 master-2 kubenswrapper[4762]: I1014 13:22:06.968793 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-thpzb"] Oct 14 13:22:07.002359 master-2 kubenswrapper[4762]: I1014 13:22:07.001843 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b20b78b5-af9a-4448-bb54-656bf15f46aa-utilities\") pod \"community-operators-thpzb\" (UID: \"b20b78b5-af9a-4448-bb54-656bf15f46aa\") " pod="openshift-marketplace/community-operators-thpzb" Oct 14 13:22:07.002359 master-2 kubenswrapper[4762]: I1014 13:22:07.001892 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b20b78b5-af9a-4448-bb54-656bf15f46aa-catalog-content\") pod \"community-operators-thpzb\" (UID: \"b20b78b5-af9a-4448-bb54-656bf15f46aa\") " pod="openshift-marketplace/community-operators-thpzb" Oct 14 13:22:07.002359 master-2 kubenswrapper[4762]: I1014 13:22:07.001923 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5z8kv\" (UniqueName: \"kubernetes.io/projected/b20b78b5-af9a-4448-bb54-656bf15f46aa-kube-api-access-5z8kv\") pod \"community-operators-thpzb\" (UID: \"b20b78b5-af9a-4448-bb54-656bf15f46aa\") " pod="openshift-marketplace/community-operators-thpzb" Oct 14 13:22:07.077594 master-2 kubenswrapper[4762]: I1014 13:22:07.077448 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-r4wbf" Oct 14 13:22:07.108093 master-2 kubenswrapper[4762]: I1014 13:22:07.102627 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f3a1ec2c-6bca-4616-8c89-060a6d1593e6","Type":"ContainerStarted","Data":"61ef651b05ea840eab2c068d6d58ec8042dcf01fd522926c14c55bb613bf72ae"} Oct 14 13:22:07.108093 master-2 kubenswrapper[4762]: I1014 13:22:07.102785 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5z8kv\" (UniqueName: \"kubernetes.io/projected/b20b78b5-af9a-4448-bb54-656bf15f46aa-kube-api-access-5z8kv\") pod \"community-operators-thpzb\" (UID: \"b20b78b5-af9a-4448-bb54-656bf15f46aa\") " pod="openshift-marketplace/community-operators-thpzb" Oct 14 13:22:07.108093 master-2 kubenswrapper[4762]: I1014 13:22:07.102920 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b20b78b5-af9a-4448-bb54-656bf15f46aa-utilities\") pod \"community-operators-thpzb\" (UID: \"b20b78b5-af9a-4448-bb54-656bf15f46aa\") " pod="openshift-marketplace/community-operators-thpzb" Oct 14 13:22:07.108093 master-2 kubenswrapper[4762]: I1014 13:22:07.102942 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b20b78b5-af9a-4448-bb54-656bf15f46aa-catalog-content\") pod \"community-operators-thpzb\" (UID: \"b20b78b5-af9a-4448-bb54-656bf15f46aa\") " pod="openshift-marketplace/community-operators-thpzb" Oct 14 13:22:07.108093 master-2 kubenswrapper[4762]: I1014 13:22:07.103404 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b20b78b5-af9a-4448-bb54-656bf15f46aa-catalog-content\") pod \"community-operators-thpzb\" (UID: \"b20b78b5-af9a-4448-bb54-656bf15f46aa\") " pod="openshift-marketplace/community-operators-thpzb" Oct 14 13:22:07.108093 master-2 kubenswrapper[4762]: I1014 13:22:07.103674 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b20b78b5-af9a-4448-bb54-656bf15f46aa-utilities\") pod \"community-operators-thpzb\" (UID: \"b20b78b5-af9a-4448-bb54-656bf15f46aa\") " pod="openshift-marketplace/community-operators-thpzb" Oct 14 13:22:07.121185 master-2 kubenswrapper[4762]: I1014 13:22:07.110212 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/revision-pruner-6-master-2" event={"ID":"eddc1afe-8608-4390-89a1-2cf9b4e772ee","Type":"ContainerDied","Data":"7f512a7298dc24097ca3a8943ec576c0d358368dbae66d16318abeacb8135c10"} Oct 14 13:22:07.121185 master-2 kubenswrapper[4762]: I1014 13:22:07.110283 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f512a7298dc24097ca3a8943ec576c0d358368dbae66d16318abeacb8135c10" Oct 14 13:22:07.121185 master-2 kubenswrapper[4762]: I1014 13:22:07.110277 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/revision-pruner-6-master-2" Oct 14 13:22:07.123483 master-2 kubenswrapper[4762]: I1014 13:22:07.122658 4762 generic.go:334] "Generic (PLEG): container finished" podID="d766696a-7ad4-4921-b799-c65f51b60109" containerID="ae7cfe6e99c05c0f56eb6f0718dd8851e4401afd6099b55e40eb235f29c7b3ce" exitCode=0 Oct 14 13:22:07.123483 master-2 kubenswrapper[4762]: I1014 13:22:07.122869 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d766696a-7ad4-4921-b799-c65f51b60109","Type":"ContainerDied","Data":"ae7cfe6e99c05c0f56eb6f0718dd8851e4401afd6099b55e40eb235f29c7b3ce"} Oct 14 13:22:07.123483 master-2 kubenswrapper[4762]: I1014 13:22:07.122925 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d766696a-7ad4-4921-b799-c65f51b60109","Type":"ContainerStarted","Data":"5ae154f5dd833b785fba37d75186a43bfdf1782fe12b291a92538dd025614985"} Oct 14 13:22:07.145970 master-2 kubenswrapper[4762]: I1014 13:22:07.144061 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5z8kv\" (UniqueName: \"kubernetes.io/projected/b20b78b5-af9a-4448-bb54-656bf15f46aa-kube-api-access-5z8kv\") pod \"community-operators-thpzb\" (UID: \"b20b78b5-af9a-4448-bb54-656bf15f46aa\") " pod="openshift-marketplace/community-operators-thpzb" Oct 14 13:22:07.146201 master-2 kubenswrapper[4762]: I1014 13:22:07.146084 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-cc99494f6-kmmxc" event={"ID":"12625659-53e4-4ae4-837b-f5178cfe2681","Type":"ContainerStarted","Data":"dc4173cad493dd0c8272146cba1fc6575a326e6f2aeb82ed256ce45ec0759453"} Oct 14 13:22:07.146201 master-2 kubenswrapper[4762]: I1014 13:22:07.146132 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-cc99494f6-kmmxc" event={"ID":"12625659-53e4-4ae4-837b-f5178cfe2681","Type":"ContainerStarted","Data":"fe0ba52cdfbbfc69de112ad1e30516fd9d39488ad3cd0fd96ca798765f99ec78"} Oct 14 13:22:07.146201 master-2 kubenswrapper[4762]: I1014 13:22:07.146146 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-cc99494f6-kmmxc" event={"ID":"12625659-53e4-4ae4-837b-f5178cfe2681","Type":"ContainerStarted","Data":"20282a3e80127cbc57302f0f24add2927076c54b41a4272e074edb7a62e6570c"} Oct 14 13:22:07.150789 master-2 kubenswrapper[4762]: I1014 13:22:07.147017 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/thanos-querier-cc99494f6-kmmxc" Oct 14 13:22:07.191675 master-2 kubenswrapper[4762]: I1014 13:22:07.191614 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=3.262048903 podStartE2EDuration="9.191596896s" podCreationTimestamp="2025-10-14 13:21:58 +0000 UTC" firstStartedPulling="2025-10-14 13:22:00.013647807 +0000 UTC m=+949.257806966" lastFinishedPulling="2025-10-14 13:22:05.9431958 +0000 UTC m=+955.187354959" observedRunningTime="2025-10-14 13:22:07.151051417 +0000 UTC m=+956.395210586" watchObservedRunningTime="2025-10-14 13:22:07.191596896 +0000 UTC m=+956.435756055" Oct 14 13:22:07.193693 master-2 kubenswrapper[4762]: I1014 13:22:07.193620 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-cc99494f6-kmmxc" podStartSLOduration=2.847845901 podStartE2EDuration="8.19360986s" 
podCreationTimestamp="2025-10-14 13:21:59 +0000 UTC" firstStartedPulling="2025-10-14 13:22:00.655569258 +0000 UTC m=+949.899728417" lastFinishedPulling="2025-10-14 13:22:06.001333217 +0000 UTC m=+955.245492376" observedRunningTime="2025-10-14 13:22:07.191194003 +0000 UTC m=+956.435353172" watchObservedRunningTime="2025-10-14 13:22:07.19360986 +0000 UTC m=+956.437769019" Oct 14 13:22:07.297189 master-2 kubenswrapper[4762]: I1014 13:22:07.295402 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-thpzb" Oct 14 13:22:07.583394 master-2 kubenswrapper[4762]: I1014 13:22:07.583207 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r4wbf"] Oct 14 13:22:07.743234 master-2 kubenswrapper[4762]: I1014 13:22:07.743074 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-thpzb"] Oct 14 13:22:07.763571 master-2 kubenswrapper[4762]: W1014 13:22:07.761054 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb20b78b5_af9a_4448_bb54_656bf15f46aa.slice/crio-c9c33b302f1fd27129aadd0007acd7999f99f686580b71005e7fdf0083324998 WatchSource:0}: Error finding container c9c33b302f1fd27129aadd0007acd7999f99f686580b71005e7fdf0083324998: Status 404 returned error can't find the container with id c9c33b302f1fd27129aadd0007acd7999f99f686580b71005e7fdf0083324998 Oct 14 13:22:08.159335 master-2 kubenswrapper[4762]: I1014 13:22:08.159226 4762 generic.go:334] "Generic (PLEG): container finished" podID="b20b78b5-af9a-4448-bb54-656bf15f46aa" containerID="3a0b44633286f49d9da39f4b5591b415714f6cb678be12f5dc360eb288943c02" exitCode=0 Oct 14 13:22:08.159335 master-2 kubenswrapper[4762]: I1014 13:22:08.159300 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-thpzb" event={"ID":"b20b78b5-af9a-4448-bb54-656bf15f46aa","Type":"ContainerDied","Data":"3a0b44633286f49d9da39f4b5591b415714f6cb678be12f5dc360eb288943c02"} Oct 14 13:22:08.159335 master-2 kubenswrapper[4762]: I1014 13:22:08.159329 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-thpzb" event={"ID":"b20b78b5-af9a-4448-bb54-656bf15f46aa","Type":"ContainerStarted","Data":"c9c33b302f1fd27129aadd0007acd7999f99f686580b71005e7fdf0083324998"} Oct 14 13:22:08.167175 master-2 kubenswrapper[4762]: I1014 13:22:08.167114 4762 generic.go:334] "Generic (PLEG): container finished" podID="cb0ff895-b205-40a6-9415-3160ea26c19c" containerID="e8c7deda382477e6043e0252d5b8bb2fb58404986824ca0ff95aaf1f43a68318" exitCode=0 Oct 14 13:22:08.167278 master-2 kubenswrapper[4762]: I1014 13:22:08.167216 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r4wbf" event={"ID":"cb0ff895-b205-40a6-9415-3160ea26c19c","Type":"ContainerDied","Data":"e8c7deda382477e6043e0252d5b8bb2fb58404986824ca0ff95aaf1f43a68318"} Oct 14 13:22:08.167329 master-2 kubenswrapper[4762]: I1014 13:22:08.167299 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r4wbf" event={"ID":"cb0ff895-b205-40a6-9415-3160ea26c19c","Type":"ContainerStarted","Data":"8b4d3f1061f9efdeee347d7a15d5cefccba0c9bbcb064b6c953cdb67824cb9a7"} Oct 14 13:22:09.163566 master-2 kubenswrapper[4762]: I1014 13:22:09.163517 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7dljg"] 
Oct 14 13:22:09.164893 master-2 kubenswrapper[4762]: I1014 13:22:09.164743 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7dljg" Oct 14 13:22:09.181920 master-2 kubenswrapper[4762]: I1014 13:22:09.180183 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7dljg"] Oct 14 13:22:09.203677 master-2 kubenswrapper[4762]: I1014 13:22:09.203450 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:22:09.219140 master-2 kubenswrapper[4762]: I1014 13:22:09.217407 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-thpzb" event={"ID":"b20b78b5-af9a-4448-bb54-656bf15f46aa","Type":"ContainerStarted","Data":"a9ba7983263e2cbb97fb6e6035eebb225d2ea8a7e01147e540c181108f9177df"} Oct 14 13:22:09.223244 master-2 kubenswrapper[4762]: I1014 13:22:09.221754 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r4wbf" event={"ID":"cb0ff895-b205-40a6-9415-3160ea26c19c","Type":"ContainerStarted","Data":"014125887af0c303183c1f0290b20bac2f8f7e585d0b28252c1a25b923d7a99d"} Oct 14 13:22:09.257225 master-2 kubenswrapper[4762]: I1014 13:22:09.256282 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd256e17-f8ff-4f1d-a2b0-27168fc5262a-utilities\") pod \"redhat-marketplace-7dljg\" (UID: \"fd256e17-f8ff-4f1d-a2b0-27168fc5262a\") " pod="openshift-marketplace/redhat-marketplace-7dljg" Oct 14 13:22:09.257225 master-2 kubenswrapper[4762]: I1014 13:22:09.256406 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzpvp\" (UniqueName: \"kubernetes.io/projected/fd256e17-f8ff-4f1d-a2b0-27168fc5262a-kube-api-access-fzpvp\") pod \"redhat-marketplace-7dljg\" (UID: \"fd256e17-f8ff-4f1d-a2b0-27168fc5262a\") " pod="openshift-marketplace/redhat-marketplace-7dljg" Oct 14 13:22:09.257225 master-2 kubenswrapper[4762]: I1014 13:22:09.256440 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd256e17-f8ff-4f1d-a2b0-27168fc5262a-catalog-content\") pod \"redhat-marketplace-7dljg\" (UID: \"fd256e17-f8ff-4f1d-a2b0-27168fc5262a\") " pod="openshift-marketplace/redhat-marketplace-7dljg" Oct 14 13:22:09.350846 master-2 kubenswrapper[4762]: I1014 13:22:09.350799 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hh4tw"] Oct 14 13:22:09.352009 master-2 kubenswrapper[4762]: I1014 13:22:09.351973 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hh4tw" Oct 14 13:22:09.358464 master-2 kubenswrapper[4762]: I1014 13:22:09.358429 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzpvp\" (UniqueName: \"kubernetes.io/projected/fd256e17-f8ff-4f1d-a2b0-27168fc5262a-kube-api-access-fzpvp\") pod \"redhat-marketplace-7dljg\" (UID: \"fd256e17-f8ff-4f1d-a2b0-27168fc5262a\") " pod="openshift-marketplace/redhat-marketplace-7dljg" Oct 14 13:22:09.358709 master-2 kubenswrapper[4762]: I1014 13:22:09.358499 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd256e17-f8ff-4f1d-a2b0-27168fc5262a-catalog-content\") pod \"redhat-marketplace-7dljg\" (UID: \"fd256e17-f8ff-4f1d-a2b0-27168fc5262a\") " pod="openshift-marketplace/redhat-marketplace-7dljg" Oct 14 13:22:09.358709 master-2 kubenswrapper[4762]: I1014 13:22:09.358590 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd256e17-f8ff-4f1d-a2b0-27168fc5262a-utilities\") pod \"redhat-marketplace-7dljg\" (UID: \"fd256e17-f8ff-4f1d-a2b0-27168fc5262a\") " pod="openshift-marketplace/redhat-marketplace-7dljg" Oct 14 13:22:09.359672 master-2 kubenswrapper[4762]: I1014 13:22:09.359517 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd256e17-f8ff-4f1d-a2b0-27168fc5262a-catalog-content\") pod \"redhat-marketplace-7dljg\" (UID: \"fd256e17-f8ff-4f1d-a2b0-27168fc5262a\") " pod="openshift-marketplace/redhat-marketplace-7dljg" Oct 14 13:22:09.360920 master-2 kubenswrapper[4762]: I1014 13:22:09.359907 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd256e17-f8ff-4f1d-a2b0-27168fc5262a-utilities\") pod \"redhat-marketplace-7dljg\" (UID: \"fd256e17-f8ff-4f1d-a2b0-27168fc5262a\") " pod="openshift-marketplace/redhat-marketplace-7dljg" Oct 14 13:22:09.376144 master-2 kubenswrapper[4762]: I1014 13:22:09.375558 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hh4tw"] Oct 14 13:22:09.397337 master-2 kubenswrapper[4762]: I1014 13:22:09.396927 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzpvp\" (UniqueName: \"kubernetes.io/projected/fd256e17-f8ff-4f1d-a2b0-27168fc5262a-kube-api-access-fzpvp\") pod \"redhat-marketplace-7dljg\" (UID: \"fd256e17-f8ff-4f1d-a2b0-27168fc5262a\") " pod="openshift-marketplace/redhat-marketplace-7dljg" Oct 14 13:22:09.461501 master-2 kubenswrapper[4762]: I1014 13:22:09.460280 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/694a78d6-ba6f-411d-a608-ef49cf924ce0-utilities\") pod \"redhat-operators-hh4tw\" (UID: \"694a78d6-ba6f-411d-a608-ef49cf924ce0\") " pod="openshift-marketplace/redhat-operators-hh4tw" Oct 14 13:22:09.461501 master-2 kubenswrapper[4762]: I1014 13:22:09.460327 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thsgt\" (UniqueName: \"kubernetes.io/projected/694a78d6-ba6f-411d-a608-ef49cf924ce0-kube-api-access-thsgt\") pod \"redhat-operators-hh4tw\" (UID: \"694a78d6-ba6f-411d-a608-ef49cf924ce0\") " pod="openshift-marketplace/redhat-operators-hh4tw" Oct 14 13:22:09.461501 master-2 
kubenswrapper[4762]: I1014 13:22:09.460352 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/694a78d6-ba6f-411d-a608-ef49cf924ce0-catalog-content\") pod \"redhat-operators-hh4tw\" (UID: \"694a78d6-ba6f-411d-a608-ef49cf924ce0\") " pod="openshift-marketplace/redhat-operators-hh4tw" Oct 14 13:22:09.500518 master-2 kubenswrapper[4762]: I1014 13:22:09.500473 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7dljg" Oct 14 13:22:09.562824 master-2 kubenswrapper[4762]: I1014 13:22:09.562774 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/694a78d6-ba6f-411d-a608-ef49cf924ce0-utilities\") pod \"redhat-operators-hh4tw\" (UID: \"694a78d6-ba6f-411d-a608-ef49cf924ce0\") " pod="openshift-marketplace/redhat-operators-hh4tw" Oct 14 13:22:09.562824 master-2 kubenswrapper[4762]: I1014 13:22:09.562817 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thsgt\" (UniqueName: \"kubernetes.io/projected/694a78d6-ba6f-411d-a608-ef49cf924ce0-kube-api-access-thsgt\") pod \"redhat-operators-hh4tw\" (UID: \"694a78d6-ba6f-411d-a608-ef49cf924ce0\") " pod="openshift-marketplace/redhat-operators-hh4tw" Oct 14 13:22:09.563080 master-2 kubenswrapper[4762]: I1014 13:22:09.562850 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/694a78d6-ba6f-411d-a608-ef49cf924ce0-catalog-content\") pod \"redhat-operators-hh4tw\" (UID: \"694a78d6-ba6f-411d-a608-ef49cf924ce0\") " pod="openshift-marketplace/redhat-operators-hh4tw" Oct 14 13:22:09.563363 master-2 kubenswrapper[4762]: I1014 13:22:09.563319 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/694a78d6-ba6f-411d-a608-ef49cf924ce0-utilities\") pod \"redhat-operators-hh4tw\" (UID: \"694a78d6-ba6f-411d-a608-ef49cf924ce0\") " pod="openshift-marketplace/redhat-operators-hh4tw" Oct 14 13:22:09.573992 master-2 kubenswrapper[4762]: I1014 13:22:09.563593 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/694a78d6-ba6f-411d-a608-ef49cf924ce0-catalog-content\") pod \"redhat-operators-hh4tw\" (UID: \"694a78d6-ba6f-411d-a608-ef49cf924ce0\") " pod="openshift-marketplace/redhat-operators-hh4tw" Oct 14 13:22:09.596992 master-2 kubenswrapper[4762]: I1014 13:22:09.596923 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thsgt\" (UniqueName: \"kubernetes.io/projected/694a78d6-ba6f-411d-a608-ef49cf924ce0-kube-api-access-thsgt\") pod \"redhat-operators-hh4tw\" (UID: \"694a78d6-ba6f-411d-a608-ef49cf924ce0\") " pod="openshift-marketplace/redhat-operators-hh4tw" Oct 14 13:22:09.676194 master-2 kubenswrapper[4762]: I1014 13:22:09.670376 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hh4tw" Oct 14 13:22:09.978258 master-2 kubenswrapper[4762]: I1014 13:22:09.977954 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7dljg"] Oct 14 13:22:10.246256 master-2 kubenswrapper[4762]: I1014 13:22:10.241200 4762 generic.go:334] "Generic (PLEG): container finished" podID="b20b78b5-af9a-4448-bb54-656bf15f46aa" containerID="a9ba7983263e2cbb97fb6e6035eebb225d2ea8a7e01147e540c181108f9177df" exitCode=0 Oct 14 13:22:10.246256 master-2 kubenswrapper[4762]: I1014 13:22:10.241276 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-thpzb" event={"ID":"b20b78b5-af9a-4448-bb54-656bf15f46aa","Type":"ContainerDied","Data":"a9ba7983263e2cbb97fb6e6035eebb225d2ea8a7e01147e540c181108f9177df"} Oct 14 13:22:10.271236 master-2 kubenswrapper[4762]: I1014 13:22:10.269991 4762 generic.go:334] "Generic (PLEG): container finished" podID="cb0ff895-b205-40a6-9415-3160ea26c19c" containerID="014125887af0c303183c1f0290b20bac2f8f7e585d0b28252c1a25b923d7a99d" exitCode=0 Oct 14 13:22:10.271236 master-2 kubenswrapper[4762]: I1014 13:22:10.270055 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r4wbf" event={"ID":"cb0ff895-b205-40a6-9415-3160ea26c19c","Type":"ContainerDied","Data":"014125887af0c303183c1f0290b20bac2f8f7e585d0b28252c1a25b923d7a99d"} Oct 14 13:22:10.282197 master-2 kubenswrapper[4762]: I1014 13:22:10.275748 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-cc99494f6-kmmxc" Oct 14 13:22:12.915785 master-2 kubenswrapper[4762]: I1014 13:22:12.915723 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hh4tw"] Oct 14 13:22:13.298798 master-2 kubenswrapper[4762]: I1014 13:22:13.298749 4762 generic.go:334] "Generic (PLEG): container finished" podID="fd256e17-f8ff-4f1d-a2b0-27168fc5262a" containerID="7181532e55cd94c52d06dbe19612850de2e380968992a3bb099fccb8b2d38cf1" exitCode=0 Oct 14 13:22:13.298953 master-2 kubenswrapper[4762]: I1014 13:22:13.298863 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7dljg" event={"ID":"fd256e17-f8ff-4f1d-a2b0-27168fc5262a","Type":"ContainerDied","Data":"7181532e55cd94c52d06dbe19612850de2e380968992a3bb099fccb8b2d38cf1"} Oct 14 13:22:13.299016 master-2 kubenswrapper[4762]: I1014 13:22:13.298959 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7dljg" event={"ID":"fd256e17-f8ff-4f1d-a2b0-27168fc5262a","Type":"ContainerStarted","Data":"fafa1c9d24d02ab7489f2d03dca077dd3cf237a125f1fa9e2bcaf98a4b606d7c"} Oct 14 13:22:13.304517 master-2 kubenswrapper[4762]: I1014 13:22:13.304470 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d766696a-7ad4-4921-b799-c65f51b60109","Type":"ContainerStarted","Data":"8cb6e7f57317f98acf1f86c1f4624231ddf922c2321b7ca8f5ee56a1e4c66274"} Oct 14 13:22:13.304517 master-2 kubenswrapper[4762]: I1014 13:22:13.304504 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d766696a-7ad4-4921-b799-c65f51b60109","Type":"ContainerStarted","Data":"9a307c52f472dbea0756ef79dc081e7c7f818a13fbf1f1667281a7a4465acb42"} Oct 14 13:22:13.304582 master-2 kubenswrapper[4762]: I1014 13:22:13.304516 4762 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d766696a-7ad4-4921-b799-c65f51b60109","Type":"ContainerStarted","Data":"68586ae70faec91fa7469971b38394705160337f7f18cb0eff9978037eda6496"} Oct 14 13:22:13.304582 master-2 kubenswrapper[4762]: I1014 13:22:13.304529 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d766696a-7ad4-4921-b799-c65f51b60109","Type":"ContainerStarted","Data":"3b512fead26f68b391af64674e86d0e7db890a4f2e2def8b93b1f890a60f4978"} Oct 14 13:22:13.304582 master-2 kubenswrapper[4762]: I1014 13:22:13.304538 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d766696a-7ad4-4921-b799-c65f51b60109","Type":"ContainerStarted","Data":"3bcfb13ff771400f754599e7faf9d389ed5a8a630307828d6aeb6fe4ccee077f"} Oct 14 13:22:13.306419 master-2 kubenswrapper[4762]: I1014 13:22:13.306286 4762 generic.go:334] "Generic (PLEG): container finished" podID="694a78d6-ba6f-411d-a608-ef49cf924ce0" containerID="64be655332c62ebc2e7ad8d40a87c2869ad8ff9bf10d5edf62e6e8ad808f69cd" exitCode=0 Oct 14 13:22:13.306419 master-2 kubenswrapper[4762]: I1014 13:22:13.306419 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hh4tw" event={"ID":"694a78d6-ba6f-411d-a608-ef49cf924ce0","Type":"ContainerDied","Data":"64be655332c62ebc2e7ad8d40a87c2869ad8ff9bf10d5edf62e6e8ad808f69cd"} Oct 14 13:22:13.306419 master-2 kubenswrapper[4762]: I1014 13:22:13.306453 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hh4tw" event={"ID":"694a78d6-ba6f-411d-a608-ef49cf924ce0","Type":"ContainerStarted","Data":"de750e5e34687f467bae486a4deddde1d5dd7e7c825f92b236f7aff633e20c0f"} Oct 14 13:22:13.310563 master-2 kubenswrapper[4762]: I1014 13:22:13.310507 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-thpzb" event={"ID":"b20b78b5-af9a-4448-bb54-656bf15f46aa","Type":"ContainerStarted","Data":"a24099ad7badd1258d3aa393520ac4b97128c5c19905baf28485e28594f56133"} Oct 14 13:22:13.318852 master-2 kubenswrapper[4762]: I1014 13:22:13.315286 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r4wbf" event={"ID":"cb0ff895-b205-40a6-9415-3160ea26c19c","Type":"ContainerStarted","Data":"4c2c242407cf6d6fc6dc1cd76cf7393fb785eeb9caf34c198139065fca8e2326"} Oct 14 13:22:13.387764 master-2 kubenswrapper[4762]: I1014 13:22:13.387644 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-thpzb" podStartSLOduration=2.7843401180000003 podStartE2EDuration="7.387617298s" podCreationTimestamp="2025-10-14 13:22:06 +0000 UTC" firstStartedPulling="2025-10-14 13:22:08.161086239 +0000 UTC m=+957.405245398" lastFinishedPulling="2025-10-14 13:22:12.764363419 +0000 UTC m=+962.008522578" observedRunningTime="2025-10-14 13:22:13.377105073 +0000 UTC m=+962.621264232" watchObservedRunningTime="2025-10-14 13:22:13.387617298 +0000 UTC m=+962.631776457" Oct 14 13:22:13.406060 master-2 kubenswrapper[4762]: I1014 13:22:13.405968 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-r4wbf" podStartSLOduration=3.156540675 podStartE2EDuration="7.40594643s" podCreationTimestamp="2025-10-14 13:22:06 +0000 UTC" firstStartedPulling="2025-10-14 13:22:08.169563727 +0000 UTC m=+957.413722886" lastFinishedPulling="2025-10-14 
13:22:12.418969482 +0000 UTC m=+961.663128641" observedRunningTime="2025-10-14 13:22:13.402902323 +0000 UTC m=+962.647061492" watchObservedRunningTime="2025-10-14 13:22:13.40594643 +0000 UTC m=+962.650105599" Oct 14 13:22:14.331898 master-2 kubenswrapper[4762]: I1014 13:22:14.331804 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hh4tw" event={"ID":"694a78d6-ba6f-411d-a608-ef49cf924ce0","Type":"ContainerStarted","Data":"d8f070111070b462f8f13f371a942eca727e9566df5b02713259d59ac85ce22f"} Oct 14 13:22:14.337948 master-2 kubenswrapper[4762]: I1014 13:22:14.337706 4762 generic.go:334] "Generic (PLEG): container finished" podID="fd256e17-f8ff-4f1d-a2b0-27168fc5262a" containerID="61a491ade1dbbb1d673a72b45084afd30df36675190bfec4cbf0368079ac8950" exitCode=0 Oct 14 13:22:14.337948 master-2 kubenswrapper[4762]: I1014 13:22:14.337844 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7dljg" event={"ID":"fd256e17-f8ff-4f1d-a2b0-27168fc5262a","Type":"ContainerDied","Data":"61a491ade1dbbb1d673a72b45084afd30df36675190bfec4cbf0368079ac8950"} Oct 14 13:22:14.346430 master-2 kubenswrapper[4762]: I1014 13:22:14.344742 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d766696a-7ad4-4921-b799-c65f51b60109","Type":"ContainerStarted","Data":"572aea723a01273b8c89bc2d0322d30c45f4601cb9a1fb063f1ae4afb4dfc654"} Oct 14 13:22:14.538703 master-2 kubenswrapper[4762]: I1014 13:22:14.538218 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=4.242928041 podStartE2EDuration="9.538195445s" podCreationTimestamp="2025-10-14 13:22:05 +0000 UTC" firstStartedPulling="2025-10-14 13:22:07.125526166 +0000 UTC m=+956.369685325" lastFinishedPulling="2025-10-14 13:22:12.42079357 +0000 UTC m=+961.664952729" observedRunningTime="2025-10-14 13:22:14.513354465 +0000 UTC m=+963.757513624" watchObservedRunningTime="2025-10-14 13:22:14.538195445 +0000 UTC m=+963.782354644" Oct 14 13:22:15.357223 master-2 kubenswrapper[4762]: I1014 13:22:15.357136 4762 generic.go:334] "Generic (PLEG): container finished" podID="694a78d6-ba6f-411d-a608-ef49cf924ce0" containerID="d8f070111070b462f8f13f371a942eca727e9566df5b02713259d59ac85ce22f" exitCode=0 Oct 14 13:22:15.357223 master-2 kubenswrapper[4762]: I1014 13:22:15.357223 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hh4tw" event={"ID":"694a78d6-ba6f-411d-a608-ef49cf924ce0","Type":"ContainerDied","Data":"d8f070111070b462f8f13f371a942eca727e9566df5b02713259d59ac85ce22f"} Oct 14 13:22:15.362094 master-2 kubenswrapper[4762]: I1014 13:22:15.362011 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7dljg" event={"ID":"fd256e17-f8ff-4f1d-a2b0-27168fc5262a","Type":"ContainerStarted","Data":"f13b009b43015da714426401052be6f98db5ab81b99a240eed47527a08ba9297"} Oct 14 13:22:15.404539 master-2 kubenswrapper[4762]: I1014 13:22:15.404457 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7dljg" podStartSLOduration=4.894303891 podStartE2EDuration="6.404433765s" podCreationTimestamp="2025-10-14 13:22:09 +0000 UTC" firstStartedPulling="2025-10-14 13:22:13.300773828 +0000 UTC m=+962.544932987" lastFinishedPulling="2025-10-14 13:22:14.810903702 +0000 UTC m=+964.055062861" observedRunningTime="2025-10-14 
13:22:15.402482764 +0000 UTC m=+964.646641943" watchObservedRunningTime="2025-10-14 13:22:15.404433765 +0000 UTC m=+964.648592944" Oct 14 13:22:15.697640 master-2 kubenswrapper[4762]: I1014 13:22:15.697579 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:22:17.078941 master-2 kubenswrapper[4762]: I1014 13:22:17.078874 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-r4wbf" Oct 14 13:22:17.079652 master-2 kubenswrapper[4762]: I1014 13:22:17.078967 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-r4wbf" Oct 14 13:22:17.138659 master-2 kubenswrapper[4762]: I1014 13:22:17.138604 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-r4wbf" Oct 14 13:22:17.296952 master-2 kubenswrapper[4762]: I1014 13:22:17.296902 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-thpzb" Oct 14 13:22:17.296952 master-2 kubenswrapper[4762]: I1014 13:22:17.296957 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-thpzb" Oct 14 13:22:17.347270 master-2 kubenswrapper[4762]: I1014 13:22:17.346925 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-thpzb" Oct 14 13:22:17.419808 master-2 kubenswrapper[4762]: I1014 13:22:17.419726 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-thpzb" Oct 14 13:22:17.423422 master-2 kubenswrapper[4762]: I1014 13:22:17.423378 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-r4wbf" Oct 14 13:22:19.393272 master-2 kubenswrapper[4762]: I1014 13:22:19.393143 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-7b6b7bb859-vrzvk_e19cccdb-ac9b-4919-85d8-d7ae33d2d003/multus-admission-controller/0.log" Oct 14 13:22:19.393272 master-2 kubenswrapper[4762]: I1014 13:22:19.393244 4762 generic.go:334] "Generic (PLEG): container finished" podID="e19cccdb-ac9b-4919-85d8-d7ae33d2d003" containerID="b4d790b1636493087694498f992b9121f3a37a50f6ab46979d6eb4f576d882ee" exitCode=137 Oct 14 13:22:19.393832 master-2 kubenswrapper[4762]: I1014 13:22:19.393288 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-7b6b7bb859-vrzvk" event={"ID":"e19cccdb-ac9b-4919-85d8-d7ae33d2d003","Type":"ContainerDied","Data":"b4d790b1636493087694498f992b9121f3a37a50f6ab46979d6eb4f576d882ee"} Oct 14 13:22:19.500897 master-2 kubenswrapper[4762]: I1014 13:22:19.500806 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7dljg" Oct 14 13:22:19.501131 master-2 kubenswrapper[4762]: I1014 13:22:19.501097 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7dljg" Oct 14 13:22:19.562224 master-2 kubenswrapper[4762]: I1014 13:22:19.562175 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7dljg" Oct 14 13:22:19.615612 master-2 kubenswrapper[4762]: I1014 13:22:19.615534 4762 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/certified-operators-r4wbf"] Oct 14 13:22:19.615911 master-2 kubenswrapper[4762]: I1014 13:22:19.615879 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-r4wbf" podUID="cb0ff895-b205-40a6-9415-3160ea26c19c" containerName="registry-server" containerID="cri-o://4c2c242407cf6d6fc6dc1cd76cf7393fb785eeb9caf34c198139065fca8e2326" gracePeriod=2 Oct 14 13:22:20.402130 master-2 kubenswrapper[4762]: I1014 13:22:20.402066 4762 generic.go:334] "Generic (PLEG): container finished" podID="cb0ff895-b205-40a6-9415-3160ea26c19c" containerID="4c2c242407cf6d6fc6dc1cd76cf7393fb785eeb9caf34c198139065fca8e2326" exitCode=0 Oct 14 13:22:20.402704 master-2 kubenswrapper[4762]: I1014 13:22:20.402166 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r4wbf" event={"ID":"cb0ff895-b205-40a6-9415-3160ea26c19c","Type":"ContainerDied","Data":"4c2c242407cf6d6fc6dc1cd76cf7393fb785eeb9caf34c198139065fca8e2326"} Oct 14 13:22:20.446964 master-2 kubenswrapper[4762]: I1014 13:22:20.446899 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7dljg" Oct 14 13:22:20.588518 master-2 kubenswrapper[4762]: I1014 13:22:20.588348 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-thpzb"] Oct 14 13:22:20.588813 master-2 kubenswrapper[4762]: I1014 13:22:20.588580 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-thpzb" podUID="b20b78b5-af9a-4448-bb54-656bf15f46aa" containerName="registry-server" containerID="cri-o://a24099ad7badd1258d3aa393520ac4b97128c5c19905baf28485e28594f56133" gracePeriod=2 Oct 14 13:22:21.418658 master-2 kubenswrapper[4762]: I1014 13:22:21.418594 4762 generic.go:334] "Generic (PLEG): container finished" podID="b20b78b5-af9a-4448-bb54-656bf15f46aa" containerID="a24099ad7badd1258d3aa393520ac4b97128c5c19905baf28485e28594f56133" exitCode=0 Oct 14 13:22:21.419569 master-2 kubenswrapper[4762]: I1014 13:22:21.418676 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-thpzb" event={"ID":"b20b78b5-af9a-4448-bb54-656bf15f46aa","Type":"ContainerDied","Data":"a24099ad7badd1258d3aa393520ac4b97128c5c19905baf28485e28594f56133"} Oct 14 13:22:21.938920 master-2 kubenswrapper[4762]: I1014 13:22:21.938850 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7dljg"] Oct 14 13:22:23.433254 master-2 kubenswrapper[4762]: I1014 13:22:23.433195 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7dljg" podUID="fd256e17-f8ff-4f1d-a2b0-27168fc5262a" containerName="registry-server" containerID="cri-o://f13b009b43015da714426401052be6f98db5ab81b99a240eed47527a08ba9297" gracePeriod=2 Oct 14 13:22:24.440400 master-2 kubenswrapper[4762]: I1014 13:22:24.440332 4762 generic.go:334] "Generic (PLEG): container finished" podID="fd256e17-f8ff-4f1d-a2b0-27168fc5262a" containerID="f13b009b43015da714426401052be6f98db5ab81b99a240eed47527a08ba9297" exitCode=0 Oct 14 13:22:24.440400 master-2 kubenswrapper[4762]: I1014 13:22:24.440391 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7dljg" 
event={"ID":"fd256e17-f8ff-4f1d-a2b0-27168fc5262a","Type":"ContainerDied","Data":"f13b009b43015da714426401052be6f98db5ab81b99a240eed47527a08ba9297"} Oct 14 13:22:25.048250 master-2 kubenswrapper[4762]: I1014 13:22:25.048104 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-668956f9dd-llkhv" podUID="5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3" containerName="console" containerID="cri-o://7129000b66a3dca2c1c6b420d9199bde8dc73c1692662b1ef971cd750d195b34" gracePeriod=15 Oct 14 13:22:25.450519 master-2 kubenswrapper[4762]: I1014 13:22:25.450461 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-668956f9dd-llkhv_5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3/console/0.log" Oct 14 13:22:25.451098 master-2 kubenswrapper[4762]: I1014 13:22:25.450532 4762 generic.go:334] "Generic (PLEG): container finished" podID="5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3" containerID="7129000b66a3dca2c1c6b420d9199bde8dc73c1692662b1ef971cd750d195b34" exitCode=2 Oct 14 13:22:25.451098 master-2 kubenswrapper[4762]: I1014 13:22:25.450572 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-668956f9dd-llkhv" event={"ID":"5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3","Type":"ContainerDied","Data":"7129000b66a3dca2c1c6b420d9199bde8dc73c1692662b1ef971cd750d195b34"} Oct 14 13:22:26.807076 master-2 kubenswrapper[4762]: I1014 13:22:26.806949 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-76c4979bdc-gds6w" Oct 14 13:22:27.079406 master-2 kubenswrapper[4762]: E1014 13:22:27.079238 4762 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4c2c242407cf6d6fc6dc1cd76cf7393fb785eeb9caf34c198139065fca8e2326 is running failed: container process not found" containerID="4c2c242407cf6d6fc6dc1cd76cf7393fb785eeb9caf34c198139065fca8e2326" cmd=["grpc_health_probe","-addr=:50051"] Oct 14 13:22:27.079897 master-2 kubenswrapper[4762]: E1014 13:22:27.079829 4762 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4c2c242407cf6d6fc6dc1cd76cf7393fb785eeb9caf34c198139065fca8e2326 is running failed: container process not found" containerID="4c2c242407cf6d6fc6dc1cd76cf7393fb785eeb9caf34c198139065fca8e2326" cmd=["grpc_health_probe","-addr=:50051"] Oct 14 13:22:27.080497 master-2 kubenswrapper[4762]: E1014 13:22:27.080459 4762 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4c2c242407cf6d6fc6dc1cd76cf7393fb785eeb9caf34c198139065fca8e2326 is running failed: container process not found" containerID="4c2c242407cf6d6fc6dc1cd76cf7393fb785eeb9caf34c198139065fca8e2326" cmd=["grpc_health_probe","-addr=:50051"] Oct 14 13:22:27.080663 master-2 kubenswrapper[4762]: E1014 13:22:27.080499 4762 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4c2c242407cf6d6fc6dc1cd76cf7393fb785eeb9caf34c198139065fca8e2326 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-r4wbf" podUID="cb0ff895-b205-40a6-9415-3160ea26c19c" containerName="registry-server" Oct 14 13:22:27.297262 master-2 kubenswrapper[4762]: E1014 13:22:27.297199 4762 log.go:32] "ExecSync cmd from runtime service failed" 
err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a24099ad7badd1258d3aa393520ac4b97128c5c19905baf28485e28594f56133 is running failed: container process not found" containerID="a24099ad7badd1258d3aa393520ac4b97128c5c19905baf28485e28594f56133" cmd=["grpc_health_probe","-addr=:50051"] Oct 14 13:22:27.297614 master-2 kubenswrapper[4762]: E1014 13:22:27.297535 4762 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a24099ad7badd1258d3aa393520ac4b97128c5c19905baf28485e28594f56133 is running failed: container process not found" containerID="a24099ad7badd1258d3aa393520ac4b97128c5c19905baf28485e28594f56133" cmd=["grpc_health_probe","-addr=:50051"] Oct 14 13:22:27.298012 master-2 kubenswrapper[4762]: E1014 13:22:27.297844 4762 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a24099ad7badd1258d3aa393520ac4b97128c5c19905baf28485e28594f56133 is running failed: container process not found" containerID="a24099ad7badd1258d3aa393520ac4b97128c5c19905baf28485e28594f56133" cmd=["grpc_health_probe","-addr=:50051"] Oct 14 13:22:27.298012 master-2 kubenswrapper[4762]: E1014 13:22:27.297876 4762 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a24099ad7badd1258d3aa393520ac4b97128c5c19905baf28485e28594f56133 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-thpzb" podUID="b20b78b5-af9a-4448-bb54-656bf15f46aa" containerName="registry-server" Oct 14 13:22:28.934030 master-2 kubenswrapper[4762]: I1014 13:22:28.933989 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r4wbf" Oct 14 13:22:29.024561 master-2 kubenswrapper[4762]: I1014 13:22:29.024469 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-7b6b7bb859-vrzvk_e19cccdb-ac9b-4919-85d8-d7ae33d2d003/multus-admission-controller/0.log" Oct 14 13:22:29.024658 master-2 kubenswrapper[4762]: I1014 13:22:29.024569 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-7b6b7bb859-vrzvk" Oct 14 13:22:29.041201 master-2 kubenswrapper[4762]: I1014 13:22:29.041090 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb0ff895-b205-40a6-9415-3160ea26c19c-catalog-content\") pod \"cb0ff895-b205-40a6-9415-3160ea26c19c\" (UID: \"cb0ff895-b205-40a6-9415-3160ea26c19c\") " Oct 14 13:22:29.041695 master-2 kubenswrapper[4762]: I1014 13:22:29.041254 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb0ff895-b205-40a6-9415-3160ea26c19c-utilities\") pod \"cb0ff895-b205-40a6-9415-3160ea26c19c\" (UID: \"cb0ff895-b205-40a6-9415-3160ea26c19c\") " Oct 14 13:22:29.041695 master-2 kubenswrapper[4762]: I1014 13:22:29.041325 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvtmc\" (UniqueName: \"kubernetes.io/projected/cb0ff895-b205-40a6-9415-3160ea26c19c-kube-api-access-jvtmc\") pod \"cb0ff895-b205-40a6-9415-3160ea26c19c\" (UID: \"cb0ff895-b205-40a6-9415-3160ea26c19c\") " Oct 14 13:22:29.042205 master-2 kubenswrapper[4762]: I1014 13:22:29.042136 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb0ff895-b205-40a6-9415-3160ea26c19c-utilities" (OuterVolumeSpecName: "utilities") pod "cb0ff895-b205-40a6-9415-3160ea26c19c" (UID: "cb0ff895-b205-40a6-9415-3160ea26c19c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:22:29.045823 master-2 kubenswrapper[4762]: I1014 13:22:29.045758 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb0ff895-b205-40a6-9415-3160ea26c19c-kube-api-access-jvtmc" (OuterVolumeSpecName: "kube-api-access-jvtmc") pod "cb0ff895-b205-40a6-9415-3160ea26c19c" (UID: "cb0ff895-b205-40a6-9415-3160ea26c19c"). InnerVolumeSpecName "kube-api-access-jvtmc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:22:29.084765 master-2 kubenswrapper[4762]: I1014 13:22:29.084713 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb0ff895-b205-40a6-9415-3160ea26c19c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cb0ff895-b205-40a6-9415-3160ea26c19c" (UID: "cb0ff895-b205-40a6-9415-3160ea26c19c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:22:29.142617 master-2 kubenswrapper[4762]: I1014 13:22:29.142552 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9t24\" (UniqueName: \"kubernetes.io/projected/e19cccdb-ac9b-4919-85d8-d7ae33d2d003-kube-api-access-j9t24\") pod \"e19cccdb-ac9b-4919-85d8-d7ae33d2d003\" (UID: \"e19cccdb-ac9b-4919-85d8-d7ae33d2d003\") " Oct 14 13:22:29.142872 master-2 kubenswrapper[4762]: I1014 13:22:29.142641 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e19cccdb-ac9b-4919-85d8-d7ae33d2d003-webhook-certs\") pod \"e19cccdb-ac9b-4919-85d8-d7ae33d2d003\" (UID: \"e19cccdb-ac9b-4919-85d8-d7ae33d2d003\") " Oct 14 13:22:29.143072 master-2 kubenswrapper[4762]: I1014 13:22:29.143044 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb0ff895-b205-40a6-9415-3160ea26c19c-utilities\") on node \"master-2\" DevicePath \"\"" Oct 14 13:22:29.143072 master-2 kubenswrapper[4762]: I1014 13:22:29.143066 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvtmc\" (UniqueName: \"kubernetes.io/projected/cb0ff895-b205-40a6-9415-3160ea26c19c-kube-api-access-jvtmc\") on node \"master-2\" DevicePath \"\"" Oct 14 13:22:29.143173 master-2 kubenswrapper[4762]: I1014 13:22:29.143077 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb0ff895-b205-40a6-9415-3160ea26c19c-catalog-content\") on node \"master-2\" DevicePath \"\"" Oct 14 13:22:29.146769 master-2 kubenswrapper[4762]: I1014 13:22:29.146745 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e19cccdb-ac9b-4919-85d8-d7ae33d2d003-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "e19cccdb-ac9b-4919-85d8-d7ae33d2d003" (UID: "e19cccdb-ac9b-4919-85d8-d7ae33d2d003"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:22:29.146951 master-2 kubenswrapper[4762]: I1014 13:22:29.146906 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e19cccdb-ac9b-4919-85d8-d7ae33d2d003-kube-api-access-j9t24" (OuterVolumeSpecName: "kube-api-access-j9t24") pod "e19cccdb-ac9b-4919-85d8-d7ae33d2d003" (UID: "e19cccdb-ac9b-4919-85d8-d7ae33d2d003"). InnerVolumeSpecName "kube-api-access-j9t24". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:22:29.149357 master-2 kubenswrapper[4762]: I1014 13:22:29.149316 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-thpzb" Oct 14 13:22:29.174876 master-2 kubenswrapper[4762]: I1014 13:22:29.174835 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7dljg" Oct 14 13:22:29.177920 master-2 kubenswrapper[4762]: I1014 13:22:29.177869 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-668956f9dd-llkhv_5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3/console/0.log" Oct 14 13:22:29.178008 master-2 kubenswrapper[4762]: I1014 13:22:29.177955 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-668956f9dd-llkhv" Oct 14 13:22:29.234880 master-2 kubenswrapper[4762]: I1014 13:22:29.234809 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:22:29.244563 master-2 kubenswrapper[4762]: I1014 13:22:29.244512 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9t24\" (UniqueName: \"kubernetes.io/projected/e19cccdb-ac9b-4919-85d8-d7ae33d2d003-kube-api-access-j9t24\") on node \"master-2\" DevicePath \"\"" Oct 14 13:22:29.244667 master-2 kubenswrapper[4762]: I1014 13:22:29.244578 4762 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e19cccdb-ac9b-4919-85d8-d7ae33d2d003-webhook-certs\") on node \"master-2\" DevicePath \"\"" Oct 14 13:22:29.349812 master-2 kubenswrapper[4762]: I1014 13:22:29.346840 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3-console-serving-cert\") pod \"5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3\" (UID: \"5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3\") " Oct 14 13:22:29.349812 master-2 kubenswrapper[4762]: I1014 13:22:29.346930 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3-console-oauth-config\") pod \"5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3\" (UID: \"5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3\") " Oct 14 13:22:29.349812 master-2 kubenswrapper[4762]: I1014 13:22:29.346990 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5z8kv\" (UniqueName: \"kubernetes.io/projected/b20b78b5-af9a-4448-bb54-656bf15f46aa-kube-api-access-5z8kv\") pod \"b20b78b5-af9a-4448-bb54-656bf15f46aa\" (UID: \"b20b78b5-af9a-4448-bb54-656bf15f46aa\") " Oct 14 13:22:29.349812 master-2 kubenswrapper[4762]: I1014 13:22:29.347026 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzpvp\" (UniqueName: \"kubernetes.io/projected/fd256e17-f8ff-4f1d-a2b0-27168fc5262a-kube-api-access-fzpvp\") pod \"fd256e17-f8ff-4f1d-a2b0-27168fc5262a\" (UID: \"fd256e17-f8ff-4f1d-a2b0-27168fc5262a\") " Oct 14 13:22:29.349812 master-2 kubenswrapper[4762]: I1014 13:22:29.347093 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd256e17-f8ff-4f1d-a2b0-27168fc5262a-catalog-content\") pod \"fd256e17-f8ff-4f1d-a2b0-27168fc5262a\" (UID: \"fd256e17-f8ff-4f1d-a2b0-27168fc5262a\") " Oct 14 13:22:29.349812 master-2 kubenswrapper[4762]: I1014 13:22:29.347151 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd256e17-f8ff-4f1d-a2b0-27168fc5262a-utilities\") pod \"fd256e17-f8ff-4f1d-a2b0-27168fc5262a\" (UID: \"fd256e17-f8ff-4f1d-a2b0-27168fc5262a\") " Oct 14 13:22:29.349812 master-2 kubenswrapper[4762]: I1014 13:22:29.347274 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3-service-ca\") pod \"5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3\" (UID: \"5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3\") " Oct 14 13:22:29.349812 master-2 kubenswrapper[4762]: I1014 13:22:29.347305 4762 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3-oauth-serving-cert\") pod \"5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3\" (UID: \"5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3\") " Oct 14 13:22:29.349812 master-2 kubenswrapper[4762]: I1014 13:22:29.347326 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b20b78b5-af9a-4448-bb54-656bf15f46aa-utilities\") pod \"b20b78b5-af9a-4448-bb54-656bf15f46aa\" (UID: \"b20b78b5-af9a-4448-bb54-656bf15f46aa\") " Oct 14 13:22:29.349812 master-2 kubenswrapper[4762]: I1014 13:22:29.347359 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b20b78b5-af9a-4448-bb54-656bf15f46aa-catalog-content\") pod \"b20b78b5-af9a-4448-bb54-656bf15f46aa\" (UID: \"b20b78b5-af9a-4448-bb54-656bf15f46aa\") " Oct 14 13:22:29.349812 master-2 kubenswrapper[4762]: I1014 13:22:29.347378 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mn89\" (UniqueName: \"kubernetes.io/projected/5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3-kube-api-access-6mn89\") pod \"5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3\" (UID: \"5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3\") " Oct 14 13:22:29.349812 master-2 kubenswrapper[4762]: I1014 13:22:29.347474 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3-console-config\") pod \"5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3\" (UID: \"5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3\") " Oct 14 13:22:29.349812 master-2 kubenswrapper[4762]: I1014 13:22:29.349662 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3-console-config" (OuterVolumeSpecName: "console-config") pod "5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3" (UID: "5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:22:29.352278 master-2 kubenswrapper[4762]: I1014 13:22:29.351237 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3-service-ca" (OuterVolumeSpecName: "service-ca") pod "5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3" (UID: "5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:22:29.352278 master-2 kubenswrapper[4762]: I1014 13:22:29.352228 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3" (UID: "5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:22:29.352606 master-2 kubenswrapper[4762]: I1014 13:22:29.352571 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd256e17-f8ff-4f1d-a2b0-27168fc5262a-utilities" (OuterVolumeSpecName: "utilities") pod "fd256e17-f8ff-4f1d-a2b0-27168fc5262a" (UID: "fd256e17-f8ff-4f1d-a2b0-27168fc5262a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:22:29.352762 master-2 kubenswrapper[4762]: I1014 13:22:29.352707 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b20b78b5-af9a-4448-bb54-656bf15f46aa-utilities" (OuterVolumeSpecName: "utilities") pod "b20b78b5-af9a-4448-bb54-656bf15f46aa" (UID: "b20b78b5-af9a-4448-bb54-656bf15f46aa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:22:29.354057 master-2 kubenswrapper[4762]: I1014 13:22:29.353994 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3" (UID: "5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:22:29.354057 master-2 kubenswrapper[4762]: I1014 13:22:29.354012 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd256e17-f8ff-4f1d-a2b0-27168fc5262a-kube-api-access-fzpvp" (OuterVolumeSpecName: "kube-api-access-fzpvp") pod "fd256e17-f8ff-4f1d-a2b0-27168fc5262a" (UID: "fd256e17-f8ff-4f1d-a2b0-27168fc5262a"). InnerVolumeSpecName "kube-api-access-fzpvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:22:29.354057 master-2 kubenswrapper[4762]: I1014 13:22:29.354042 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b20b78b5-af9a-4448-bb54-656bf15f46aa-kube-api-access-5z8kv" (OuterVolumeSpecName: "kube-api-access-5z8kv") pod "b20b78b5-af9a-4448-bb54-656bf15f46aa" (UID: "b20b78b5-af9a-4448-bb54-656bf15f46aa"). InnerVolumeSpecName "kube-api-access-5z8kv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:22:29.354999 master-2 kubenswrapper[4762]: I1014 13:22:29.354944 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3-kube-api-access-6mn89" (OuterVolumeSpecName: "kube-api-access-6mn89") pod "5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3" (UID: "5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3"). InnerVolumeSpecName "kube-api-access-6mn89". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:22:29.355541 master-2 kubenswrapper[4762]: I1014 13:22:29.355464 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3" (UID: "5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:22:29.366705 master-2 kubenswrapper[4762]: I1014 13:22:29.366632 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd256e17-f8ff-4f1d-a2b0-27168fc5262a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fd256e17-f8ff-4f1d-a2b0-27168fc5262a" (UID: "fd256e17-f8ff-4f1d-a2b0-27168fc5262a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:22:29.411182 master-2 kubenswrapper[4762]: I1014 13:22:29.411093 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b20b78b5-af9a-4448-bb54-656bf15f46aa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b20b78b5-af9a-4448-bb54-656bf15f46aa" (UID: "b20b78b5-af9a-4448-bb54-656bf15f46aa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:22:29.451959 master-2 kubenswrapper[4762]: I1014 13:22:29.451866 4762 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3-service-ca\") on node \"master-2\" DevicePath \"\"" Oct 14 13:22:29.451959 master-2 kubenswrapper[4762]: I1014 13:22:29.451922 4762 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3-oauth-serving-cert\") on node \"master-2\" DevicePath \"\"" Oct 14 13:22:29.451959 master-2 kubenswrapper[4762]: I1014 13:22:29.451942 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b20b78b5-af9a-4448-bb54-656bf15f46aa-utilities\") on node \"master-2\" DevicePath \"\"" Oct 14 13:22:29.451959 master-2 kubenswrapper[4762]: I1014 13:22:29.451955 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b20b78b5-af9a-4448-bb54-656bf15f46aa-catalog-content\") on node \"master-2\" DevicePath \"\"" Oct 14 13:22:29.451959 master-2 kubenswrapper[4762]: I1014 13:22:29.451967 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mn89\" (UniqueName: \"kubernetes.io/projected/5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3-kube-api-access-6mn89\") on node \"master-2\" DevicePath \"\"" Oct 14 13:22:29.451959 master-2 kubenswrapper[4762]: I1014 13:22:29.451979 4762 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3-console-config\") on node \"master-2\" DevicePath \"\"" Oct 14 13:22:29.452513 master-2 kubenswrapper[4762]: I1014 13:22:29.451993 4762 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3-console-serving-cert\") on node \"master-2\" DevicePath \"\"" Oct 14 13:22:29.452513 master-2 kubenswrapper[4762]: I1014 13:22:29.452010 4762 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3-console-oauth-config\") on node \"master-2\" DevicePath \"\"" Oct 14 13:22:29.452513 master-2 kubenswrapper[4762]: I1014 13:22:29.452027 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5z8kv\" (UniqueName: \"kubernetes.io/projected/b20b78b5-af9a-4448-bb54-656bf15f46aa-kube-api-access-5z8kv\") on node \"master-2\" DevicePath \"\"" Oct 14 13:22:29.452513 master-2 kubenswrapper[4762]: I1014 13:22:29.452043 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzpvp\" (UniqueName: \"kubernetes.io/projected/fd256e17-f8ff-4f1d-a2b0-27168fc5262a-kube-api-access-fzpvp\") on node \"master-2\" DevicePath \"\"" Oct 14 13:22:29.452513 master-2 kubenswrapper[4762]: I1014 13:22:29.452059 4762 reconciler_common.go:293] "Volume detached for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd256e17-f8ff-4f1d-a2b0-27168fc5262a-catalog-content\") on node \"master-2\" DevicePath \"\"" Oct 14 13:22:29.452513 master-2 kubenswrapper[4762]: I1014 13:22:29.452074 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd256e17-f8ff-4f1d-a2b0-27168fc5262a-utilities\") on node \"master-2\" DevicePath \"\"" Oct 14 13:22:29.481209 master-2 kubenswrapper[4762]: I1014 13:22:29.481133 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7dljg" Oct 14 13:22:29.481340 master-2 kubenswrapper[4762]: I1014 13:22:29.481120 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7dljg" event={"ID":"fd256e17-f8ff-4f1d-a2b0-27168fc5262a","Type":"ContainerDied","Data":"fafa1c9d24d02ab7489f2d03dca077dd3cf237a125f1fa9e2bcaf98a4b606d7c"} Oct 14 13:22:29.481340 master-2 kubenswrapper[4762]: I1014 13:22:29.481283 4762 scope.go:117] "RemoveContainer" containerID="f13b009b43015da714426401052be6f98db5ab81b99a240eed47527a08ba9297" Oct 14 13:22:29.482778 master-2 kubenswrapper[4762]: I1014 13:22:29.482655 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-668956f9dd-llkhv_5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3/console/0.log" Oct 14 13:22:29.482778 master-2 kubenswrapper[4762]: I1014 13:22:29.482721 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-668956f9dd-llkhv" Oct 14 13:22:29.482898 master-2 kubenswrapper[4762]: I1014 13:22:29.482716 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-668956f9dd-llkhv" event={"ID":"5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3","Type":"ContainerDied","Data":"9c26b92861f7ceb2c1298738b1da256332fd54464d0a2e6cf4ba71ee617daf3e"} Oct 14 13:22:29.485215 master-2 kubenswrapper[4762]: I1014 13:22:29.484548 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-7b6b7bb859-vrzvk_e19cccdb-ac9b-4919-85d8-d7ae33d2d003/multus-admission-controller/0.log" Oct 14 13:22:29.485455 master-2 kubenswrapper[4762]: I1014 13:22:29.485428 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-7b6b7bb859-vrzvk" Oct 14 13:22:29.485455 master-2 kubenswrapper[4762]: I1014 13:22:29.485414 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-7b6b7bb859-vrzvk" event={"ID":"e19cccdb-ac9b-4919-85d8-d7ae33d2d003","Type":"ContainerDied","Data":"830f07bc415dab46f5d47ff4e752b5723c9ad2b1073b43f1e1ddd13a0813a515"} Oct 14 13:22:29.488607 master-2 kubenswrapper[4762]: I1014 13:22:29.488568 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-thpzb" event={"ID":"b20b78b5-af9a-4448-bb54-656bf15f46aa","Type":"ContainerDied","Data":"c9c33b302f1fd27129aadd0007acd7999f99f686580b71005e7fdf0083324998"} Oct 14 13:22:29.488681 master-2 kubenswrapper[4762]: I1014 13:22:29.488668 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-thpzb" Oct 14 13:22:29.497598 master-2 kubenswrapper[4762]: I1014 13:22:29.497557 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r4wbf" event={"ID":"cb0ff895-b205-40a6-9415-3160ea26c19c","Type":"ContainerDied","Data":"8b4d3f1061f9efdeee347d7a15d5cefccba0c9bbcb064b6c953cdb67824cb9a7"} Oct 14 13:22:29.497759 master-2 kubenswrapper[4762]: I1014 13:22:29.497703 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r4wbf" Oct 14 13:22:29.505373 master-2 kubenswrapper[4762]: I1014 13:22:29.505341 4762 scope.go:117] "RemoveContainer" containerID="61a491ade1dbbb1d673a72b45084afd30df36675190bfec4cbf0368079ac8950" Oct 14 13:22:29.565833 master-2 kubenswrapper[4762]: I1014 13:22:29.565775 4762 scope.go:117] "RemoveContainer" containerID="7181532e55cd94c52d06dbe19612850de2e380968992a3bb099fccb8b2d38cf1" Oct 14 13:22:29.598453 master-2 kubenswrapper[4762]: I1014 13:22:29.598404 4762 scope.go:117] "RemoveContainer" containerID="7129000b66a3dca2c1c6b420d9199bde8dc73c1692662b1ef971cd750d195b34" Oct 14 13:22:29.618919 master-2 kubenswrapper[4762]: I1014 13:22:29.618870 4762 scope.go:117] "RemoveContainer" containerID="c8dfd15a32985bd086d3b821b1137ea95df04ba46fa985bc8180bf62869ea7f9" Oct 14 13:22:29.634330 master-2 kubenswrapper[4762]: I1014 13:22:29.634300 4762 scope.go:117] "RemoveContainer" containerID="b4d790b1636493087694498f992b9121f3a37a50f6ab46979d6eb4f576d882ee" Oct 14 13:22:29.649127 master-2 kubenswrapper[4762]: I1014 13:22:29.649065 4762 scope.go:117] "RemoveContainer" containerID="a24099ad7badd1258d3aa393520ac4b97128c5c19905baf28485e28594f56133" Oct 14 13:22:29.663609 master-2 kubenswrapper[4762]: I1014 13:22:29.663527 4762 scope.go:117] "RemoveContainer" containerID="a9ba7983263e2cbb97fb6e6035eebb225d2ea8a7e01147e540c181108f9177df" Oct 14 13:22:29.680745 master-2 kubenswrapper[4762]: I1014 13:22:29.680705 4762 scope.go:117] "RemoveContainer" containerID="3a0b44633286f49d9da39f4b5591b415714f6cb678be12f5dc360eb288943c02" Oct 14 13:22:29.698711 master-2 kubenswrapper[4762]: I1014 13:22:29.698659 4762 scope.go:117] "RemoveContainer" containerID="4c2c242407cf6d6fc6dc1cd76cf7393fb785eeb9caf34c198139065fca8e2326" Oct 14 13:22:29.716689 master-2 kubenswrapper[4762]: I1014 13:22:29.716607 4762 scope.go:117] "RemoveContainer" containerID="014125887af0c303183c1f0290b20bac2f8f7e585d0b28252c1a25b923d7a99d" Oct 14 13:22:29.738419 master-2 kubenswrapper[4762]: I1014 13:22:29.738287 4762 scope.go:117] "RemoveContainer" containerID="e8c7deda382477e6043e0252d5b8bb2fb58404986824ca0ff95aaf1f43a68318" Oct 14 13:22:32.008266 master-2 kubenswrapper[4762]: I1014 13:22:32.007745 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-668956f9dd-llkhv"] Oct 14 13:22:32.067647 master-2 kubenswrapper[4762]: I1014 13:22:32.067560 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-668956f9dd-llkhv"] Oct 14 13:22:32.163498 master-2 kubenswrapper[4762]: I1014 13:22:32.163464 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r4wbf"] Oct 14 13:22:32.177804 master-2 kubenswrapper[4762]: I1014 13:22:32.177725 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-r4wbf"] Oct 14 13:22:32.245069 master-2 kubenswrapper[4762]: I1014 13:22:32.245007 4762 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7dljg"] Oct 14 13:22:32.273231 master-2 kubenswrapper[4762]: I1014 13:22:32.273166 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7dljg"] Oct 14 13:22:32.541422 master-2 kubenswrapper[4762]: I1014 13:22:32.541346 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-65bb9777fc-sd822" event={"ID":"09d92233-a8b3-458a-8c27-f62e982a9d90","Type":"ContainerStarted","Data":"4dc1f9176eae72ed9a8b5c6db938c5e7454cdbca3d2c2b88b7e8d3e28bfcb435"} Oct 14 13:22:32.542583 master-2 kubenswrapper[4762]: I1014 13:22:32.542548 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-65bb9777fc-sd822" Oct 14 13:22:32.544315 master-2 kubenswrapper[4762]: I1014 13:22:32.544275 4762 patch_prober.go:28] interesting pod/downloads-65bb9777fc-sd822 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.129.0.67:8080/\": dial tcp 10.129.0.67:8080: connect: connection refused" start-of-body= Oct 14 13:22:32.544452 master-2 kubenswrapper[4762]: I1014 13:22:32.544327 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-65bb9777fc-sd822" podUID="09d92233-a8b3-458a-8c27-f62e982a9d90" containerName="download-server" probeResult="failure" output="Get \"http://10.129.0.67:8080/\": dial tcp 10.129.0.67:8080: connect: connection refused" Oct 14 13:22:32.545608 master-2 kubenswrapper[4762]: I1014 13:22:32.545574 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hh4tw" event={"ID":"694a78d6-ba6f-411d-a608-ef49cf924ce0","Type":"ContainerStarted","Data":"840dc4498bd536250a7398a788a49331e3dac50e2646fa576882a945111e1a5a"} Oct 14 13:22:32.648604 master-2 kubenswrapper[4762]: I1014 13:22:32.648535 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-554dc689f9-c5k9h"] Oct 14 13:22:32.648923 master-2 kubenswrapper[4762]: E1014 13:22:32.648886 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b20b78b5-af9a-4448-bb54-656bf15f46aa" containerName="extract-content" Oct 14 13:22:32.648923 master-2 kubenswrapper[4762]: I1014 13:22:32.648911 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="b20b78b5-af9a-4448-bb54-656bf15f46aa" containerName="extract-content" Oct 14 13:22:32.649055 master-2 kubenswrapper[4762]: E1014 13:22:32.648930 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb0ff895-b205-40a6-9415-3160ea26c19c" containerName="registry-server" Oct 14 13:22:32.649055 master-2 kubenswrapper[4762]: I1014 13:22:32.648943 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb0ff895-b205-40a6-9415-3160ea26c19c" containerName="registry-server" Oct 14 13:22:32.649055 master-2 kubenswrapper[4762]: E1014 13:22:32.648963 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e19cccdb-ac9b-4919-85d8-d7ae33d2d003" containerName="multus-admission-controller" Oct 14 13:22:32.649055 master-2 kubenswrapper[4762]: I1014 13:22:32.648978 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="e19cccdb-ac9b-4919-85d8-d7ae33d2d003" containerName="multus-admission-controller" Oct 14 13:22:32.649055 master-2 kubenswrapper[4762]: E1014 13:22:32.649002 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb0ff895-b205-40a6-9415-3160ea26c19c" containerName="extract-content" Oct 14 
13:22:32.649055 master-2 kubenswrapper[4762]: I1014 13:22:32.649013 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb0ff895-b205-40a6-9415-3160ea26c19c" containerName="extract-content" Oct 14 13:22:32.649055 master-2 kubenswrapper[4762]: E1014 13:22:32.649027 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b20b78b5-af9a-4448-bb54-656bf15f46aa" containerName="registry-server" Oct 14 13:22:32.649055 master-2 kubenswrapper[4762]: I1014 13:22:32.649040 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="b20b78b5-af9a-4448-bb54-656bf15f46aa" containerName="registry-server" Oct 14 13:22:32.649055 master-2 kubenswrapper[4762]: E1014 13:22:32.649058 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b20b78b5-af9a-4448-bb54-656bf15f46aa" containerName="extract-utilities" Oct 14 13:22:32.649604 master-2 kubenswrapper[4762]: I1014 13:22:32.649071 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="b20b78b5-af9a-4448-bb54-656bf15f46aa" containerName="extract-utilities" Oct 14 13:22:32.649604 master-2 kubenswrapper[4762]: E1014 13:22:32.649087 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3" containerName="console" Oct 14 13:22:32.649604 master-2 kubenswrapper[4762]: I1014 13:22:32.649099 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3" containerName="console" Oct 14 13:22:32.649604 master-2 kubenswrapper[4762]: E1014 13:22:32.649117 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb0ff895-b205-40a6-9415-3160ea26c19c" containerName="extract-utilities" Oct 14 13:22:32.649604 master-2 kubenswrapper[4762]: I1014 13:22:32.649127 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb0ff895-b205-40a6-9415-3160ea26c19c" containerName="extract-utilities" Oct 14 13:22:32.649604 master-2 kubenswrapper[4762]: E1014 13:22:32.649144 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd256e17-f8ff-4f1d-a2b0-27168fc5262a" containerName="extract-content" Oct 14 13:22:32.649604 master-2 kubenswrapper[4762]: I1014 13:22:32.649178 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd256e17-f8ff-4f1d-a2b0-27168fc5262a" containerName="extract-content" Oct 14 13:22:32.649604 master-2 kubenswrapper[4762]: E1014 13:22:32.649192 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e19cccdb-ac9b-4919-85d8-d7ae33d2d003" containerName="kube-rbac-proxy" Oct 14 13:22:32.649604 master-2 kubenswrapper[4762]: I1014 13:22:32.649202 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="e19cccdb-ac9b-4919-85d8-d7ae33d2d003" containerName="kube-rbac-proxy" Oct 14 13:22:32.649604 master-2 kubenswrapper[4762]: E1014 13:22:32.649214 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd256e17-f8ff-4f1d-a2b0-27168fc5262a" containerName="extract-utilities" Oct 14 13:22:32.649604 master-2 kubenswrapper[4762]: I1014 13:22:32.649224 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd256e17-f8ff-4f1d-a2b0-27168fc5262a" containerName="extract-utilities" Oct 14 13:22:32.649604 master-2 kubenswrapper[4762]: E1014 13:22:32.649249 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd256e17-f8ff-4f1d-a2b0-27168fc5262a" containerName="registry-server" Oct 14 13:22:32.649604 master-2 kubenswrapper[4762]: I1014 13:22:32.649260 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd256e17-f8ff-4f1d-a2b0-27168fc5262a" 
containerName="registry-server" Oct 14 13:22:32.649604 master-2 kubenswrapper[4762]: I1014 13:22:32.649413 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb0ff895-b205-40a6-9415-3160ea26c19c" containerName="registry-server" Oct 14 13:22:32.649604 master-2 kubenswrapper[4762]: I1014 13:22:32.649431 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3" containerName="console" Oct 14 13:22:32.649604 master-2 kubenswrapper[4762]: I1014 13:22:32.649452 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="b20b78b5-af9a-4448-bb54-656bf15f46aa" containerName="registry-server" Oct 14 13:22:32.649604 master-2 kubenswrapper[4762]: I1014 13:22:32.649468 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="e19cccdb-ac9b-4919-85d8-d7ae33d2d003" containerName="multus-admission-controller" Oct 14 13:22:32.649604 master-2 kubenswrapper[4762]: I1014 13:22:32.649488 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd256e17-f8ff-4f1d-a2b0-27168fc5262a" containerName="registry-server" Oct 14 13:22:32.649604 master-2 kubenswrapper[4762]: I1014 13:22:32.649502 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="e19cccdb-ac9b-4919-85d8-d7ae33d2d003" containerName="kube-rbac-proxy" Oct 14 13:22:32.650716 master-2 kubenswrapper[4762]: I1014 13:22:32.650412 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-554dc689f9-c5k9h" Oct 14 13:22:32.653621 master-2 kubenswrapper[4762]: I1014 13:22:32.653564 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Oct 14 13:22:32.653621 master-2 kubenswrapper[4762]: I1014 13:22:32.653581 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-r2r7j" Oct 14 13:22:32.658171 master-2 kubenswrapper[4762]: I1014 13:22:32.653846 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Oct 14 13:22:32.658171 master-2 kubenswrapper[4762]: I1014 13:22:32.653941 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Oct 14 13:22:32.658171 master-2 kubenswrapper[4762]: I1014 13:22:32.655527 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 14 13:22:32.658171 master-2 kubenswrapper[4762]: I1014 13:22:32.655848 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 14 13:22:32.665235 master-2 kubenswrapper[4762]: I1014 13:22:32.665193 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 14 13:22:32.695220 master-2 kubenswrapper[4762]: I1014 13:22:32.695119 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d-console-oauth-config\") pod \"console-554dc689f9-c5k9h\" (UID: \"7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d\") " pod="openshift-console/console-554dc689f9-c5k9h" Oct 14 13:22:32.695431 master-2 kubenswrapper[4762]: I1014 13:22:32.695235 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d-service-ca\") pod 
\"console-554dc689f9-c5k9h\" (UID: \"7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d\") " pod="openshift-console/console-554dc689f9-c5k9h" Oct 14 13:22:32.695431 master-2 kubenswrapper[4762]: I1014 13:22:32.695306 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d-trusted-ca-bundle\") pod \"console-554dc689f9-c5k9h\" (UID: \"7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d\") " pod="openshift-console/console-554dc689f9-c5k9h" Oct 14 13:22:32.695431 master-2 kubenswrapper[4762]: I1014 13:22:32.695351 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d-oauth-serving-cert\") pod \"console-554dc689f9-c5k9h\" (UID: \"7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d\") " pod="openshift-console/console-554dc689f9-c5k9h" Oct 14 13:22:32.695431 master-2 kubenswrapper[4762]: I1014 13:22:32.695412 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d-console-serving-cert\") pod \"console-554dc689f9-c5k9h\" (UID: \"7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d\") " pod="openshift-console/console-554dc689f9-c5k9h" Oct 14 13:22:32.695634 master-2 kubenswrapper[4762]: I1014 13:22:32.695450 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d-console-config\") pod \"console-554dc689f9-c5k9h\" (UID: \"7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d\") " pod="openshift-console/console-554dc689f9-c5k9h" Oct 14 13:22:32.695634 master-2 kubenswrapper[4762]: I1014 13:22:32.695483 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cd8x\" (UniqueName: \"kubernetes.io/projected/7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d-kube-api-access-4cd8x\") pod \"console-554dc689f9-c5k9h\" (UID: \"7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d\") " pod="openshift-console/console-554dc689f9-c5k9h" Oct 14 13:22:32.797014 master-2 kubenswrapper[4762]: I1014 13:22:32.796886 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d-console-oauth-config\") pod \"console-554dc689f9-c5k9h\" (UID: \"7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d\") " pod="openshift-console/console-554dc689f9-c5k9h" Oct 14 13:22:32.797014 master-2 kubenswrapper[4762]: I1014 13:22:32.796978 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d-service-ca\") pod \"console-554dc689f9-c5k9h\" (UID: \"7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d\") " pod="openshift-console/console-554dc689f9-c5k9h" Oct 14 13:22:32.797257 master-2 kubenswrapper[4762]: I1014 13:22:32.797044 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d-trusted-ca-bundle\") pod \"console-554dc689f9-c5k9h\" (UID: \"7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d\") " pod="openshift-console/console-554dc689f9-c5k9h" Oct 14 13:22:32.797257 master-2 kubenswrapper[4762]: I1014 13:22:32.797082 4762 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d-oauth-serving-cert\") pod \"console-554dc689f9-c5k9h\" (UID: \"7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d\") " pod="openshift-console/console-554dc689f9-c5k9h" Oct 14 13:22:32.797257 master-2 kubenswrapper[4762]: I1014 13:22:32.797117 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d-console-serving-cert\") pod \"console-554dc689f9-c5k9h\" (UID: \"7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d\") " pod="openshift-console/console-554dc689f9-c5k9h" Oct 14 13:22:32.797257 master-2 kubenswrapper[4762]: I1014 13:22:32.797140 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d-console-config\") pod \"console-554dc689f9-c5k9h\" (UID: \"7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d\") " pod="openshift-console/console-554dc689f9-c5k9h" Oct 14 13:22:32.797257 master-2 kubenswrapper[4762]: I1014 13:22:32.797180 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cd8x\" (UniqueName: \"kubernetes.io/projected/7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d-kube-api-access-4cd8x\") pod \"console-554dc689f9-c5k9h\" (UID: \"7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d\") " pod="openshift-console/console-554dc689f9-c5k9h" Oct 14 13:22:32.798554 master-2 kubenswrapper[4762]: I1014 13:22:32.798517 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d-oauth-serving-cert\") pod \"console-554dc689f9-c5k9h\" (UID: \"7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d\") " pod="openshift-console/console-554dc689f9-c5k9h" Oct 14 13:22:32.798685 master-2 kubenswrapper[4762]: I1014 13:22:32.798642 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d-console-config\") pod \"console-554dc689f9-c5k9h\" (UID: \"7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d\") " pod="openshift-console/console-554dc689f9-c5k9h" Oct 14 13:22:32.799352 master-2 kubenswrapper[4762]: I1014 13:22:32.799302 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d-service-ca\") pod \"console-554dc689f9-c5k9h\" (UID: \"7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d\") " pod="openshift-console/console-554dc689f9-c5k9h" Oct 14 13:22:32.799883 master-2 kubenswrapper[4762]: I1014 13:22:32.799819 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d-trusted-ca-bundle\") pod \"console-554dc689f9-c5k9h\" (UID: \"7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d\") " pod="openshift-console/console-554dc689f9-c5k9h" Oct 14 13:22:32.802185 master-2 kubenswrapper[4762]: I1014 13:22:32.802120 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d-console-oauth-config\") pod \"console-554dc689f9-c5k9h\" (UID: \"7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d\") " pod="openshift-console/console-554dc689f9-c5k9h" Oct 14 
13:22:32.803557 master-2 kubenswrapper[4762]: I1014 13:22:32.803486 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d-console-serving-cert\") pod \"console-554dc689f9-c5k9h\" (UID: \"7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d\") " pod="openshift-console/console-554dc689f9-c5k9h" Oct 14 13:22:32.827524 master-2 kubenswrapper[4762]: I1014 13:22:32.827470 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-554dc689f9-c5k9h"] Oct 14 13:22:32.844141 master-2 kubenswrapper[4762]: I1014 13:22:32.844064 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cd8x\" (UniqueName: \"kubernetes.io/projected/7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d-kube-api-access-4cd8x\") pod \"console-554dc689f9-c5k9h\" (UID: \"7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d\") " pod="openshift-console/console-554dc689f9-c5k9h" Oct 14 13:22:32.972197 master-2 kubenswrapper[4762]: I1014 13:22:32.972059 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-554dc689f9-c5k9h" Oct 14 13:22:33.542521 master-2 kubenswrapper[4762]: I1014 13:22:33.542455 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/multus-admission-controller-7b6b7bb859-vrzvk"] Oct 14 13:22:33.553780 master-2 kubenswrapper[4762]: I1014 13:22:33.553722 4762 patch_prober.go:28] interesting pod/downloads-65bb9777fc-sd822 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.129.0.67:8080/\": dial tcp 10.129.0.67:8080: connect: connection refused" start-of-body= Oct 14 13:22:33.554013 master-2 kubenswrapper[4762]: I1014 13:22:33.553788 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-65bb9777fc-sd822" podUID="09d92233-a8b3-458a-8c27-f62e982a9d90" containerName="download-server" probeResult="failure" output="Get \"http://10.129.0.67:8080/\": dial tcp 10.129.0.67:8080: connect: connection refused" Oct 14 13:22:33.559175 master-2 kubenswrapper[4762]: I1014 13:22:33.559097 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3" path="/var/lib/kubelet/pods/5c6d76f3-e8b9-4a4f-9c8e-28776b79c2a3/volumes" Oct 14 13:22:33.560034 master-2 kubenswrapper[4762]: I1014 13:22:33.559996 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb0ff895-b205-40a6-9415-3160ea26c19c" path="/var/lib/kubelet/pods/cb0ff895-b205-40a6-9415-3160ea26c19c/volumes" Oct 14 13:22:33.561285 master-2 kubenswrapper[4762]: I1014 13:22:33.561237 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd256e17-f8ff-4f1d-a2b0-27168fc5262a" path="/var/lib/kubelet/pods/fd256e17-f8ff-4f1d-a2b0-27168fc5262a/volumes" Oct 14 13:22:34.069371 master-2 kubenswrapper[4762]: I1014 13:22:34.069287 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/multus-admission-controller-7b6b7bb859-vrzvk"] Oct 14 13:22:34.076955 master-2 kubenswrapper[4762]: W1014 13:22:34.076857 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b40f213_fb3c_4692_ac1a_f6a8e3dd0f7d.slice/crio-68699cf50ddb6de45bd7c8683a7ec2d177cb6f35bbe1c743077e0c63066f5761 WatchSource:0}: Error finding container 68699cf50ddb6de45bd7c8683a7ec2d177cb6f35bbe1c743077e0c63066f5761: Status 404 returned error can't find 
the container with id 68699cf50ddb6de45bd7c8683a7ec2d177cb6f35bbe1c743077e0c63066f5761 Oct 14 13:22:34.081201 master-2 kubenswrapper[4762]: I1014 13:22:34.081088 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-554dc689f9-c5k9h"] Oct 14 13:22:34.563332 master-2 kubenswrapper[4762]: I1014 13:22:34.563214 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-554dc689f9-c5k9h" event={"ID":"7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d","Type":"ContainerStarted","Data":"9179376181c8cf0140daf02269d045ff57d8f106632284ef45b08ee2f82b226b"} Oct 14 13:22:34.563332 master-2 kubenswrapper[4762]: I1014 13:22:34.563287 4762 patch_prober.go:28] interesting pod/downloads-65bb9777fc-sd822 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.129.0.67:8080/\": dial tcp 10.129.0.67:8080: connect: connection refused" start-of-body= Oct 14 13:22:34.563332 master-2 kubenswrapper[4762]: I1014 13:22:34.563317 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-554dc689f9-c5k9h" event={"ID":"7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d","Type":"ContainerStarted","Data":"68699cf50ddb6de45bd7c8683a7ec2d177cb6f35bbe1c743077e0c63066f5761"} Oct 14 13:22:34.563332 master-2 kubenswrapper[4762]: I1014 13:22:34.563342 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-65bb9777fc-sd822" podUID="09d92233-a8b3-458a-8c27-f62e982a9d90" containerName="download-server" probeResult="failure" output="Get \"http://10.129.0.67:8080/\": dial tcp 10.129.0.67:8080: connect: connection refused" Oct 14 13:22:35.557850 master-2 kubenswrapper[4762]: I1014 13:22:35.557746 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e19cccdb-ac9b-4919-85d8-d7ae33d2d003" path="/var/lib/kubelet/pods/e19cccdb-ac9b-4919-85d8-d7ae33d2d003/volumes" Oct 14 13:22:35.651468 master-2 kubenswrapper[4762]: I1014 13:22:35.651395 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-thpzb"] Oct 14 13:22:36.215726 master-2 kubenswrapper[4762]: I1014 13:22:36.215624 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-thpzb"] Oct 14 13:22:37.199598 master-2 kubenswrapper[4762]: I1014 13:22:37.199043 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hh4tw" podStartSLOduration=9.690958187 podStartE2EDuration="28.199013917s" podCreationTimestamp="2025-10-14 13:22:09 +0000 UTC" firstStartedPulling="2025-10-14 13:22:13.308214884 +0000 UTC m=+962.552374043" lastFinishedPulling="2025-10-14 13:22:31.816270574 +0000 UTC m=+981.060429773" observedRunningTime="2025-10-14 13:22:36.868921566 +0000 UTC m=+986.113080735" watchObservedRunningTime="2025-10-14 13:22:37.199013917 +0000 UTC m=+986.443173116" Oct 14 13:22:37.560922 master-2 kubenswrapper[4762]: I1014 13:22:37.560758 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b20b78b5-af9a-4448-bb54-656bf15f46aa" path="/var/lib/kubelet/pods/b20b78b5-af9a-4448-bb54-656bf15f46aa/volumes" Oct 14 13:22:37.604334 master-2 kubenswrapper[4762]: I1014 13:22:37.604213 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-554dc689f9-c5k9h" podStartSLOduration=38.604185335 podStartE2EDuration="38.604185335s" podCreationTimestamp="2025-10-14 13:21:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:22:37.318929198 +0000 UTC m=+986.563088447" watchObservedRunningTime="2025-10-14 13:22:37.604185335 +0000 UTC m=+986.848344514" Oct 14 13:22:39.673269 master-2 kubenswrapper[4762]: I1014 13:22:39.673102 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hh4tw" Oct 14 13:22:39.674031 master-2 kubenswrapper[4762]: I1014 13:22:39.673428 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hh4tw" Oct 14 13:22:39.715818 master-2 kubenswrapper[4762]: I1014 13:22:39.715673 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hh4tw" Oct 14 13:22:40.553564 master-2 kubenswrapper[4762]: I1014 13:22:40.553446 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-65bb9777fc-sd822" podStartSLOduration=9.687749041 podStartE2EDuration="49.553427327s" podCreationTimestamp="2025-10-14 13:21:51 +0000 UTC" firstStartedPulling="2025-10-14 13:21:52.184675397 +0000 UTC m=+941.428834596" lastFinishedPulling="2025-10-14 13:22:32.050353723 +0000 UTC m=+981.294512882" observedRunningTime="2025-10-14 13:22:38.482136117 +0000 UTC m=+987.726295286" watchObservedRunningTime="2025-10-14 13:22:40.553427327 +0000 UTC m=+989.797586486" Oct 14 13:22:40.663638 master-2 kubenswrapper[4762]: I1014 13:22:40.663569 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hh4tw" Oct 14 13:22:41.772500 master-2 kubenswrapper[4762]: I1014 13:22:41.772448 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-65bb9777fc-sd822" Oct 14 13:22:41.779412 master-2 kubenswrapper[4762]: I1014 13:22:41.779349 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hh4tw"] Oct 14 13:22:41.895267 master-2 kubenswrapper[4762]: I1014 13:22:41.895042 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-55df5b4c9d-k6sz4"] Oct 14 13:22:42.976496 master-2 kubenswrapper[4762]: I1014 13:22:42.976262 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-554dc689f9-c5k9h" Oct 14 13:22:42.976496 master-2 kubenswrapper[4762]: I1014 13:22:42.976329 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-554dc689f9-c5k9h" Oct 14 13:22:42.977102 master-2 kubenswrapper[4762]: I1014 13:22:42.976582 4762 patch_prober.go:28] interesting pod/console-554dc689f9-c5k9h container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.129.0.78:8443/health\": dial tcp 10.129.0.78:8443: connect: connection refused" start-of-body= Oct 14 13:22:42.977102 master-2 kubenswrapper[4762]: I1014 13:22:42.976637 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-554dc689f9-c5k9h" podUID="7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d" containerName="console" probeResult="failure" output="Get \"https://10.129.0.78:8443/health\": dial tcp 10.129.0.78:8443: connect: connection refused" Oct 14 13:22:43.630809 master-2 kubenswrapper[4762]: I1014 13:22:43.630686 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hh4tw" 
podUID="694a78d6-ba6f-411d-a608-ef49cf924ce0" containerName="registry-server" containerID="cri-o://840dc4498bd536250a7398a788a49331e3dac50e2646fa576882a945111e1a5a" gracePeriod=2 Oct 14 13:22:44.641372 master-2 kubenswrapper[4762]: I1014 13:22:44.641277 4762 generic.go:334] "Generic (PLEG): container finished" podID="694a78d6-ba6f-411d-a608-ef49cf924ce0" containerID="840dc4498bd536250a7398a788a49331e3dac50e2646fa576882a945111e1a5a" exitCode=0 Oct 14 13:22:44.641372 master-2 kubenswrapper[4762]: I1014 13:22:44.641366 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hh4tw" event={"ID":"694a78d6-ba6f-411d-a608-ef49cf924ce0","Type":"ContainerDied","Data":"840dc4498bd536250a7398a788a49331e3dac50e2646fa576882a945111e1a5a"} Oct 14 13:22:45.686919 master-2 kubenswrapper[4762]: I1014 13:22:45.686846 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hh4tw" Oct 14 13:22:45.815865 master-2 kubenswrapper[4762]: I1014 13:22:45.815792 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/694a78d6-ba6f-411d-a608-ef49cf924ce0-catalog-content\") pod \"694a78d6-ba6f-411d-a608-ef49cf924ce0\" (UID: \"694a78d6-ba6f-411d-a608-ef49cf924ce0\") " Oct 14 13:22:45.815865 master-2 kubenswrapper[4762]: I1014 13:22:45.815863 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thsgt\" (UniqueName: \"kubernetes.io/projected/694a78d6-ba6f-411d-a608-ef49cf924ce0-kube-api-access-thsgt\") pod \"694a78d6-ba6f-411d-a608-ef49cf924ce0\" (UID: \"694a78d6-ba6f-411d-a608-ef49cf924ce0\") " Oct 14 13:22:45.816172 master-2 kubenswrapper[4762]: I1014 13:22:45.815983 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/694a78d6-ba6f-411d-a608-ef49cf924ce0-utilities\") pod \"694a78d6-ba6f-411d-a608-ef49cf924ce0\" (UID: \"694a78d6-ba6f-411d-a608-ef49cf924ce0\") " Oct 14 13:22:45.817051 master-2 kubenswrapper[4762]: I1014 13:22:45.816963 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/694a78d6-ba6f-411d-a608-ef49cf924ce0-utilities" (OuterVolumeSpecName: "utilities") pod "694a78d6-ba6f-411d-a608-ef49cf924ce0" (UID: "694a78d6-ba6f-411d-a608-ef49cf924ce0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:22:45.818814 master-2 kubenswrapper[4762]: I1014 13:22:45.818760 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/694a78d6-ba6f-411d-a608-ef49cf924ce0-kube-api-access-thsgt" (OuterVolumeSpecName: "kube-api-access-thsgt") pod "694a78d6-ba6f-411d-a608-ef49cf924ce0" (UID: "694a78d6-ba6f-411d-a608-ef49cf924ce0"). InnerVolumeSpecName "kube-api-access-thsgt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:22:45.917831 master-2 kubenswrapper[4762]: I1014 13:22:45.917795 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/694a78d6-ba6f-411d-a608-ef49cf924ce0-utilities\") on node \"master-2\" DevicePath \"\"" Oct 14 13:22:45.917831 master-2 kubenswrapper[4762]: I1014 13:22:45.917827 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thsgt\" (UniqueName: \"kubernetes.io/projected/694a78d6-ba6f-411d-a608-ef49cf924ce0-kube-api-access-thsgt\") on node \"master-2\" DevicePath \"\"" Oct 14 13:22:45.919780 master-2 kubenswrapper[4762]: I1014 13:22:45.919745 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/694a78d6-ba6f-411d-a608-ef49cf924ce0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "694a78d6-ba6f-411d-a608-ef49cf924ce0" (UID: "694a78d6-ba6f-411d-a608-ef49cf924ce0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:22:46.020586 master-2 kubenswrapper[4762]: I1014 13:22:46.020487 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/694a78d6-ba6f-411d-a608-ef49cf924ce0-catalog-content\") on node \"master-2\" DevicePath \"\"" Oct 14 13:22:46.657029 master-2 kubenswrapper[4762]: I1014 13:22:46.656935 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hh4tw" event={"ID":"694a78d6-ba6f-411d-a608-ef49cf924ce0","Type":"ContainerDied","Data":"de750e5e34687f467bae486a4deddde1d5dd7e7c825f92b236f7aff633e20c0f"} Oct 14 13:22:46.657029 master-2 kubenswrapper[4762]: I1014 13:22:46.657004 4762 scope.go:117] "RemoveContainer" containerID="840dc4498bd536250a7398a788a49331e3dac50e2646fa576882a945111e1a5a" Oct 14 13:22:46.657029 master-2 kubenswrapper[4762]: I1014 13:22:46.657021 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hh4tw" Oct 14 13:22:46.677646 master-2 kubenswrapper[4762]: I1014 13:22:46.677592 4762 scope.go:117] "RemoveContainer" containerID="d8f070111070b462f8f13f371a942eca727e9566df5b02713259d59ac85ce22f" Oct 14 13:22:46.697412 master-2 kubenswrapper[4762]: I1014 13:22:46.697355 4762 scope.go:117] "RemoveContainer" containerID="64be655332c62ebc2e7ad8d40a87c2869ad8ff9bf10d5edf62e6e8ad808f69cd" Oct 14 13:22:51.373819 master-2 kubenswrapper[4762]: I1014 13:22:51.373687 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hh4tw"] Oct 14 13:22:52.377816 master-2 kubenswrapper[4762]: I1014 13:22:52.377714 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hh4tw"] Oct 14 13:22:52.979382 master-2 kubenswrapper[4762]: I1014 13:22:52.979326 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-554dc689f9-c5k9h" Oct 14 13:22:52.983286 master-2 kubenswrapper[4762]: I1014 13:22:52.983255 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-554dc689f9-c5k9h" Oct 14 13:22:53.558742 master-2 kubenswrapper[4762]: I1014 13:22:53.558683 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="694a78d6-ba6f-411d-a608-ef49cf924ce0" path="/var/lib/kubelet/pods/694a78d6-ba6f-411d-a608-ef49cf924ce0/volumes" Oct 14 13:22:56.513387 master-2 kubenswrapper[4762]: I1014 13:22:56.513283 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-66975b7c4d-j962d"] Oct 14 13:22:56.514263 master-2 kubenswrapper[4762]: I1014 13:22:56.513588 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-66975b7c4d-j962d" podUID="058b0ff2-1e70-4446-a498-f94548dfb60f" containerName="controller-manager" containerID="cri-o://9bab3e536969570ee72e6bf05efc6ecd02915cea57317888a08acff985a4a3c1" gracePeriod=30 Oct 14 13:22:56.669731 master-2 kubenswrapper[4762]: I1014 13:22:56.669632 4762 patch_prober.go:28] interesting pod/controller-manager-66975b7c4d-j962d container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.129.0.59:8443/healthz\": dial tcp 10.129.0.59:8443: connect: connection refused" start-of-body= Oct 14 13:22:56.669731 master-2 kubenswrapper[4762]: I1014 13:22:56.669717 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-66975b7c4d-j962d" podUID="058b0ff2-1e70-4446-a498-f94548dfb60f" containerName="controller-manager" probeResult="failure" output="Get \"https://10.129.0.59:8443/healthz\": dial tcp 10.129.0.59:8443: connect: connection refused" Oct 14 13:22:57.651497 master-2 kubenswrapper[4762]: I1014 13:22:57.651434 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-66975b7c4d-j962d" Oct 14 13:22:57.732111 master-2 kubenswrapper[4762]: I1014 13:22:57.732020 4762 generic.go:334] "Generic (PLEG): container finished" podID="058b0ff2-1e70-4446-a498-f94548dfb60f" containerID="9bab3e536969570ee72e6bf05efc6ecd02915cea57317888a08acff985a4a3c1" exitCode=0 Oct 14 13:22:57.732111 master-2 kubenswrapper[4762]: I1014 13:22:57.732082 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66975b7c4d-j962d" event={"ID":"058b0ff2-1e70-4446-a498-f94548dfb60f","Type":"ContainerDied","Data":"9bab3e536969570ee72e6bf05efc6ecd02915cea57317888a08acff985a4a3c1"} Oct 14 13:22:57.732554 master-2 kubenswrapper[4762]: I1014 13:22:57.732139 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-66975b7c4d-j962d" Oct 14 13:22:57.732554 master-2 kubenswrapper[4762]: I1014 13:22:57.732214 4762 scope.go:117] "RemoveContainer" containerID="9bab3e536969570ee72e6bf05efc6ecd02915cea57317888a08acff985a4a3c1" Oct 14 13:22:57.732554 master-2 kubenswrapper[4762]: I1014 13:22:57.732187 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66975b7c4d-j962d" event={"ID":"058b0ff2-1e70-4446-a498-f94548dfb60f","Type":"ContainerDied","Data":"93aeb62040e784adfc31664e25e9fc5f0b5e71bef30624b7ca84a50c711033d8"} Oct 14 13:22:57.750550 master-2 kubenswrapper[4762]: I1014 13:22:57.750470 4762 scope.go:117] "RemoveContainer" containerID="9bab3e536969570ee72e6bf05efc6ecd02915cea57317888a08acff985a4a3c1" Oct 14 13:22:57.751123 master-2 kubenswrapper[4762]: E1014 13:22:57.751052 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bab3e536969570ee72e6bf05efc6ecd02915cea57317888a08acff985a4a3c1\": container with ID starting with 9bab3e536969570ee72e6bf05efc6ecd02915cea57317888a08acff985a4a3c1 not found: ID does not exist" containerID="9bab3e536969570ee72e6bf05efc6ecd02915cea57317888a08acff985a4a3c1" Oct 14 13:22:57.751355 master-2 kubenswrapper[4762]: I1014 13:22:57.751131 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bab3e536969570ee72e6bf05efc6ecd02915cea57317888a08acff985a4a3c1"} err="failed to get container status \"9bab3e536969570ee72e6bf05efc6ecd02915cea57317888a08acff985a4a3c1\": rpc error: code = NotFound desc = could not find container \"9bab3e536969570ee72e6bf05efc6ecd02915cea57317888a08acff985a4a3c1\": container with ID starting with 9bab3e536969570ee72e6bf05efc6ecd02915cea57317888a08acff985a4a3c1 not found: ID does not exist" Oct 14 13:22:57.793747 master-2 kubenswrapper[4762]: I1014 13:22:57.793702 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/058b0ff2-1e70-4446-a498-f94548dfb60f-config\") pod \"058b0ff2-1e70-4446-a498-f94548dfb60f\" (UID: \"058b0ff2-1e70-4446-a498-f94548dfb60f\") " Oct 14 13:22:57.793976 master-2 kubenswrapper[4762]: I1014 13:22:57.793950 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/058b0ff2-1e70-4446-a498-f94548dfb60f-client-ca\") pod \"058b0ff2-1e70-4446-a498-f94548dfb60f\" (UID: \"058b0ff2-1e70-4446-a498-f94548dfb60f\") " Oct 14 13:22:57.794361 master-2 kubenswrapper[4762]: I1014 13:22:57.794328 4762 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffjvk\" (UniqueName: \"kubernetes.io/projected/058b0ff2-1e70-4446-a498-f94548dfb60f-kube-api-access-ffjvk\") pod \"058b0ff2-1e70-4446-a498-f94548dfb60f\" (UID: \"058b0ff2-1e70-4446-a498-f94548dfb60f\") " Oct 14 13:22:57.794569 master-2 kubenswrapper[4762]: I1014 13:22:57.794543 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/058b0ff2-1e70-4446-a498-f94548dfb60f-proxy-ca-bundles\") pod \"058b0ff2-1e70-4446-a498-f94548dfb60f\" (UID: \"058b0ff2-1e70-4446-a498-f94548dfb60f\") " Oct 14 13:22:57.794789 master-2 kubenswrapper[4762]: I1014 13:22:57.794765 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/058b0ff2-1e70-4446-a498-f94548dfb60f-serving-cert\") pod \"058b0ff2-1e70-4446-a498-f94548dfb60f\" (UID: \"058b0ff2-1e70-4446-a498-f94548dfb60f\") " Oct 14 13:22:57.795374 master-2 kubenswrapper[4762]: I1014 13:22:57.795299 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/058b0ff2-1e70-4446-a498-f94548dfb60f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "058b0ff2-1e70-4446-a498-f94548dfb60f" (UID: "058b0ff2-1e70-4446-a498-f94548dfb60f"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:22:57.795508 master-2 kubenswrapper[4762]: I1014 13:22:57.794953 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/058b0ff2-1e70-4446-a498-f94548dfb60f-config" (OuterVolumeSpecName: "config") pod "058b0ff2-1e70-4446-a498-f94548dfb60f" (UID: "058b0ff2-1e70-4446-a498-f94548dfb60f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:22:57.795919 master-2 kubenswrapper[4762]: I1014 13:22:57.795575 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/058b0ff2-1e70-4446-a498-f94548dfb60f-client-ca" (OuterVolumeSpecName: "client-ca") pod "058b0ff2-1e70-4446-a498-f94548dfb60f" (UID: "058b0ff2-1e70-4446-a498-f94548dfb60f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:22:57.799038 master-2 kubenswrapper[4762]: I1014 13:22:57.798965 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/058b0ff2-1e70-4446-a498-f94548dfb60f-kube-api-access-ffjvk" (OuterVolumeSpecName: "kube-api-access-ffjvk") pod "058b0ff2-1e70-4446-a498-f94548dfb60f" (UID: "058b0ff2-1e70-4446-a498-f94548dfb60f"). InnerVolumeSpecName "kube-api-access-ffjvk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:22:57.799384 master-2 kubenswrapper[4762]: I1014 13:22:57.799323 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/058b0ff2-1e70-4446-a498-f94548dfb60f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "058b0ff2-1e70-4446-a498-f94548dfb60f" (UID: "058b0ff2-1e70-4446-a498-f94548dfb60f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:22:57.857002 master-2 kubenswrapper[4762]: I1014 13:22:57.856872 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76f4d8cd68-t98ml"] Oct 14 13:22:57.857233 master-2 kubenswrapper[4762]: I1014 13:22:57.857110 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-76f4d8cd68-t98ml" podUID="75e2cdd2-4fc7-42c9-999c-5f9e50010bc7" containerName="route-controller-manager" containerID="cri-o://31b56d0e37db7c135741716819f2bf077d732605950f8f5192dd60e5eecaccb9" gracePeriod=30 Oct 14 13:22:57.904074 master-2 kubenswrapper[4762]: I1014 13:22:57.903080 4762 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/058b0ff2-1e70-4446-a498-f94548dfb60f-client-ca\") on node \"master-2\" DevicePath \"\"" Oct 14 13:22:57.904074 master-2 kubenswrapper[4762]: I1014 13:22:57.903131 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffjvk\" (UniqueName: \"kubernetes.io/projected/058b0ff2-1e70-4446-a498-f94548dfb60f-kube-api-access-ffjvk\") on node \"master-2\" DevicePath \"\"" Oct 14 13:22:57.904074 master-2 kubenswrapper[4762]: I1014 13:22:57.903170 4762 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/058b0ff2-1e70-4446-a498-f94548dfb60f-proxy-ca-bundles\") on node \"master-2\" DevicePath \"\"" Oct 14 13:22:57.904074 master-2 kubenswrapper[4762]: I1014 13:22:57.903223 4762 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/058b0ff2-1e70-4446-a498-f94548dfb60f-serving-cert\") on node \"master-2\" DevicePath \"\"" Oct 14 13:22:57.904074 master-2 kubenswrapper[4762]: I1014 13:22:57.903242 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/058b0ff2-1e70-4446-a498-f94548dfb60f-config\") on node \"master-2\" DevicePath \"\"" Oct 14 13:22:58.079597 master-2 kubenswrapper[4762]: I1014 13:22:58.079542 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-66975b7c4d-j962d"] Oct 14 13:22:58.092622 master-2 kubenswrapper[4762]: I1014 13:22:58.092547 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-66975b7c4d-j962d"] Oct 14 13:22:58.288289 master-2 kubenswrapper[4762]: I1014 13:22:58.288254 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76f4d8cd68-t98ml" Oct 14 13:22:58.307746 master-2 kubenswrapper[4762]: I1014 13:22:58.307677 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/75e2cdd2-4fc7-42c9-999c-5f9e50010bc7-client-ca\") pod \"75e2cdd2-4fc7-42c9-999c-5f9e50010bc7\" (UID: \"75e2cdd2-4fc7-42c9-999c-5f9e50010bc7\") " Oct 14 13:22:58.307936 master-2 kubenswrapper[4762]: I1014 13:22:58.307806 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75e2cdd2-4fc7-42c9-999c-5f9e50010bc7-serving-cert\") pod \"75e2cdd2-4fc7-42c9-999c-5f9e50010bc7\" (UID: \"75e2cdd2-4fc7-42c9-999c-5f9e50010bc7\") " Oct 14 13:22:58.307936 master-2 kubenswrapper[4762]: I1014 13:22:58.307873 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xk5d\" (UniqueName: \"kubernetes.io/projected/75e2cdd2-4fc7-42c9-999c-5f9e50010bc7-kube-api-access-5xk5d\") pod \"75e2cdd2-4fc7-42c9-999c-5f9e50010bc7\" (UID: \"75e2cdd2-4fc7-42c9-999c-5f9e50010bc7\") " Oct 14 13:22:58.308014 master-2 kubenswrapper[4762]: I1014 13:22:58.307959 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75e2cdd2-4fc7-42c9-999c-5f9e50010bc7-config\") pod \"75e2cdd2-4fc7-42c9-999c-5f9e50010bc7\" (UID: \"75e2cdd2-4fc7-42c9-999c-5f9e50010bc7\") " Oct 14 13:22:58.308413 master-2 kubenswrapper[4762]: I1014 13:22:58.308350 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75e2cdd2-4fc7-42c9-999c-5f9e50010bc7-client-ca" (OuterVolumeSpecName: "client-ca") pod "75e2cdd2-4fc7-42c9-999c-5f9e50010bc7" (UID: "75e2cdd2-4fc7-42c9-999c-5f9e50010bc7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:22:58.310777 master-2 kubenswrapper[4762]: I1014 13:22:58.310725 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75e2cdd2-4fc7-42c9-999c-5f9e50010bc7-config" (OuterVolumeSpecName: "config") pod "75e2cdd2-4fc7-42c9-999c-5f9e50010bc7" (UID: "75e2cdd2-4fc7-42c9-999c-5f9e50010bc7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:22:58.312395 master-2 kubenswrapper[4762]: I1014 13:22:58.312355 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75e2cdd2-4fc7-42c9-999c-5f9e50010bc7-kube-api-access-5xk5d" (OuterVolumeSpecName: "kube-api-access-5xk5d") pod "75e2cdd2-4fc7-42c9-999c-5f9e50010bc7" (UID: "75e2cdd2-4fc7-42c9-999c-5f9e50010bc7"). InnerVolumeSpecName "kube-api-access-5xk5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:22:58.313578 master-2 kubenswrapper[4762]: I1014 13:22:58.313533 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75e2cdd2-4fc7-42c9-999c-5f9e50010bc7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "75e2cdd2-4fc7-42c9-999c-5f9e50010bc7" (UID: "75e2cdd2-4fc7-42c9-999c-5f9e50010bc7"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:22:58.359782 master-2 kubenswrapper[4762]: I1014 13:22:58.359683 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-78c5d9fccd-pr9sv"] Oct 14 13:22:58.360052 master-2 kubenswrapper[4762]: E1014 13:22:58.360002 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="694a78d6-ba6f-411d-a608-ef49cf924ce0" containerName="extract-utilities" Oct 14 13:22:58.360052 master-2 kubenswrapper[4762]: I1014 13:22:58.360024 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="694a78d6-ba6f-411d-a608-ef49cf924ce0" containerName="extract-utilities" Oct 14 13:22:58.360052 master-2 kubenswrapper[4762]: E1014 13:22:58.360035 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="694a78d6-ba6f-411d-a608-ef49cf924ce0" containerName="extract-content" Oct 14 13:22:58.360052 master-2 kubenswrapper[4762]: I1014 13:22:58.360041 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="694a78d6-ba6f-411d-a608-ef49cf924ce0" containerName="extract-content" Oct 14 13:22:58.360052 master-2 kubenswrapper[4762]: E1014 13:22:58.360053 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="058b0ff2-1e70-4446-a498-f94548dfb60f" containerName="controller-manager" Oct 14 13:22:58.360702 master-2 kubenswrapper[4762]: I1014 13:22:58.360061 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="058b0ff2-1e70-4446-a498-f94548dfb60f" containerName="controller-manager" Oct 14 13:22:58.360702 master-2 kubenswrapper[4762]: E1014 13:22:58.360078 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75e2cdd2-4fc7-42c9-999c-5f9e50010bc7" containerName="route-controller-manager" Oct 14 13:22:58.360702 master-2 kubenswrapper[4762]: I1014 13:22:58.360085 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="75e2cdd2-4fc7-42c9-999c-5f9e50010bc7" containerName="route-controller-manager" Oct 14 13:22:58.360702 master-2 kubenswrapper[4762]: E1014 13:22:58.360096 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="694a78d6-ba6f-411d-a608-ef49cf924ce0" containerName="registry-server" Oct 14 13:22:58.360702 master-2 kubenswrapper[4762]: I1014 13:22:58.360101 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="694a78d6-ba6f-411d-a608-ef49cf924ce0" containerName="registry-server" Oct 14 13:22:58.360702 master-2 kubenswrapper[4762]: I1014 13:22:58.360219 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="75e2cdd2-4fc7-42c9-999c-5f9e50010bc7" containerName="route-controller-manager" Oct 14 13:22:58.360702 master-2 kubenswrapper[4762]: I1014 13:22:58.360233 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="694a78d6-ba6f-411d-a608-ef49cf924ce0" containerName="registry-server" Oct 14 13:22:58.360702 master-2 kubenswrapper[4762]: I1014 13:22:58.360244 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="058b0ff2-1e70-4446-a498-f94548dfb60f" containerName="controller-manager" Oct 14 13:22:58.361002 master-2 kubenswrapper[4762]: I1014 13:22:58.360865 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-78c5d9fccd-pr9sv" Oct 14 13:22:58.363623 master-2 kubenswrapper[4762]: I1014 13:22:58.363590 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 14 13:22:58.364102 master-2 kubenswrapper[4762]: I1014 13:22:58.364083 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 14 13:22:58.364340 master-2 kubenswrapper[4762]: I1014 13:22:58.364293 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 14 13:22:58.364463 master-2 kubenswrapper[4762]: I1014 13:22:58.364421 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 14 13:22:58.364530 master-2 kubenswrapper[4762]: I1014 13:22:58.364373 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-2zbrt" Oct 14 13:22:58.364603 master-2 kubenswrapper[4762]: I1014 13:22:58.364441 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 14 13:22:58.374792 master-2 kubenswrapper[4762]: I1014 13:22:58.374739 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-78c5d9fccd-pr9sv"] Oct 14 13:22:58.377706 master-2 kubenswrapper[4762]: I1014 13:22:58.377673 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 14 13:22:58.409427 master-2 kubenswrapper[4762]: I1014 13:22:58.409374 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e72dccde-884b-4370-8c9f-8f5d87e0e7e3-proxy-ca-bundles\") pod \"controller-manager-78c5d9fccd-pr9sv\" (UID: \"e72dccde-884b-4370-8c9f-8f5d87e0e7e3\") " pod="openshift-controller-manager/controller-manager-78c5d9fccd-pr9sv" Oct 14 13:22:58.409427 master-2 kubenswrapper[4762]: I1014 13:22:58.409433 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e72dccde-884b-4370-8c9f-8f5d87e0e7e3-config\") pod \"controller-manager-78c5d9fccd-pr9sv\" (UID: \"e72dccde-884b-4370-8c9f-8f5d87e0e7e3\") " pod="openshift-controller-manager/controller-manager-78c5d9fccd-pr9sv" Oct 14 13:22:58.409653 master-2 kubenswrapper[4762]: I1014 13:22:58.409543 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e72dccde-884b-4370-8c9f-8f5d87e0e7e3-serving-cert\") pod \"controller-manager-78c5d9fccd-pr9sv\" (UID: \"e72dccde-884b-4370-8c9f-8f5d87e0e7e3\") " pod="openshift-controller-manager/controller-manager-78c5d9fccd-pr9sv" Oct 14 13:22:58.409653 master-2 kubenswrapper[4762]: I1014 13:22:58.409579 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr2r5\" (UniqueName: \"kubernetes.io/projected/e72dccde-884b-4370-8c9f-8f5d87e0e7e3-kube-api-access-rr2r5\") pod \"controller-manager-78c5d9fccd-pr9sv\" (UID: \"e72dccde-884b-4370-8c9f-8f5d87e0e7e3\") " pod="openshift-controller-manager/controller-manager-78c5d9fccd-pr9sv" Oct 14 13:22:58.409722 master-2 kubenswrapper[4762]: I1014 
13:22:58.409676 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e72dccde-884b-4370-8c9f-8f5d87e0e7e3-client-ca\") pod \"controller-manager-78c5d9fccd-pr9sv\" (UID: \"e72dccde-884b-4370-8c9f-8f5d87e0e7e3\") " pod="openshift-controller-manager/controller-manager-78c5d9fccd-pr9sv" Oct 14 13:22:58.409755 master-2 kubenswrapper[4762]: I1014 13:22:58.409727 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75e2cdd2-4fc7-42c9-999c-5f9e50010bc7-config\") on node \"master-2\" DevicePath \"\"" Oct 14 13:22:58.409755 master-2 kubenswrapper[4762]: I1014 13:22:58.409738 4762 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/75e2cdd2-4fc7-42c9-999c-5f9e50010bc7-client-ca\") on node \"master-2\" DevicePath \"\"" Oct 14 13:22:58.409755 master-2 kubenswrapper[4762]: I1014 13:22:58.409752 4762 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75e2cdd2-4fc7-42c9-999c-5f9e50010bc7-serving-cert\") on node \"master-2\" DevicePath \"\"" Oct 14 13:22:58.409833 master-2 kubenswrapper[4762]: I1014 13:22:58.409768 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xk5d\" (UniqueName: \"kubernetes.io/projected/75e2cdd2-4fc7-42c9-999c-5f9e50010bc7-kube-api-access-5xk5d\") on node \"master-2\" DevicePath \"\"" Oct 14 13:22:58.511588 master-2 kubenswrapper[4762]: I1014 13:22:58.511404 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e72dccde-884b-4370-8c9f-8f5d87e0e7e3-serving-cert\") pod \"controller-manager-78c5d9fccd-pr9sv\" (UID: \"e72dccde-884b-4370-8c9f-8f5d87e0e7e3\") " pod="openshift-controller-manager/controller-manager-78c5d9fccd-pr9sv" Oct 14 13:22:58.511588 master-2 kubenswrapper[4762]: I1014 13:22:58.511488 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr2r5\" (UniqueName: \"kubernetes.io/projected/e72dccde-884b-4370-8c9f-8f5d87e0e7e3-kube-api-access-rr2r5\") pod \"controller-manager-78c5d9fccd-pr9sv\" (UID: \"e72dccde-884b-4370-8c9f-8f5d87e0e7e3\") " pod="openshift-controller-manager/controller-manager-78c5d9fccd-pr9sv" Oct 14 13:22:58.511588 master-2 kubenswrapper[4762]: I1014 13:22:58.511566 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e72dccde-884b-4370-8c9f-8f5d87e0e7e3-client-ca\") pod \"controller-manager-78c5d9fccd-pr9sv\" (UID: \"e72dccde-884b-4370-8c9f-8f5d87e0e7e3\") " pod="openshift-controller-manager/controller-manager-78c5d9fccd-pr9sv" Oct 14 13:22:58.511977 master-2 kubenswrapper[4762]: I1014 13:22:58.511639 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e72dccde-884b-4370-8c9f-8f5d87e0e7e3-proxy-ca-bundles\") pod \"controller-manager-78c5d9fccd-pr9sv\" (UID: \"e72dccde-884b-4370-8c9f-8f5d87e0e7e3\") " pod="openshift-controller-manager/controller-manager-78c5d9fccd-pr9sv" Oct 14 13:22:58.511977 master-2 kubenswrapper[4762]: I1014 13:22:58.511696 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e72dccde-884b-4370-8c9f-8f5d87e0e7e3-config\") pod \"controller-manager-78c5d9fccd-pr9sv\" (UID: 
\"e72dccde-884b-4370-8c9f-8f5d87e0e7e3\") " pod="openshift-controller-manager/controller-manager-78c5d9fccd-pr9sv" Oct 14 13:22:58.513262 master-2 kubenswrapper[4762]: I1014 13:22:58.513204 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e72dccde-884b-4370-8c9f-8f5d87e0e7e3-client-ca\") pod \"controller-manager-78c5d9fccd-pr9sv\" (UID: \"e72dccde-884b-4370-8c9f-8f5d87e0e7e3\") " pod="openshift-controller-manager/controller-manager-78c5d9fccd-pr9sv" Oct 14 13:22:58.514376 master-2 kubenswrapper[4762]: I1014 13:22:58.514307 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e72dccde-884b-4370-8c9f-8f5d87e0e7e3-proxy-ca-bundles\") pod \"controller-manager-78c5d9fccd-pr9sv\" (UID: \"e72dccde-884b-4370-8c9f-8f5d87e0e7e3\") " pod="openshift-controller-manager/controller-manager-78c5d9fccd-pr9sv" Oct 14 13:22:58.514524 master-2 kubenswrapper[4762]: I1014 13:22:58.514396 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e72dccde-884b-4370-8c9f-8f5d87e0e7e3-config\") pod \"controller-manager-78c5d9fccd-pr9sv\" (UID: \"e72dccde-884b-4370-8c9f-8f5d87e0e7e3\") " pod="openshift-controller-manager/controller-manager-78c5d9fccd-pr9sv" Oct 14 13:22:58.517144 master-2 kubenswrapper[4762]: I1014 13:22:58.517035 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e72dccde-884b-4370-8c9f-8f5d87e0e7e3-serving-cert\") pod \"controller-manager-78c5d9fccd-pr9sv\" (UID: \"e72dccde-884b-4370-8c9f-8f5d87e0e7e3\") " pod="openshift-controller-manager/controller-manager-78c5d9fccd-pr9sv" Oct 14 13:22:58.546451 master-2 kubenswrapper[4762]: I1014 13:22:58.546338 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr2r5\" (UniqueName: \"kubernetes.io/projected/e72dccde-884b-4370-8c9f-8f5d87e0e7e3-kube-api-access-rr2r5\") pod \"controller-manager-78c5d9fccd-pr9sv\" (UID: \"e72dccde-884b-4370-8c9f-8f5d87e0e7e3\") " pod="openshift-controller-manager/controller-manager-78c5d9fccd-pr9sv" Oct 14 13:22:58.684383 master-2 kubenswrapper[4762]: I1014 13:22:58.684299 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-78c5d9fccd-pr9sv" Oct 14 13:22:58.741124 master-2 kubenswrapper[4762]: I1014 13:22:58.741042 4762 generic.go:334] "Generic (PLEG): container finished" podID="75e2cdd2-4fc7-42c9-999c-5f9e50010bc7" containerID="31b56d0e37db7c135741716819f2bf077d732605950f8f5192dd60e5eecaccb9" exitCode=0 Oct 14 13:22:58.741124 master-2 kubenswrapper[4762]: I1014 13:22:58.741133 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76f4d8cd68-t98ml" event={"ID":"75e2cdd2-4fc7-42c9-999c-5f9e50010bc7","Type":"ContainerDied","Data":"31b56d0e37db7c135741716819f2bf077d732605950f8f5192dd60e5eecaccb9"} Oct 14 13:22:58.741431 master-2 kubenswrapper[4762]: I1014 13:22:58.741192 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76f4d8cd68-t98ml" event={"ID":"75e2cdd2-4fc7-42c9-999c-5f9e50010bc7","Type":"ContainerDied","Data":"ce42342ee02c3e802cfa620e22cf0e6bff4cada65294cfaf4906619687daec04"} Oct 14 13:22:58.741431 master-2 kubenswrapper[4762]: I1014 13:22:58.741222 4762 scope.go:117] "RemoveContainer" containerID="31b56d0e37db7c135741716819f2bf077d732605950f8f5192dd60e5eecaccb9" Oct 14 13:22:58.741431 master-2 kubenswrapper[4762]: I1014 13:22:58.741355 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76f4d8cd68-t98ml" Oct 14 13:22:58.760743 master-2 kubenswrapper[4762]: I1014 13:22:58.760675 4762 scope.go:117] "RemoveContainer" containerID="31b56d0e37db7c135741716819f2bf077d732605950f8f5192dd60e5eecaccb9" Oct 14 13:22:58.762397 master-2 kubenswrapper[4762]: E1014 13:22:58.762248 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31b56d0e37db7c135741716819f2bf077d732605950f8f5192dd60e5eecaccb9\": container with ID starting with 31b56d0e37db7c135741716819f2bf077d732605950f8f5192dd60e5eecaccb9 not found: ID does not exist" containerID="31b56d0e37db7c135741716819f2bf077d732605950f8f5192dd60e5eecaccb9" Oct 14 13:22:58.762397 master-2 kubenswrapper[4762]: I1014 13:22:58.762307 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31b56d0e37db7c135741716819f2bf077d732605950f8f5192dd60e5eecaccb9"} err="failed to get container status \"31b56d0e37db7c135741716819f2bf077d732605950f8f5192dd60e5eecaccb9\": rpc error: code = NotFound desc = could not find container \"31b56d0e37db7c135741716819f2bf077d732605950f8f5192dd60e5eecaccb9\": container with ID starting with 31b56d0e37db7c135741716819f2bf077d732605950f8f5192dd60e5eecaccb9 not found: ID does not exist" Oct 14 13:22:58.796203 master-2 kubenswrapper[4762]: I1014 13:22:58.793142 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76f4d8cd68-t98ml"] Oct 14 13:22:58.800854 master-2 kubenswrapper[4762]: I1014 13:22:58.797850 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76f4d8cd68-t98ml"] Oct 14 13:22:59.120935 master-2 kubenswrapper[4762]: I1014 13:22:59.120872 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-78c5d9fccd-pr9sv"] Oct 14 13:22:59.350279 master-2 kubenswrapper[4762]: I1014 13:22:59.350221 4762 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-7968c6c999-b54xp"] Oct 14 13:22:59.351928 master-2 kubenswrapper[4762]: I1014 13:22:59.351902 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7968c6c999-b54xp" Oct 14 13:22:59.354662 master-2 kubenswrapper[4762]: I1014 13:22:59.354632 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-t6l59" Oct 14 13:22:59.354818 master-2 kubenswrapper[4762]: I1014 13:22:59.354633 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 14 13:22:59.354938 master-2 kubenswrapper[4762]: I1014 13:22:59.354710 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 14 13:22:59.355012 master-2 kubenswrapper[4762]: I1014 13:22:59.354983 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 14 13:22:59.355488 master-2 kubenswrapper[4762]: I1014 13:22:59.355456 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 14 13:22:59.355553 master-2 kubenswrapper[4762]: I1014 13:22:59.355411 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 14 13:22:59.366503 master-2 kubenswrapper[4762]: I1014 13:22:59.366459 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7968c6c999-b54xp"] Oct 14 13:22:59.522273 master-2 kubenswrapper[4762]: I1014 13:22:59.522193 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45ac972d-869f-481b-8f5b-2891c3e9bdd3-config\") pod \"route-controller-manager-7968c6c999-b54xp\" (UID: \"45ac972d-869f-481b-8f5b-2891c3e9bdd3\") " pod="openshift-route-controller-manager/route-controller-manager-7968c6c999-b54xp" Oct 14 13:22:59.522273 master-2 kubenswrapper[4762]: I1014 13:22:59.522241 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45ac972d-869f-481b-8f5b-2891c3e9bdd3-serving-cert\") pod \"route-controller-manager-7968c6c999-b54xp\" (UID: \"45ac972d-869f-481b-8f5b-2891c3e9bdd3\") " pod="openshift-route-controller-manager/route-controller-manager-7968c6c999-b54xp" Oct 14 13:22:59.522273 master-2 kubenswrapper[4762]: I1014 13:22:59.522269 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6wt4\" (UniqueName: \"kubernetes.io/projected/45ac972d-869f-481b-8f5b-2891c3e9bdd3-kube-api-access-b6wt4\") pod \"route-controller-manager-7968c6c999-b54xp\" (UID: \"45ac972d-869f-481b-8f5b-2891c3e9bdd3\") " pod="openshift-route-controller-manager/route-controller-manager-7968c6c999-b54xp" Oct 14 13:22:59.522808 master-2 kubenswrapper[4762]: I1014 13:22:59.522377 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/45ac972d-869f-481b-8f5b-2891c3e9bdd3-client-ca\") pod \"route-controller-manager-7968c6c999-b54xp\" (UID: \"45ac972d-869f-481b-8f5b-2891c3e9bdd3\") " 
pod="openshift-route-controller-manager/route-controller-manager-7968c6c999-b54xp" Oct 14 13:22:59.561943 master-2 kubenswrapper[4762]: I1014 13:22:59.561867 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="058b0ff2-1e70-4446-a498-f94548dfb60f" path="/var/lib/kubelet/pods/058b0ff2-1e70-4446-a498-f94548dfb60f/volumes" Oct 14 13:22:59.562987 master-2 kubenswrapper[4762]: I1014 13:22:59.562938 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75e2cdd2-4fc7-42c9-999c-5f9e50010bc7" path="/var/lib/kubelet/pods/75e2cdd2-4fc7-42c9-999c-5f9e50010bc7/volumes" Oct 14 13:22:59.624114 master-2 kubenswrapper[4762]: I1014 13:22:59.623920 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/45ac972d-869f-481b-8f5b-2891c3e9bdd3-client-ca\") pod \"route-controller-manager-7968c6c999-b54xp\" (UID: \"45ac972d-869f-481b-8f5b-2891c3e9bdd3\") " pod="openshift-route-controller-manager/route-controller-manager-7968c6c999-b54xp" Oct 14 13:22:59.624114 master-2 kubenswrapper[4762]: I1014 13:22:59.624101 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45ac972d-869f-481b-8f5b-2891c3e9bdd3-config\") pod \"route-controller-manager-7968c6c999-b54xp\" (UID: \"45ac972d-869f-481b-8f5b-2891c3e9bdd3\") " pod="openshift-route-controller-manager/route-controller-manager-7968c6c999-b54xp" Oct 14 13:22:59.625527 master-2 kubenswrapper[4762]: I1014 13:22:59.625128 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45ac972d-869f-481b-8f5b-2891c3e9bdd3-serving-cert\") pod \"route-controller-manager-7968c6c999-b54xp\" (UID: \"45ac972d-869f-481b-8f5b-2891c3e9bdd3\") " pod="openshift-route-controller-manager/route-controller-manager-7968c6c999-b54xp" Oct 14 13:22:59.625527 master-2 kubenswrapper[4762]: I1014 13:22:59.625211 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6wt4\" (UniqueName: \"kubernetes.io/projected/45ac972d-869f-481b-8f5b-2891c3e9bdd3-kube-api-access-b6wt4\") pod \"route-controller-manager-7968c6c999-b54xp\" (UID: \"45ac972d-869f-481b-8f5b-2891c3e9bdd3\") " pod="openshift-route-controller-manager/route-controller-manager-7968c6c999-b54xp" Oct 14 13:22:59.626011 master-2 kubenswrapper[4762]: I1014 13:22:59.625777 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/45ac972d-869f-481b-8f5b-2891c3e9bdd3-client-ca\") pod \"route-controller-manager-7968c6c999-b54xp\" (UID: \"45ac972d-869f-481b-8f5b-2891c3e9bdd3\") " pod="openshift-route-controller-manager/route-controller-manager-7968c6c999-b54xp" Oct 14 13:22:59.626695 master-2 kubenswrapper[4762]: I1014 13:22:59.626628 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45ac972d-869f-481b-8f5b-2891c3e9bdd3-config\") pod \"route-controller-manager-7968c6c999-b54xp\" (UID: \"45ac972d-869f-481b-8f5b-2891c3e9bdd3\") " pod="openshift-route-controller-manager/route-controller-manager-7968c6c999-b54xp" Oct 14 13:22:59.632214 master-2 kubenswrapper[4762]: I1014 13:22:59.632126 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45ac972d-869f-481b-8f5b-2891c3e9bdd3-serving-cert\") pod 
\"route-controller-manager-7968c6c999-b54xp\" (UID: \"45ac972d-869f-481b-8f5b-2891c3e9bdd3\") " pod="openshift-route-controller-manager/route-controller-manager-7968c6c999-b54xp" Oct 14 13:22:59.653086 master-2 kubenswrapper[4762]: I1014 13:22:59.653024 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6wt4\" (UniqueName: \"kubernetes.io/projected/45ac972d-869f-481b-8f5b-2891c3e9bdd3-kube-api-access-b6wt4\") pod \"route-controller-manager-7968c6c999-b54xp\" (UID: \"45ac972d-869f-481b-8f5b-2891c3e9bdd3\") " pod="openshift-route-controller-manager/route-controller-manager-7968c6c999-b54xp" Oct 14 13:22:59.679571 master-2 kubenswrapper[4762]: I1014 13:22:59.679502 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7968c6c999-b54xp" Oct 14 13:22:59.758215 master-2 kubenswrapper[4762]: I1014 13:22:59.758110 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78c5d9fccd-pr9sv" event={"ID":"e72dccde-884b-4370-8c9f-8f5d87e0e7e3","Type":"ContainerStarted","Data":"def6bed8d74f5fa3239fc7089c0efdfcac92d9c122afe842d4fcabffc3cd5543"} Oct 14 13:22:59.758594 master-2 kubenswrapper[4762]: I1014 13:22:59.758235 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78c5d9fccd-pr9sv" event={"ID":"e72dccde-884b-4370-8c9f-8f5d87e0e7e3","Type":"ContainerStarted","Data":"0e95cb34eb64929c81c46621d70506640822c9ecfe15b8634797291d52e5d5fa"} Oct 14 13:22:59.758686 master-2 kubenswrapper[4762]: I1014 13:22:59.758654 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-78c5d9fccd-pr9sv" Oct 14 13:22:59.766567 master-2 kubenswrapper[4762]: I1014 13:22:59.764768 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-78c5d9fccd-pr9sv" Oct 14 13:22:59.802356 master-2 kubenswrapper[4762]: I1014 13:22:59.802214 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-78c5d9fccd-pr9sv" podStartSLOduration=3.802184361 podStartE2EDuration="3.802184361s" podCreationTimestamp="2025-10-14 13:22:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:22:59.795647584 +0000 UTC m=+1009.039806783" watchObservedRunningTime="2025-10-14 13:22:59.802184361 +0000 UTC m=+1009.046343520" Oct 14 13:23:00.134105 master-2 kubenswrapper[4762]: I1014 13:23:00.134024 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7968c6c999-b54xp"] Oct 14 13:23:00.139500 master-2 kubenswrapper[4762]: W1014 13:23:00.139450 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45ac972d_869f_481b_8f5b_2891c3e9bdd3.slice/crio-e7cfa5448e7c0e7e4d8329ca3da5d363e4555266e4e214613da53bc0085d16a1 WatchSource:0}: Error finding container e7cfa5448e7c0e7e4d8329ca3da5d363e4555266e4e214613da53bc0085d16a1: Status 404 returned error can't find the container with id e7cfa5448e7c0e7e4d8329ca3da5d363e4555266e4e214613da53bc0085d16a1 Oct 14 13:23:00.768551 master-2 kubenswrapper[4762]: I1014 13:23:00.768470 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-7968c6c999-b54xp" event={"ID":"45ac972d-869f-481b-8f5b-2891c3e9bdd3","Type":"ContainerStarted","Data":"d054360adbea63e7ecb3fe9805619bee7d8bb9503883e8516d6e010e2fc8f419"} Oct 14 13:23:00.768551 master-2 kubenswrapper[4762]: I1014 13:23:00.768555 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7968c6c999-b54xp" event={"ID":"45ac972d-869f-481b-8f5b-2891c3e9bdd3","Type":"ContainerStarted","Data":"e7cfa5448e7c0e7e4d8329ca3da5d363e4555266e4e214613da53bc0085d16a1"} Oct 14 13:23:00.818293 master-2 kubenswrapper[4762]: I1014 13:23:00.818118 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7968c6c999-b54xp" podStartSLOduration=3.818090859 podStartE2EDuration="3.818090859s" podCreationTimestamp="2025-10-14 13:22:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:23:00.817306624 +0000 UTC m=+1010.061465833" watchObservedRunningTime="2025-10-14 13:23:00.818090859 +0000 UTC m=+1010.062250038" Oct 14 13:23:01.776438 master-2 kubenswrapper[4762]: I1014 13:23:01.776382 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7968c6c999-b54xp" Oct 14 13:23:01.787770 master-2 kubenswrapper[4762]: I1014 13:23:01.787692 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7968c6c999-b54xp" Oct 14 13:23:05.697410 master-2 kubenswrapper[4762]: I1014 13:23:05.697335 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:23:05.737637 master-2 kubenswrapper[4762]: I1014 13:23:05.737543 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:23:05.841808 master-2 kubenswrapper[4762]: I1014 13:23:05.841743 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:23:06.918180 master-2 kubenswrapper[4762]: I1014 13:23:06.918049 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-55df5b4c9d-k6sz4" podUID="fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e" containerName="oauth-openshift" containerID="cri-o://63c5eff45b98f1926f070d3bd3c511c51402b0bf8e00b7fa3d2ed2cd7a12fede" gracePeriod=15 Oct 14 13:23:07.322746 master-2 kubenswrapper[4762]: I1014 13:23:07.322684 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-55df5b4c9d-k6sz4" Oct 14 13:23:07.340422 master-2 kubenswrapper[4762]: I1014 13:23:07.340328 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-v4-0-config-system-serving-cert\") pod \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\" (UID: \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\") " Oct 14 13:23:07.340422 master-2 kubenswrapper[4762]: I1014 13:23:07.340403 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-v4-0-config-user-template-login\") pod \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\" (UID: \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\") " Oct 14 13:23:07.340738 master-2 kubenswrapper[4762]: I1014 13:23:07.340464 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-v4-0-config-system-session\") pod \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\" (UID: \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\") " Oct 14 13:23:07.340738 master-2 kubenswrapper[4762]: I1014 13:23:07.340501 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-v4-0-config-system-service-ca\") pod \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\" (UID: \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\") " Oct 14 13:23:07.340738 master-2 kubenswrapper[4762]: I1014 13:23:07.340531 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-audit-dir\") pod \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\" (UID: \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\") " Oct 14 13:23:07.340738 master-2 kubenswrapper[4762]: I1014 13:23:07.340563 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-audit-policies\") pod \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\" (UID: \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\") " Oct 14 13:23:07.340738 master-2 kubenswrapper[4762]: I1014 13:23:07.340595 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-v4-0-config-system-ocp-branding-template\") pod \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\" (UID: \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\") " Oct 14 13:23:07.340738 master-2 kubenswrapper[4762]: I1014 13:23:07.340660 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-v4-0-config-system-cliconfig\") pod \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\" (UID: \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\") " Oct 14 13:23:07.340738 master-2 kubenswrapper[4762]: I1014 13:23:07.340693 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-v4-0-config-system-trusted-ca-bundle\") pod 
\"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\" (UID: \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\") " Oct 14 13:23:07.340738 master-2 kubenswrapper[4762]: I1014 13:23:07.340702 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e" (UID: "fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:23:07.341136 master-2 kubenswrapper[4762]: I1014 13:23:07.340720 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-v4-0-config-user-template-error\") pod \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\" (UID: \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\") " Oct 14 13:23:07.341136 master-2 kubenswrapper[4762]: I1014 13:23:07.340849 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-v4-0-config-user-template-provider-selection\") pod \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\" (UID: \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\") " Oct 14 13:23:07.341136 master-2 kubenswrapper[4762]: I1014 13:23:07.340917 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-v4-0-config-system-router-certs\") pod \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\" (UID: \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\") " Oct 14 13:23:07.341136 master-2 kubenswrapper[4762]: I1014 13:23:07.340958 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qqq6\" (UniqueName: \"kubernetes.io/projected/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-kube-api-access-6qqq6\") pod \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\" (UID: \"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e\") " Oct 14 13:23:07.341136 master-2 kubenswrapper[4762]: I1014 13:23:07.341090 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e" (UID: "fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:23:07.341440 master-2 kubenswrapper[4762]: I1014 13:23:07.341331 4762 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-audit-dir\") on node \"master-2\" DevicePath \"\"" Oct 14 13:23:07.341440 master-2 kubenswrapper[4762]: I1014 13:23:07.341355 4762 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-audit-policies\") on node \"master-2\" DevicePath \"\"" Oct 14 13:23:07.341542 master-2 kubenswrapper[4762]: I1014 13:23:07.341485 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e" (UID: "fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e"). 
InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:23:07.341600 master-2 kubenswrapper[4762]: I1014 13:23:07.341532 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e" (UID: "fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:23:07.341650 master-2 kubenswrapper[4762]: I1014 13:23:07.341589 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e" (UID: "fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:23:07.344774 master-2 kubenswrapper[4762]: I1014 13:23:07.344725 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e" (UID: "fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:23:07.347715 master-2 kubenswrapper[4762]: I1014 13:23:07.346998 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e" (UID: "fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:23:07.347715 master-2 kubenswrapper[4762]: I1014 13:23:07.347084 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-kube-api-access-6qqq6" (OuterVolumeSpecName: "kube-api-access-6qqq6") pod "fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e" (UID: "fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e"). InnerVolumeSpecName "kube-api-access-6qqq6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:23:07.349127 master-2 kubenswrapper[4762]: I1014 13:23:07.349055 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e" (UID: "fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:23:07.349188 master-2 kubenswrapper[4762]: I1014 13:23:07.349126 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e" (UID: "fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e"). 
InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:23:07.349645 master-2 kubenswrapper[4762]: I1014 13:23:07.349566 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e" (UID: "fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:23:07.349876 master-2 kubenswrapper[4762]: I1014 13:23:07.349834 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e" (UID: "fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:23:07.350469 master-2 kubenswrapper[4762]: I1014 13:23:07.350436 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e" (UID: "fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:23:07.442478 master-2 kubenswrapper[4762]: I1014 13:23:07.442418 4762 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-v4-0-config-system-cliconfig\") on node \"master-2\" DevicePath \"\"" Oct 14 13:23:07.442478 master-2 kubenswrapper[4762]: I1014 13:23:07.442454 4762 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-v4-0-config-system-trusted-ca-bundle\") on node \"master-2\" DevicePath \"\"" Oct 14 13:23:07.442478 master-2 kubenswrapper[4762]: I1014 13:23:07.442465 4762 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-v4-0-config-user-template-error\") on node \"master-2\" DevicePath \"\"" Oct 14 13:23:07.442478 master-2 kubenswrapper[4762]: I1014 13:23:07.442475 4762 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-v4-0-config-user-template-provider-selection\") on node \"master-2\" DevicePath \"\"" Oct 14 13:23:07.442478 master-2 kubenswrapper[4762]: I1014 13:23:07.442485 4762 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-v4-0-config-system-router-certs\") on node \"master-2\" DevicePath \"\"" Oct 14 13:23:07.442478 master-2 kubenswrapper[4762]: I1014 13:23:07.442496 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qqq6\" (UniqueName: \"kubernetes.io/projected/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-kube-api-access-6qqq6\") on node \"master-2\" DevicePath \"\"" Oct 14 
13:23:07.442841 master-2 kubenswrapper[4762]: I1014 13:23:07.442506 4762 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-v4-0-config-system-serving-cert\") on node \"master-2\" DevicePath \"\"" Oct 14 13:23:07.442841 master-2 kubenswrapper[4762]: I1014 13:23:07.442516 4762 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-v4-0-config-user-template-login\") on node \"master-2\" DevicePath \"\"" Oct 14 13:23:07.442841 master-2 kubenswrapper[4762]: I1014 13:23:07.442525 4762 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-v4-0-config-system-session\") on node \"master-2\" DevicePath \"\"" Oct 14 13:23:07.442841 master-2 kubenswrapper[4762]: I1014 13:23:07.442535 4762 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-v4-0-config-system-service-ca\") on node \"master-2\" DevicePath \"\"" Oct 14 13:23:07.442841 master-2 kubenswrapper[4762]: I1014 13:23:07.442549 4762 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e-v4-0-config-system-ocp-branding-template\") on node \"master-2\" DevicePath \"\"" Oct 14 13:23:07.822575 master-2 kubenswrapper[4762]: I1014 13:23:07.822499 4762 generic.go:334] "Generic (PLEG): container finished" podID="fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e" containerID="63c5eff45b98f1926f070d3bd3c511c51402b0bf8e00b7fa3d2ed2cd7a12fede" exitCode=0 Oct 14 13:23:07.822575 master-2 kubenswrapper[4762]: I1014 13:23:07.822550 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-55df5b4c9d-k6sz4" event={"ID":"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e","Type":"ContainerDied","Data":"63c5eff45b98f1926f070d3bd3c511c51402b0bf8e00b7fa3d2ed2cd7a12fede"} Oct 14 13:23:07.822575 master-2 kubenswrapper[4762]: I1014 13:23:07.822578 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-55df5b4c9d-k6sz4" event={"ID":"fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e","Type":"ContainerDied","Data":"406aef131c9b7b329fc3a9d41d454194f2f968d298f15a5fceaa019e9656f036"} Oct 14 13:23:07.822886 master-2 kubenswrapper[4762]: I1014 13:23:07.822596 4762 scope.go:117] "RemoveContainer" containerID="63c5eff45b98f1926f070d3bd3c511c51402b0bf8e00b7fa3d2ed2cd7a12fede" Oct 14 13:23:07.822886 master-2 kubenswrapper[4762]: I1014 13:23:07.822799 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-55df5b4c9d-k6sz4" Oct 14 13:23:07.852331 master-2 kubenswrapper[4762]: I1014 13:23:07.852195 4762 scope.go:117] "RemoveContainer" containerID="63c5eff45b98f1926f070d3bd3c511c51402b0bf8e00b7fa3d2ed2cd7a12fede" Oct 14 13:23:07.852927 master-2 kubenswrapper[4762]: E1014 13:23:07.852875 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63c5eff45b98f1926f070d3bd3c511c51402b0bf8e00b7fa3d2ed2cd7a12fede\": container with ID starting with 63c5eff45b98f1926f070d3bd3c511c51402b0bf8e00b7fa3d2ed2cd7a12fede not found: ID does not exist" containerID="63c5eff45b98f1926f070d3bd3c511c51402b0bf8e00b7fa3d2ed2cd7a12fede" Oct 14 13:23:07.853014 master-2 kubenswrapper[4762]: I1014 13:23:07.852928 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63c5eff45b98f1926f070d3bd3c511c51402b0bf8e00b7fa3d2ed2cd7a12fede"} err="failed to get container status \"63c5eff45b98f1926f070d3bd3c511c51402b0bf8e00b7fa3d2ed2cd7a12fede\": rpc error: code = NotFound desc = could not find container \"63c5eff45b98f1926f070d3bd3c511c51402b0bf8e00b7fa3d2ed2cd7a12fede\": container with ID starting with 63c5eff45b98f1926f070d3bd3c511c51402b0bf8e00b7fa3d2ed2cd7a12fede not found: ID does not exist" Oct 14 13:23:07.864495 master-2 kubenswrapper[4762]: I1014 13:23:07.863529 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-55df5b4c9d-k6sz4"] Oct 14 13:23:07.867821 master-2 kubenswrapper[4762]: I1014 13:23:07.867625 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-55df5b4c9d-k6sz4"] Oct 14 13:23:09.564006 master-2 kubenswrapper[4762]: I1014 13:23:09.563937 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e" path="/var/lib/kubelet/pods/fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e/volumes" Oct 14 13:23:16.398580 master-2 kubenswrapper[4762]: I1014 13:23:16.398465 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-65687bc9c8-twgxt"] Oct 14 13:23:16.399412 master-2 kubenswrapper[4762]: E1014 13:23:16.398783 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e" containerName="oauth-openshift" Oct 14 13:23:16.399412 master-2 kubenswrapper[4762]: I1014 13:23:16.398800 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e" containerName="oauth-openshift" Oct 14 13:23:16.399412 master-2 kubenswrapper[4762]: I1014 13:23:16.398944 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdfcfc2b-8cf3-41c7-a3ca-7482998d0e0e" containerName="oauth-openshift" Oct 14 13:23:16.399614 master-2 kubenswrapper[4762]: I1014 13:23:16.399548 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-65687bc9c8-twgxt" Oct 14 13:23:16.405081 master-2 kubenswrapper[4762]: I1014 13:23:16.405042 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 14 13:23:16.405081 master-2 kubenswrapper[4762]: I1014 13:23:16.405062 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 14 13:23:16.405975 master-2 kubenswrapper[4762]: I1014 13:23:16.405317 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-5vqgl" Oct 14 13:23:16.405975 master-2 kubenswrapper[4762]: I1014 13:23:16.405346 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 14 13:23:16.405975 master-2 kubenswrapper[4762]: I1014 13:23:16.405725 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 14 13:23:16.406141 master-2 kubenswrapper[4762]: I1014 13:23:16.406051 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 14 13:23:16.406252 master-2 kubenswrapper[4762]: I1014 13:23:16.406203 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 14 13:23:16.406582 master-2 kubenswrapper[4762]: I1014 13:23:16.406513 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 14 13:23:16.406647 master-2 kubenswrapper[4762]: I1014 13:23:16.406611 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 14 13:23:16.406785 master-2 kubenswrapper[4762]: I1014 13:23:16.406657 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 14 13:23:16.406942 master-2 kubenswrapper[4762]: I1014 13:23:16.406896 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 14 13:23:16.409357 master-2 kubenswrapper[4762]: I1014 13:23:16.407381 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 14 13:23:16.416486 master-2 kubenswrapper[4762]: I1014 13:23:16.416433 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 14 13:23:16.429461 master-2 kubenswrapper[4762]: I1014 13:23:16.429402 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 14 13:23:16.432936 master-2 kubenswrapper[4762]: I1014 13:23:16.432890 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-65687bc9c8-twgxt"] Oct 14 13:23:16.466501 master-2 kubenswrapper[4762]: I1014 13:23:16.466446 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6a9723af-f63a-45cd-9456-b4d67c4d778a-v4-0-config-user-template-error\") pod \"oauth-openshift-65687bc9c8-twgxt\" (UID: \"6a9723af-f63a-45cd-9456-b4d67c4d778a\") " 
pod="openshift-authentication/oauth-openshift-65687bc9c8-twgxt" Oct 14 13:23:16.466501 master-2 kubenswrapper[4762]: I1014 13:23:16.466504 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6a9723af-f63a-45cd-9456-b4d67c4d778a-v4-0-config-user-template-login\") pod \"oauth-openshift-65687bc9c8-twgxt\" (UID: \"6a9723af-f63a-45cd-9456-b4d67c4d778a\") " pod="openshift-authentication/oauth-openshift-65687bc9c8-twgxt" Oct 14 13:23:16.466805 master-2 kubenswrapper[4762]: I1014 13:23:16.466564 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6a9723af-f63a-45cd-9456-b4d67c4d778a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-65687bc9c8-twgxt\" (UID: \"6a9723af-f63a-45cd-9456-b4d67c4d778a\") " pod="openshift-authentication/oauth-openshift-65687bc9c8-twgxt" Oct 14 13:23:16.466805 master-2 kubenswrapper[4762]: I1014 13:23:16.466592 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a9723af-f63a-45cd-9456-b4d67c4d778a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-65687bc9c8-twgxt\" (UID: \"6a9723af-f63a-45cd-9456-b4d67c4d778a\") " pod="openshift-authentication/oauth-openshift-65687bc9c8-twgxt" Oct 14 13:23:16.466805 master-2 kubenswrapper[4762]: I1014 13:23:16.466646 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6a9723af-f63a-45cd-9456-b4d67c4d778a-audit-dir\") pod \"oauth-openshift-65687bc9c8-twgxt\" (UID: \"6a9723af-f63a-45cd-9456-b4d67c4d778a\") " pod="openshift-authentication/oauth-openshift-65687bc9c8-twgxt" Oct 14 13:23:16.466805 master-2 kubenswrapper[4762]: I1014 13:23:16.466751 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6a9723af-f63a-45cd-9456-b4d67c4d778a-v4-0-config-system-router-certs\") pod \"oauth-openshift-65687bc9c8-twgxt\" (UID: \"6a9723af-f63a-45cd-9456-b4d67c4d778a\") " pod="openshift-authentication/oauth-openshift-65687bc9c8-twgxt" Oct 14 13:23:16.467055 master-2 kubenswrapper[4762]: I1014 13:23:16.466809 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6a9723af-f63a-45cd-9456-b4d67c4d778a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-65687bc9c8-twgxt\" (UID: \"6a9723af-f63a-45cd-9456-b4d67c4d778a\") " pod="openshift-authentication/oauth-openshift-65687bc9c8-twgxt" Oct 14 13:23:16.467055 master-2 kubenswrapper[4762]: I1014 13:23:16.466832 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d27bc\" (UniqueName: \"kubernetes.io/projected/6a9723af-f63a-45cd-9456-b4d67c4d778a-kube-api-access-d27bc\") pod \"oauth-openshift-65687bc9c8-twgxt\" (UID: \"6a9723af-f63a-45cd-9456-b4d67c4d778a\") " pod="openshift-authentication/oauth-openshift-65687bc9c8-twgxt" Oct 14 13:23:16.467055 master-2 kubenswrapper[4762]: I1014 13:23:16.466876 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" 
(UniqueName: \"kubernetes.io/configmap/6a9723af-f63a-45cd-9456-b4d67c4d778a-audit-policies\") pod \"oauth-openshift-65687bc9c8-twgxt\" (UID: \"6a9723af-f63a-45cd-9456-b4d67c4d778a\") " pod="openshift-authentication/oauth-openshift-65687bc9c8-twgxt" Oct 14 13:23:16.467055 master-2 kubenswrapper[4762]: I1014 13:23:16.466903 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6a9723af-f63a-45cd-9456-b4d67c4d778a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-65687bc9c8-twgxt\" (UID: \"6a9723af-f63a-45cd-9456-b4d67c4d778a\") " pod="openshift-authentication/oauth-openshift-65687bc9c8-twgxt" Oct 14 13:23:16.467055 master-2 kubenswrapper[4762]: I1014 13:23:16.466957 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6a9723af-f63a-45cd-9456-b4d67c4d778a-v4-0-config-system-service-ca\") pod \"oauth-openshift-65687bc9c8-twgxt\" (UID: \"6a9723af-f63a-45cd-9456-b4d67c4d778a\") " pod="openshift-authentication/oauth-openshift-65687bc9c8-twgxt" Oct 14 13:23:16.467055 master-2 kubenswrapper[4762]: I1014 13:23:16.467003 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6a9723af-f63a-45cd-9456-b4d67c4d778a-v4-0-config-system-session\") pod \"oauth-openshift-65687bc9c8-twgxt\" (UID: \"6a9723af-f63a-45cd-9456-b4d67c4d778a\") " pod="openshift-authentication/oauth-openshift-65687bc9c8-twgxt" Oct 14 13:23:16.467055 master-2 kubenswrapper[4762]: I1014 13:23:16.467032 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6a9723af-f63a-45cd-9456-b4d67c4d778a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-65687bc9c8-twgxt\" (UID: \"6a9723af-f63a-45cd-9456-b4d67c4d778a\") " pod="openshift-authentication/oauth-openshift-65687bc9c8-twgxt" Oct 14 13:23:16.568598 master-2 kubenswrapper[4762]: I1014 13:23:16.568538 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6a9723af-f63a-45cd-9456-b4d67c4d778a-audit-policies\") pod \"oauth-openshift-65687bc9c8-twgxt\" (UID: \"6a9723af-f63a-45cd-9456-b4d67c4d778a\") " pod="openshift-authentication/oauth-openshift-65687bc9c8-twgxt" Oct 14 13:23:16.568598 master-2 kubenswrapper[4762]: I1014 13:23:16.568597 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6a9723af-f63a-45cd-9456-b4d67c4d778a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-65687bc9c8-twgxt\" (UID: \"6a9723af-f63a-45cd-9456-b4d67c4d778a\") " pod="openshift-authentication/oauth-openshift-65687bc9c8-twgxt" Oct 14 13:23:16.569767 master-2 kubenswrapper[4762]: I1014 13:23:16.568625 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6a9723af-f63a-45cd-9456-b4d67c4d778a-v4-0-config-system-service-ca\") pod \"oauth-openshift-65687bc9c8-twgxt\" (UID: \"6a9723af-f63a-45cd-9456-b4d67c4d778a\") " pod="openshift-authentication/oauth-openshift-65687bc9c8-twgxt" Oct 14 13:23:16.569767 master-2 kubenswrapper[4762]: I1014 
13:23:16.568652 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6a9723af-f63a-45cd-9456-b4d67c4d778a-v4-0-config-system-session\") pod \"oauth-openshift-65687bc9c8-twgxt\" (UID: \"6a9723af-f63a-45cd-9456-b4d67c4d778a\") " pod="openshift-authentication/oauth-openshift-65687bc9c8-twgxt" Oct 14 13:23:16.569767 master-2 kubenswrapper[4762]: I1014 13:23:16.568679 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6a9723af-f63a-45cd-9456-b4d67c4d778a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-65687bc9c8-twgxt\" (UID: \"6a9723af-f63a-45cd-9456-b4d67c4d778a\") " pod="openshift-authentication/oauth-openshift-65687bc9c8-twgxt" Oct 14 13:23:16.569767 master-2 kubenswrapper[4762]: I1014 13:23:16.568719 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6a9723af-f63a-45cd-9456-b4d67c4d778a-v4-0-config-user-template-error\") pod \"oauth-openshift-65687bc9c8-twgxt\" (UID: \"6a9723af-f63a-45cd-9456-b4d67c4d778a\") " pod="openshift-authentication/oauth-openshift-65687bc9c8-twgxt" Oct 14 13:23:16.569767 master-2 kubenswrapper[4762]: I1014 13:23:16.568746 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6a9723af-f63a-45cd-9456-b4d67c4d778a-v4-0-config-user-template-login\") pod \"oauth-openshift-65687bc9c8-twgxt\" (UID: \"6a9723af-f63a-45cd-9456-b4d67c4d778a\") " pod="openshift-authentication/oauth-openshift-65687bc9c8-twgxt" Oct 14 13:23:16.569767 master-2 kubenswrapper[4762]: I1014 13:23:16.568783 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6a9723af-f63a-45cd-9456-b4d67c4d778a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-65687bc9c8-twgxt\" (UID: \"6a9723af-f63a-45cd-9456-b4d67c4d778a\") " pod="openshift-authentication/oauth-openshift-65687bc9c8-twgxt" Oct 14 13:23:16.569767 master-2 kubenswrapper[4762]: I1014 13:23:16.568831 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a9723af-f63a-45cd-9456-b4d67c4d778a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-65687bc9c8-twgxt\" (UID: \"6a9723af-f63a-45cd-9456-b4d67c4d778a\") " pod="openshift-authentication/oauth-openshift-65687bc9c8-twgxt" Oct 14 13:23:16.569767 master-2 kubenswrapper[4762]: I1014 13:23:16.568860 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6a9723af-f63a-45cd-9456-b4d67c4d778a-audit-dir\") pod \"oauth-openshift-65687bc9c8-twgxt\" (UID: \"6a9723af-f63a-45cd-9456-b4d67c4d778a\") " pod="openshift-authentication/oauth-openshift-65687bc9c8-twgxt" Oct 14 13:23:16.569767 master-2 kubenswrapper[4762]: I1014 13:23:16.568957 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6a9723af-f63a-45cd-9456-b4d67c4d778a-v4-0-config-system-router-certs\") pod \"oauth-openshift-65687bc9c8-twgxt\" (UID: \"6a9723af-f63a-45cd-9456-b4d67c4d778a\") " 
pod="openshift-authentication/oauth-openshift-65687bc9c8-twgxt" Oct 14 13:23:16.569767 master-2 kubenswrapper[4762]: I1014 13:23:16.569015 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d27bc\" (UniqueName: \"kubernetes.io/projected/6a9723af-f63a-45cd-9456-b4d67c4d778a-kube-api-access-d27bc\") pod \"oauth-openshift-65687bc9c8-twgxt\" (UID: \"6a9723af-f63a-45cd-9456-b4d67c4d778a\") " pod="openshift-authentication/oauth-openshift-65687bc9c8-twgxt" Oct 14 13:23:16.569767 master-2 kubenswrapper[4762]: I1014 13:23:16.569032 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6a9723af-f63a-45cd-9456-b4d67c4d778a-audit-dir\") pod \"oauth-openshift-65687bc9c8-twgxt\" (UID: \"6a9723af-f63a-45cd-9456-b4d67c4d778a\") " pod="openshift-authentication/oauth-openshift-65687bc9c8-twgxt" Oct 14 13:23:16.569767 master-2 kubenswrapper[4762]: I1014 13:23:16.569040 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6a9723af-f63a-45cd-9456-b4d67c4d778a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-65687bc9c8-twgxt\" (UID: \"6a9723af-f63a-45cd-9456-b4d67c4d778a\") " pod="openshift-authentication/oauth-openshift-65687bc9c8-twgxt" Oct 14 13:23:16.572763 master-2 kubenswrapper[4762]: I1014 13:23:16.569965 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6a9723af-f63a-45cd-9456-b4d67c4d778a-v4-0-config-system-service-ca\") pod \"oauth-openshift-65687bc9c8-twgxt\" (UID: \"6a9723af-f63a-45cd-9456-b4d67c4d778a\") " pod="openshift-authentication/oauth-openshift-65687bc9c8-twgxt" Oct 14 13:23:16.572763 master-2 kubenswrapper[4762]: I1014 13:23:16.570634 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6a9723af-f63a-45cd-9456-b4d67c4d778a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-65687bc9c8-twgxt\" (UID: \"6a9723af-f63a-45cd-9456-b4d67c4d778a\") " pod="openshift-authentication/oauth-openshift-65687bc9c8-twgxt" Oct 14 13:23:16.572763 master-2 kubenswrapper[4762]: I1014 13:23:16.572057 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6a9723af-f63a-45cd-9456-b4d67c4d778a-audit-policies\") pod \"oauth-openshift-65687bc9c8-twgxt\" (UID: \"6a9723af-f63a-45cd-9456-b4d67c4d778a\") " pod="openshift-authentication/oauth-openshift-65687bc9c8-twgxt" Oct 14 13:23:16.572763 master-2 kubenswrapper[4762]: I1014 13:23:16.572073 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a9723af-f63a-45cd-9456-b4d67c4d778a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-65687bc9c8-twgxt\" (UID: \"6a9723af-f63a-45cd-9456-b4d67c4d778a\") " pod="openshift-authentication/oauth-openshift-65687bc9c8-twgxt" Oct 14 13:23:16.572763 master-2 kubenswrapper[4762]: I1014 13:23:16.572318 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6a9723af-f63a-45cd-9456-b4d67c4d778a-v4-0-config-user-template-login\") pod \"oauth-openshift-65687bc9c8-twgxt\" (UID: \"6a9723af-f63a-45cd-9456-b4d67c4d778a\") " 
pod="openshift-authentication/oauth-openshift-65687bc9c8-twgxt" Oct 14 13:23:16.572763 master-2 kubenswrapper[4762]: I1014 13:23:16.572411 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6a9723af-f63a-45cd-9456-b4d67c4d778a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-65687bc9c8-twgxt\" (UID: \"6a9723af-f63a-45cd-9456-b4d67c4d778a\") " pod="openshift-authentication/oauth-openshift-65687bc9c8-twgxt" Oct 14 13:23:16.573733 master-2 kubenswrapper[4762]: I1014 13:23:16.573355 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6a9723af-f63a-45cd-9456-b4d67c4d778a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-65687bc9c8-twgxt\" (UID: \"6a9723af-f63a-45cd-9456-b4d67c4d778a\") " pod="openshift-authentication/oauth-openshift-65687bc9c8-twgxt" Oct 14 13:23:16.574331 master-2 kubenswrapper[4762]: I1014 13:23:16.574306 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6a9723af-f63a-45cd-9456-b4d67c4d778a-v4-0-config-system-router-certs\") pod \"oauth-openshift-65687bc9c8-twgxt\" (UID: \"6a9723af-f63a-45cd-9456-b4d67c4d778a\") " pod="openshift-authentication/oauth-openshift-65687bc9c8-twgxt" Oct 14 13:23:16.574997 master-2 kubenswrapper[4762]: I1014 13:23:16.574892 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6a9723af-f63a-45cd-9456-b4d67c4d778a-v4-0-config-system-session\") pod \"oauth-openshift-65687bc9c8-twgxt\" (UID: \"6a9723af-f63a-45cd-9456-b4d67c4d778a\") " pod="openshift-authentication/oauth-openshift-65687bc9c8-twgxt" Oct 14 13:23:16.576299 master-2 kubenswrapper[4762]: I1014 13:23:16.576265 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6a9723af-f63a-45cd-9456-b4d67c4d778a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-65687bc9c8-twgxt\" (UID: \"6a9723af-f63a-45cd-9456-b4d67c4d778a\") " pod="openshift-authentication/oauth-openshift-65687bc9c8-twgxt" Oct 14 13:23:16.576787 master-2 kubenswrapper[4762]: I1014 13:23:16.576726 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6a9723af-f63a-45cd-9456-b4d67c4d778a-v4-0-config-user-template-error\") pod \"oauth-openshift-65687bc9c8-twgxt\" (UID: \"6a9723af-f63a-45cd-9456-b4d67c4d778a\") " pod="openshift-authentication/oauth-openshift-65687bc9c8-twgxt" Oct 14 13:23:16.590369 master-2 kubenswrapper[4762]: I1014 13:23:16.590348 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d27bc\" (UniqueName: \"kubernetes.io/projected/6a9723af-f63a-45cd-9456-b4d67c4d778a-kube-api-access-d27bc\") pod \"oauth-openshift-65687bc9c8-twgxt\" (UID: \"6a9723af-f63a-45cd-9456-b4d67c4d778a\") " pod="openshift-authentication/oauth-openshift-65687bc9c8-twgxt" Oct 14 13:23:16.722274 master-2 kubenswrapper[4762]: I1014 13:23:16.722217 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-65687bc9c8-twgxt" Oct 14 13:23:17.161683 master-2 kubenswrapper[4762]: I1014 13:23:17.161618 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-65687bc9c8-twgxt"] Oct 14 13:23:17.168657 master-2 kubenswrapper[4762]: W1014 13:23:17.168583 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a9723af_f63a_45cd_9456_b4d67c4d778a.slice/crio-78e25f8436c7f970c226877ffe83c8ebea6659327cf95da040f46ac48e78ea7c WatchSource:0}: Error finding container 78e25f8436c7f970c226877ffe83c8ebea6659327cf95da040f46ac48e78ea7c: Status 404 returned error can't find the container with id 78e25f8436c7f970c226877ffe83c8ebea6659327cf95da040f46ac48e78ea7c Oct 14 13:23:17.890226 master-2 kubenswrapper[4762]: I1014 13:23:17.889905 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-65687bc9c8-twgxt" event={"ID":"6a9723af-f63a-45cd-9456-b4d67c4d778a","Type":"ContainerStarted","Data":"a0361b5cf78f6775ca15f697346788e1e7d6558cad74f0bf768bd30981607fe2"} Oct 14 13:23:17.890226 master-2 kubenswrapper[4762]: I1014 13:23:17.889977 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-65687bc9c8-twgxt" event={"ID":"6a9723af-f63a-45cd-9456-b4d67c4d778a","Type":"ContainerStarted","Data":"78e25f8436c7f970c226877ffe83c8ebea6659327cf95da040f46ac48e78ea7c"} Oct 14 13:23:17.891085 master-2 kubenswrapper[4762]: I1014 13:23:17.890732 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-65687bc9c8-twgxt" Oct 14 13:23:17.896663 master-2 kubenswrapper[4762]: I1014 13:23:17.896600 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-65687bc9c8-twgxt" Oct 14 13:23:17.945053 master-2 kubenswrapper[4762]: I1014 13:23:17.944955 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-65687bc9c8-twgxt" podStartSLOduration=36.944932469 podStartE2EDuration="36.944932469s" podCreationTimestamp="2025-10-14 13:22:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:23:17.936535392 +0000 UTC m=+1027.180694591" watchObservedRunningTime="2025-10-14 13:23:17.944932469 +0000 UTC m=+1027.189091628" Oct 14 13:23:32.615415 master-2 kubenswrapper[4762]: I1014 13:23:32.615254 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-554dc689f9-c5k9h"] Oct 14 13:23:56.243576 master-2 kubenswrapper[4762]: I1014 13:23:56.242586 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Oct 14 13:23:56.243576 master-2 kubenswrapper[4762]: I1014 13:23:56.243192 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="f3a1ec2c-6bca-4616-8c89-060a6d1593e6" containerName="alertmanager" containerID="cri-o://6c278f0bb3a105138295da1accc27fd02f8142c5b30433066f034a0761598653" gracePeriod=120 Oct 14 13:23:56.243576 master-2 kubenswrapper[4762]: I1014 13:23:56.243405 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="f3a1ec2c-6bca-4616-8c89-060a6d1593e6" 
containerName="prom-label-proxy" containerID="cri-o://61ef651b05ea840eab2c068d6d58ec8042dcf01fd522926c14c55bb613bf72ae" gracePeriod=120 Oct 14 13:23:56.243576 master-2 kubenswrapper[4762]: I1014 13:23:56.243485 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="f3a1ec2c-6bca-4616-8c89-060a6d1593e6" containerName="kube-rbac-proxy-metric" containerID="cri-o://5869b13a06e6ed26789a9c602e58de4c8d5400e673f1c19a805743b82544399c" gracePeriod=120 Oct 14 13:23:56.243576 master-2 kubenswrapper[4762]: I1014 13:23:56.243544 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="f3a1ec2c-6bca-4616-8c89-060a6d1593e6" containerName="kube-rbac-proxy" containerID="cri-o://db6d3f963bf9a1ae70b91164570d0048d23e5436bd0c85da5957762466492b3c" gracePeriod=120 Oct 14 13:23:56.244912 master-2 kubenswrapper[4762]: I1014 13:23:56.243601 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="f3a1ec2c-6bca-4616-8c89-060a6d1593e6" containerName="kube-rbac-proxy-web" containerID="cri-o://69571f2f9f9d8f5979fb43994b5a39a24ccfd93bb03da6f14031d94d7957bd10" gracePeriod=120 Oct 14 13:23:56.244912 master-2 kubenswrapper[4762]: I1014 13:23:56.243653 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="f3a1ec2c-6bca-4616-8c89-060a6d1593e6" containerName="config-reloader" containerID="cri-o://9af3295f31acd2a582c426fd406751a3b6bd4feaff9d8d9bc819368479541066" gracePeriod=120 Oct 14 13:23:57.172640 master-2 kubenswrapper[4762]: I1014 13:23:57.172546 4762 generic.go:334] "Generic (PLEG): container finished" podID="f3a1ec2c-6bca-4616-8c89-060a6d1593e6" containerID="61ef651b05ea840eab2c068d6d58ec8042dcf01fd522926c14c55bb613bf72ae" exitCode=0 Oct 14 13:23:57.172640 master-2 kubenswrapper[4762]: I1014 13:23:57.172605 4762 generic.go:334] "Generic (PLEG): container finished" podID="f3a1ec2c-6bca-4616-8c89-060a6d1593e6" containerID="db6d3f963bf9a1ae70b91164570d0048d23e5436bd0c85da5957762466492b3c" exitCode=0 Oct 14 13:23:57.172640 master-2 kubenswrapper[4762]: I1014 13:23:57.172622 4762 generic.go:334] "Generic (PLEG): container finished" podID="f3a1ec2c-6bca-4616-8c89-060a6d1593e6" containerID="9af3295f31acd2a582c426fd406751a3b6bd4feaff9d8d9bc819368479541066" exitCode=0 Oct 14 13:23:57.172640 master-2 kubenswrapper[4762]: I1014 13:23:57.172636 4762 generic.go:334] "Generic (PLEG): container finished" podID="f3a1ec2c-6bca-4616-8c89-060a6d1593e6" containerID="6c278f0bb3a105138295da1accc27fd02f8142c5b30433066f034a0761598653" exitCode=0 Oct 14 13:23:57.173149 master-2 kubenswrapper[4762]: I1014 13:23:57.172652 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f3a1ec2c-6bca-4616-8c89-060a6d1593e6","Type":"ContainerDied","Data":"61ef651b05ea840eab2c068d6d58ec8042dcf01fd522926c14c55bb613bf72ae"} Oct 14 13:23:57.173149 master-2 kubenswrapper[4762]: I1014 13:23:57.172722 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f3a1ec2c-6bca-4616-8c89-060a6d1593e6","Type":"ContainerDied","Data":"db6d3f963bf9a1ae70b91164570d0048d23e5436bd0c85da5957762466492b3c"} Oct 14 13:23:57.173149 master-2 kubenswrapper[4762]: I1014 13:23:57.172747 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f3a1ec2c-6bca-4616-8c89-060a6d1593e6","Type":"ContainerDied","Data":"9af3295f31acd2a582c426fd406751a3b6bd4feaff9d8d9bc819368479541066"} Oct 14 13:23:57.173149 master-2 kubenswrapper[4762]: I1014 13:23:57.172770 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f3a1ec2c-6bca-4616-8c89-060a6d1593e6","Type":"ContainerDied","Data":"6c278f0bb3a105138295da1accc27fd02f8142c5b30433066f034a0761598653"} Oct 14 13:23:57.659534 master-2 kubenswrapper[4762]: I1014 13:23:57.659308 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-554dc689f9-c5k9h" podUID="7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d" containerName="console" containerID="cri-o://9179376181c8cf0140daf02269d045ff57d8f106632284ef45b08ee2f82b226b" gracePeriod=15 Oct 14 13:23:57.822140 master-2 kubenswrapper[4762]: I1014 13:23:57.822098 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:23:57.988776 master-2 kubenswrapper[4762]: I1014 13:23:57.988690 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-secret-alertmanager-main-tls\") pod \"f3a1ec2c-6bca-4616-8c89-060a6d1593e6\" (UID: \"f3a1ec2c-6bca-4616-8c89-060a6d1593e6\") " Oct 14 13:23:57.988776 master-2 kubenswrapper[4762]: I1014 13:23:57.988753 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-alertmanager-trusted-ca-bundle\") pod \"f3a1ec2c-6bca-4616-8c89-060a6d1593e6\" (UID: \"f3a1ec2c-6bca-4616-8c89-060a6d1593e6\") " Oct 14 13:23:57.988776 master-2 kubenswrapper[4762]: I1014 13:23:57.988792 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-metrics-client-ca\") pod \"f3a1ec2c-6bca-4616-8c89-060a6d1593e6\" (UID: \"f3a1ec2c-6bca-4616-8c89-060a6d1593e6\") " Oct 14 13:23:57.989314 master-2 kubenswrapper[4762]: I1014 13:23:57.988836 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-config-out\") pod \"f3a1ec2c-6bca-4616-8c89-060a6d1593e6\" (UID: \"f3a1ec2c-6bca-4616-8c89-060a6d1593e6\") " Oct 14 13:23:57.989314 master-2 kubenswrapper[4762]: I1014 13:23:57.988877 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-secret-alertmanager-kube-rbac-proxy\") pod \"f3a1ec2c-6bca-4616-8c89-060a6d1593e6\" (UID: \"f3a1ec2c-6bca-4616-8c89-060a6d1593e6\") " Oct 14 13:23:57.989314 master-2 kubenswrapper[4762]: I1014 13:23:57.988910 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-secret-alertmanager-kube-rbac-proxy-web\") pod \"f3a1ec2c-6bca-4616-8c89-060a6d1593e6\" (UID: \"f3a1ec2c-6bca-4616-8c89-060a6d1593e6\") " Oct 14 13:23:57.989314 master-2 kubenswrapper[4762]: I1014 13:23:57.988950 4762 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-alertmanager-main-db\") pod \"f3a1ec2c-6bca-4616-8c89-060a6d1593e6\" (UID: \"f3a1ec2c-6bca-4616-8c89-060a6d1593e6\") " Oct 14 13:23:57.989314 master-2 kubenswrapper[4762]: I1014 13:23:57.989001 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-web-config\") pod \"f3a1ec2c-6bca-4616-8c89-060a6d1593e6\" (UID: \"f3a1ec2c-6bca-4616-8c89-060a6d1593e6\") " Oct 14 13:23:57.989314 master-2 kubenswrapper[4762]: I1014 13:23:57.989044 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69qvh\" (UniqueName: \"kubernetes.io/projected/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-kube-api-access-69qvh\") pod \"f3a1ec2c-6bca-4616-8c89-060a6d1593e6\" (UID: \"f3a1ec2c-6bca-4616-8c89-060a6d1593e6\") " Oct 14 13:23:57.989314 master-2 kubenswrapper[4762]: I1014 13:23:57.989062 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-tls-assets\") pod \"f3a1ec2c-6bca-4616-8c89-060a6d1593e6\" (UID: \"f3a1ec2c-6bca-4616-8c89-060a6d1593e6\") " Oct 14 13:23:57.989314 master-2 kubenswrapper[4762]: I1014 13:23:57.989080 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-secret-alertmanager-kube-rbac-proxy-metric\") pod \"f3a1ec2c-6bca-4616-8c89-060a6d1593e6\" (UID: \"f3a1ec2c-6bca-4616-8c89-060a6d1593e6\") " Oct 14 13:23:57.989314 master-2 kubenswrapper[4762]: I1014 13:23:57.989112 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-config-volume\") pod \"f3a1ec2c-6bca-4616-8c89-060a6d1593e6\" (UID: \"f3a1ec2c-6bca-4616-8c89-060a6d1593e6\") " Oct 14 13:23:57.989701 master-2 kubenswrapper[4762]: I1014 13:23:57.989453 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "f3a1ec2c-6bca-4616-8c89-060a6d1593e6" (UID: "f3a1ec2c-6bca-4616-8c89-060a6d1593e6"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:23:57.989701 master-2 kubenswrapper[4762]: I1014 13:23:57.989566 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "f3a1ec2c-6bca-4616-8c89-060a6d1593e6" (UID: "f3a1ec2c-6bca-4616-8c89-060a6d1593e6"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:23:57.990046 master-2 kubenswrapper[4762]: I1014 13:23:57.989961 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "f3a1ec2c-6bca-4616-8c89-060a6d1593e6" (UID: "f3a1ec2c-6bca-4616-8c89-060a6d1593e6"). InnerVolumeSpecName "alertmanager-main-db". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:23:57.996301 master-2 kubenswrapper[4762]: I1014 13:23:57.996227 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "f3a1ec2c-6bca-4616-8c89-060a6d1593e6" (UID: "f3a1ec2c-6bca-4616-8c89-060a6d1593e6"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:23:57.999739 master-2 kubenswrapper[4762]: I1014 13:23:57.999647 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-kube-api-access-69qvh" (OuterVolumeSpecName: "kube-api-access-69qvh") pod "f3a1ec2c-6bca-4616-8c89-060a6d1593e6" (UID: "f3a1ec2c-6bca-4616-8c89-060a6d1593e6"). InnerVolumeSpecName "kube-api-access-69qvh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:23:57.999894 master-2 kubenswrapper[4762]: I1014 13:23:57.999801 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-config-volume" (OuterVolumeSpecName: "config-volume") pod "f3a1ec2c-6bca-4616-8c89-060a6d1593e6" (UID: "f3a1ec2c-6bca-4616-8c89-060a6d1593e6"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:23:58.000037 master-2 kubenswrapper[4762]: I1014 13:23:57.999997 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "f3a1ec2c-6bca-4616-8c89-060a6d1593e6" (UID: "f3a1ec2c-6bca-4616-8c89-060a6d1593e6"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:23:58.001308 master-2 kubenswrapper[4762]: I1014 13:23:58.001279 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "f3a1ec2c-6bca-4616-8c89-060a6d1593e6" (UID: "f3a1ec2c-6bca-4616-8c89-060a6d1593e6"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:23:58.003309 master-2 kubenswrapper[4762]: I1014 13:23:58.003248 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "f3a1ec2c-6bca-4616-8c89-060a6d1593e6" (UID: "f3a1ec2c-6bca-4616-8c89-060a6d1593e6"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:23:58.003424 master-2 kubenswrapper[4762]: I1014 13:23:58.003391 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-config-out" (OuterVolumeSpecName: "config-out") pod "f3a1ec2c-6bca-4616-8c89-060a6d1593e6" (UID: "f3a1ec2c-6bca-4616-8c89-060a6d1593e6"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:23:58.024646 master-2 kubenswrapper[4762]: I1014 13:23:58.024355 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "f3a1ec2c-6bca-4616-8c89-060a6d1593e6" (UID: "f3a1ec2c-6bca-4616-8c89-060a6d1593e6"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:23:58.045243 master-2 kubenswrapper[4762]: I1014 13:23:58.045195 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-web-config" (OuterVolumeSpecName: "web-config") pod "f3a1ec2c-6bca-4616-8c89-060a6d1593e6" (UID: "f3a1ec2c-6bca-4616-8c89-060a6d1593e6"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:23:58.090513 master-2 kubenswrapper[4762]: I1014 13:23:58.090461 4762 reconciler_common.go:293] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-secret-alertmanager-main-tls\") on node \"master-2\" DevicePath \"\"" Oct 14 13:23:58.090513 master-2 kubenswrapper[4762]: I1014 13:23:58.090499 4762 reconciler_common.go:293] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-alertmanager-trusted-ca-bundle\") on node \"master-2\" DevicePath \"\"" Oct 14 13:23:58.090513 master-2 kubenswrapper[4762]: I1014 13:23:58.090515 4762 reconciler_common.go:293] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-metrics-client-ca\") on node \"master-2\" DevicePath \"\"" Oct 14 13:23:58.090999 master-2 kubenswrapper[4762]: I1014 13:23:58.090533 4762 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-config-out\") on node \"master-2\" DevicePath \"\"" Oct 14 13:23:58.090999 master-2 kubenswrapper[4762]: I1014 13:23:58.090543 4762 reconciler_common.go:293] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-secret-alertmanager-kube-rbac-proxy\") on node \"master-2\" DevicePath \"\"" Oct 14 13:23:58.090999 master-2 kubenswrapper[4762]: I1014 13:23:58.090556 4762 reconciler_common.go:293] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-secret-alertmanager-kube-rbac-proxy-web\") on node \"master-2\" DevicePath \"\"" Oct 14 13:23:58.090999 master-2 kubenswrapper[4762]: I1014 13:23:58.090570 4762 reconciler_common.go:293] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-alertmanager-main-db\") on node \"master-2\" DevicePath \"\"" Oct 14 13:23:58.090999 master-2 kubenswrapper[4762]: I1014 13:23:58.090581 4762 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-web-config\") on node \"master-2\" DevicePath \"\"" Oct 14 13:23:58.090999 master-2 kubenswrapper[4762]: I1014 13:23:58.090591 4762 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-69qvh\" (UniqueName: \"kubernetes.io/projected/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-kube-api-access-69qvh\") on node \"master-2\" DevicePath \"\"" Oct 14 13:23:58.090999 master-2 kubenswrapper[4762]: I1014 13:23:58.090601 4762 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-tls-assets\") on node \"master-2\" DevicePath \"\"" Oct 14 13:23:58.090999 master-2 kubenswrapper[4762]: I1014 13:23:58.090612 4762 reconciler_common.go:293] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-secret-alertmanager-kube-rbac-proxy-metric\") on node \"master-2\" DevicePath \"\"" Oct 14 13:23:58.090999 master-2 kubenswrapper[4762]: I1014 13:23:58.090623 4762 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f3a1ec2c-6bca-4616-8c89-060a6d1593e6-config-volume\") on node \"master-2\" DevicePath \"\"" Oct 14 13:23:58.182926 master-2 kubenswrapper[4762]: I1014 13:23:58.182793 4762 generic.go:334] "Generic (PLEG): container finished" podID="f3a1ec2c-6bca-4616-8c89-060a6d1593e6" containerID="5869b13a06e6ed26789a9c602e58de4c8d5400e673f1c19a805743b82544399c" exitCode=0 Oct 14 13:23:58.182926 master-2 kubenswrapper[4762]: I1014 13:23:58.182843 4762 generic.go:334] "Generic (PLEG): container finished" podID="f3a1ec2c-6bca-4616-8c89-060a6d1593e6" containerID="69571f2f9f9d8f5979fb43994b5a39a24ccfd93bb03da6f14031d94d7957bd10" exitCode=0 Oct 14 13:23:58.182926 master-2 kubenswrapper[4762]: I1014 13:23:58.182842 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f3a1ec2c-6bca-4616-8c89-060a6d1593e6","Type":"ContainerDied","Data":"5869b13a06e6ed26789a9c602e58de4c8d5400e673f1c19a805743b82544399c"} Oct 14 13:23:58.183219 master-2 kubenswrapper[4762]: I1014 13:23:58.182924 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f3a1ec2c-6bca-4616-8c89-060a6d1593e6","Type":"ContainerDied","Data":"69571f2f9f9d8f5979fb43994b5a39a24ccfd93bb03da6f14031d94d7957bd10"} Oct 14 13:23:58.183219 master-2 kubenswrapper[4762]: I1014 13:23:58.182964 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:23:58.183219 master-2 kubenswrapper[4762]: I1014 13:23:58.182998 4762 scope.go:117] "RemoveContainer" containerID="61ef651b05ea840eab2c068d6d58ec8042dcf01fd522926c14c55bb613bf72ae" Oct 14 13:23:58.183219 master-2 kubenswrapper[4762]: I1014 13:23:58.182982 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f3a1ec2c-6bca-4616-8c89-060a6d1593e6","Type":"ContainerDied","Data":"90b247384c26b9abe62364552eb19ccbefce9f75b9a58d7f5d2f77d11d6ab6ca"} Oct 14 13:23:58.184857 master-2 kubenswrapper[4762]: I1014 13:23:58.184828 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-554dc689f9-c5k9h_7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d/console/0.log" Oct 14 13:23:58.184904 master-2 kubenswrapper[4762]: I1014 13:23:58.184870 4762 generic.go:334] "Generic (PLEG): container finished" podID="7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d" containerID="9179376181c8cf0140daf02269d045ff57d8f106632284ef45b08ee2f82b226b" exitCode=2 Oct 14 13:23:58.184904 master-2 kubenswrapper[4762]: I1014 13:23:58.184895 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-554dc689f9-c5k9h" event={"ID":"7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d","Type":"ContainerDied","Data":"9179376181c8cf0140daf02269d045ff57d8f106632284ef45b08ee2f82b226b"} Oct 14 13:23:58.200387 master-2 kubenswrapper[4762]: I1014 13:23:58.200349 4762 scope.go:117] "RemoveContainer" containerID="5869b13a06e6ed26789a9c602e58de4c8d5400e673f1c19a805743b82544399c" Oct 14 13:23:58.214878 master-2 kubenswrapper[4762]: I1014 13:23:58.214824 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-554dc689f9-c5k9h_7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d/console/0.log" Oct 14 13:23:58.215033 master-2 kubenswrapper[4762]: I1014 13:23:58.214946 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-554dc689f9-c5k9h" Oct 14 13:23:58.216349 master-2 kubenswrapper[4762]: I1014 13:23:58.216306 4762 scope.go:117] "RemoveContainer" containerID="db6d3f963bf9a1ae70b91164570d0048d23e5436bd0c85da5957762466492b3c" Oct 14 13:23:58.246583 master-2 kubenswrapper[4762]: I1014 13:23:58.246275 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Oct 14 13:23:58.249990 master-2 kubenswrapper[4762]: I1014 13:23:58.249946 4762 scope.go:117] "RemoveContainer" containerID="69571f2f9f9d8f5979fb43994b5a39a24ccfd93bb03da6f14031d94d7957bd10" Oct 14 13:23:58.258573 master-2 kubenswrapper[4762]: I1014 13:23:58.258524 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Oct 14 13:23:58.267939 master-2 kubenswrapper[4762]: I1014 13:23:58.267875 4762 scope.go:117] "RemoveContainer" containerID="9af3295f31acd2a582c426fd406751a3b6bd4feaff9d8d9bc819368479541066" Oct 14 13:23:58.283141 master-2 kubenswrapper[4762]: I1014 13:23:58.283106 4762 scope.go:117] "RemoveContainer" containerID="6c278f0bb3a105138295da1accc27fd02f8142c5b30433066f034a0761598653" Oct 14 13:23:58.299763 master-2 kubenswrapper[4762]: I1014 13:23:58.299716 4762 scope.go:117] "RemoveContainer" containerID="44694c42e3f2d02364f1f7f5a782414ce6e55bb7d25db37e11db100003478ae5" Oct 14 13:23:58.311689 master-2 kubenswrapper[4762]: I1014 13:23:58.311667 4762 scope.go:117] "RemoveContainer" containerID="61ef651b05ea840eab2c068d6d58ec8042dcf01fd522926c14c55bb613bf72ae" Oct 14 13:23:58.312077 master-2 kubenswrapper[4762]: E1014 13:23:58.312044 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61ef651b05ea840eab2c068d6d58ec8042dcf01fd522926c14c55bb613bf72ae\": container with ID starting with 61ef651b05ea840eab2c068d6d58ec8042dcf01fd522926c14c55bb613bf72ae not found: ID does not exist" containerID="61ef651b05ea840eab2c068d6d58ec8042dcf01fd522926c14c55bb613bf72ae" Oct 14 13:23:58.312131 master-2 kubenswrapper[4762]: I1014 13:23:58.312083 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61ef651b05ea840eab2c068d6d58ec8042dcf01fd522926c14c55bb613bf72ae"} err="failed to get container status \"61ef651b05ea840eab2c068d6d58ec8042dcf01fd522926c14c55bb613bf72ae\": rpc error: code = NotFound desc = could not find container \"61ef651b05ea840eab2c068d6d58ec8042dcf01fd522926c14c55bb613bf72ae\": container with ID starting with 61ef651b05ea840eab2c068d6d58ec8042dcf01fd522926c14c55bb613bf72ae not found: ID does not exist" Oct 14 13:23:58.312131 master-2 kubenswrapper[4762]: I1014 13:23:58.312105 4762 scope.go:117] "RemoveContainer" containerID="5869b13a06e6ed26789a9c602e58de4c8d5400e673f1c19a805743b82544399c" Oct 14 13:23:58.312396 master-2 kubenswrapper[4762]: E1014 13:23:58.312372 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5869b13a06e6ed26789a9c602e58de4c8d5400e673f1c19a805743b82544399c\": container with ID starting with 5869b13a06e6ed26789a9c602e58de4c8d5400e673f1c19a805743b82544399c not found: ID does not exist" containerID="5869b13a06e6ed26789a9c602e58de4c8d5400e673f1c19a805743b82544399c" Oct 14 13:23:58.312444 master-2 kubenswrapper[4762]: I1014 13:23:58.312401 4762 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5869b13a06e6ed26789a9c602e58de4c8d5400e673f1c19a805743b82544399c"} err="failed to get container status \"5869b13a06e6ed26789a9c602e58de4c8d5400e673f1c19a805743b82544399c\": rpc error: code = NotFound desc = could not find container \"5869b13a06e6ed26789a9c602e58de4c8d5400e673f1c19a805743b82544399c\": container with ID starting with 5869b13a06e6ed26789a9c602e58de4c8d5400e673f1c19a805743b82544399c not found: ID does not exist" Oct 14 13:23:58.312444 master-2 kubenswrapper[4762]: I1014 13:23:58.312420 4762 scope.go:117] "RemoveContainer" containerID="db6d3f963bf9a1ae70b91164570d0048d23e5436bd0c85da5957762466492b3c" Oct 14 13:23:58.312697 master-2 kubenswrapper[4762]: E1014 13:23:58.312677 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db6d3f963bf9a1ae70b91164570d0048d23e5436bd0c85da5957762466492b3c\": container with ID starting with db6d3f963bf9a1ae70b91164570d0048d23e5436bd0c85da5957762466492b3c not found: ID does not exist" containerID="db6d3f963bf9a1ae70b91164570d0048d23e5436bd0c85da5957762466492b3c" Oct 14 13:23:58.312744 master-2 kubenswrapper[4762]: I1014 13:23:58.312700 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db6d3f963bf9a1ae70b91164570d0048d23e5436bd0c85da5957762466492b3c"} err="failed to get container status \"db6d3f963bf9a1ae70b91164570d0048d23e5436bd0c85da5957762466492b3c\": rpc error: code = NotFound desc = could not find container \"db6d3f963bf9a1ae70b91164570d0048d23e5436bd0c85da5957762466492b3c\": container with ID starting with db6d3f963bf9a1ae70b91164570d0048d23e5436bd0c85da5957762466492b3c not found: ID does not exist" Oct 14 13:23:58.312744 master-2 kubenswrapper[4762]: I1014 13:23:58.312715 4762 scope.go:117] "RemoveContainer" containerID="69571f2f9f9d8f5979fb43994b5a39a24ccfd93bb03da6f14031d94d7957bd10" Oct 14 13:23:58.313021 master-2 kubenswrapper[4762]: E1014 13:23:58.313001 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69571f2f9f9d8f5979fb43994b5a39a24ccfd93bb03da6f14031d94d7957bd10\": container with ID starting with 69571f2f9f9d8f5979fb43994b5a39a24ccfd93bb03da6f14031d94d7957bd10 not found: ID does not exist" containerID="69571f2f9f9d8f5979fb43994b5a39a24ccfd93bb03da6f14031d94d7957bd10" Oct 14 13:23:58.313075 master-2 kubenswrapper[4762]: I1014 13:23:58.313020 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69571f2f9f9d8f5979fb43994b5a39a24ccfd93bb03da6f14031d94d7957bd10"} err="failed to get container status \"69571f2f9f9d8f5979fb43994b5a39a24ccfd93bb03da6f14031d94d7957bd10\": rpc error: code = NotFound desc = could not find container \"69571f2f9f9d8f5979fb43994b5a39a24ccfd93bb03da6f14031d94d7957bd10\": container with ID starting with 69571f2f9f9d8f5979fb43994b5a39a24ccfd93bb03da6f14031d94d7957bd10 not found: ID does not exist" Oct 14 13:23:58.313075 master-2 kubenswrapper[4762]: I1014 13:23:58.313036 4762 scope.go:117] "RemoveContainer" containerID="9af3295f31acd2a582c426fd406751a3b6bd4feaff9d8d9bc819368479541066" Oct 14 13:23:58.313323 master-2 kubenswrapper[4762]: E1014 13:23:58.313303 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9af3295f31acd2a582c426fd406751a3b6bd4feaff9d8d9bc819368479541066\": container with ID starting with 
9af3295f31acd2a582c426fd406751a3b6bd4feaff9d8d9bc819368479541066 not found: ID does not exist" containerID="9af3295f31acd2a582c426fd406751a3b6bd4feaff9d8d9bc819368479541066" Oct 14 13:23:58.313365 master-2 kubenswrapper[4762]: I1014 13:23:58.313327 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9af3295f31acd2a582c426fd406751a3b6bd4feaff9d8d9bc819368479541066"} err="failed to get container status \"9af3295f31acd2a582c426fd406751a3b6bd4feaff9d8d9bc819368479541066\": rpc error: code = NotFound desc = could not find container \"9af3295f31acd2a582c426fd406751a3b6bd4feaff9d8d9bc819368479541066\": container with ID starting with 9af3295f31acd2a582c426fd406751a3b6bd4feaff9d8d9bc819368479541066 not found: ID does not exist" Oct 14 13:23:58.313365 master-2 kubenswrapper[4762]: I1014 13:23:58.313344 4762 scope.go:117] "RemoveContainer" containerID="6c278f0bb3a105138295da1accc27fd02f8142c5b30433066f034a0761598653" Oct 14 13:23:58.313602 master-2 kubenswrapper[4762]: E1014 13:23:58.313582 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c278f0bb3a105138295da1accc27fd02f8142c5b30433066f034a0761598653\": container with ID starting with 6c278f0bb3a105138295da1accc27fd02f8142c5b30433066f034a0761598653 not found: ID does not exist" containerID="6c278f0bb3a105138295da1accc27fd02f8142c5b30433066f034a0761598653" Oct 14 13:23:58.313638 master-2 kubenswrapper[4762]: I1014 13:23:58.313605 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c278f0bb3a105138295da1accc27fd02f8142c5b30433066f034a0761598653"} err="failed to get container status \"6c278f0bb3a105138295da1accc27fd02f8142c5b30433066f034a0761598653\": rpc error: code = NotFound desc = could not find container \"6c278f0bb3a105138295da1accc27fd02f8142c5b30433066f034a0761598653\": container with ID starting with 6c278f0bb3a105138295da1accc27fd02f8142c5b30433066f034a0761598653 not found: ID does not exist" Oct 14 13:23:58.313638 master-2 kubenswrapper[4762]: I1014 13:23:58.313622 4762 scope.go:117] "RemoveContainer" containerID="44694c42e3f2d02364f1f7f5a782414ce6e55bb7d25db37e11db100003478ae5" Oct 14 13:23:58.313933 master-2 kubenswrapper[4762]: E1014 13:23:58.313905 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44694c42e3f2d02364f1f7f5a782414ce6e55bb7d25db37e11db100003478ae5\": container with ID starting with 44694c42e3f2d02364f1f7f5a782414ce6e55bb7d25db37e11db100003478ae5 not found: ID does not exist" containerID="44694c42e3f2d02364f1f7f5a782414ce6e55bb7d25db37e11db100003478ae5" Oct 14 13:23:58.313970 master-2 kubenswrapper[4762]: I1014 13:23:58.313929 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44694c42e3f2d02364f1f7f5a782414ce6e55bb7d25db37e11db100003478ae5"} err="failed to get container status \"44694c42e3f2d02364f1f7f5a782414ce6e55bb7d25db37e11db100003478ae5\": rpc error: code = NotFound desc = could not find container \"44694c42e3f2d02364f1f7f5a782414ce6e55bb7d25db37e11db100003478ae5\": container with ID starting with 44694c42e3f2d02364f1f7f5a782414ce6e55bb7d25db37e11db100003478ae5 not found: ID does not exist" Oct 14 13:23:58.313970 master-2 kubenswrapper[4762]: I1014 13:23:58.313945 4762 scope.go:117] "RemoveContainer" containerID="61ef651b05ea840eab2c068d6d58ec8042dcf01fd522926c14c55bb613bf72ae" Oct 14 13:23:58.314205 master-2 
kubenswrapper[4762]: I1014 13:23:58.314175 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61ef651b05ea840eab2c068d6d58ec8042dcf01fd522926c14c55bb613bf72ae"} err="failed to get container status \"61ef651b05ea840eab2c068d6d58ec8042dcf01fd522926c14c55bb613bf72ae\": rpc error: code = NotFound desc = could not find container \"61ef651b05ea840eab2c068d6d58ec8042dcf01fd522926c14c55bb613bf72ae\": container with ID starting with 61ef651b05ea840eab2c068d6d58ec8042dcf01fd522926c14c55bb613bf72ae not found: ID does not exist" Oct 14 13:23:58.314205 master-2 kubenswrapper[4762]: I1014 13:23:58.314199 4762 scope.go:117] "RemoveContainer" containerID="5869b13a06e6ed26789a9c602e58de4c8d5400e673f1c19a805743b82544399c" Oct 14 13:23:58.314588 master-2 kubenswrapper[4762]: I1014 13:23:58.314537 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5869b13a06e6ed26789a9c602e58de4c8d5400e673f1c19a805743b82544399c"} err="failed to get container status \"5869b13a06e6ed26789a9c602e58de4c8d5400e673f1c19a805743b82544399c\": rpc error: code = NotFound desc = could not find container \"5869b13a06e6ed26789a9c602e58de4c8d5400e673f1c19a805743b82544399c\": container with ID starting with 5869b13a06e6ed26789a9c602e58de4c8d5400e673f1c19a805743b82544399c not found: ID does not exist" Oct 14 13:23:58.314637 master-2 kubenswrapper[4762]: I1014 13:23:58.314593 4762 scope.go:117] "RemoveContainer" containerID="db6d3f963bf9a1ae70b91164570d0048d23e5436bd0c85da5957762466492b3c" Oct 14 13:23:58.315006 master-2 kubenswrapper[4762]: I1014 13:23:58.314977 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db6d3f963bf9a1ae70b91164570d0048d23e5436bd0c85da5957762466492b3c"} err="failed to get container status \"db6d3f963bf9a1ae70b91164570d0048d23e5436bd0c85da5957762466492b3c\": rpc error: code = NotFound desc = could not find container \"db6d3f963bf9a1ae70b91164570d0048d23e5436bd0c85da5957762466492b3c\": container with ID starting with db6d3f963bf9a1ae70b91164570d0048d23e5436bd0c85da5957762466492b3c not found: ID does not exist" Oct 14 13:23:58.315052 master-2 kubenswrapper[4762]: I1014 13:23:58.315033 4762 scope.go:117] "RemoveContainer" containerID="69571f2f9f9d8f5979fb43994b5a39a24ccfd93bb03da6f14031d94d7957bd10" Oct 14 13:23:58.315349 master-2 kubenswrapper[4762]: I1014 13:23:58.315323 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69571f2f9f9d8f5979fb43994b5a39a24ccfd93bb03da6f14031d94d7957bd10"} err="failed to get container status \"69571f2f9f9d8f5979fb43994b5a39a24ccfd93bb03da6f14031d94d7957bd10\": rpc error: code = NotFound desc = could not find container \"69571f2f9f9d8f5979fb43994b5a39a24ccfd93bb03da6f14031d94d7957bd10\": container with ID starting with 69571f2f9f9d8f5979fb43994b5a39a24ccfd93bb03da6f14031d94d7957bd10 not found: ID does not exist" Oct 14 13:23:58.315418 master-2 kubenswrapper[4762]: I1014 13:23:58.315348 4762 scope.go:117] "RemoveContainer" containerID="9af3295f31acd2a582c426fd406751a3b6bd4feaff9d8d9bc819368479541066" Oct 14 13:23:58.315656 master-2 kubenswrapper[4762]: I1014 13:23:58.315627 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9af3295f31acd2a582c426fd406751a3b6bd4feaff9d8d9bc819368479541066"} err="failed to get container status \"9af3295f31acd2a582c426fd406751a3b6bd4feaff9d8d9bc819368479541066\": rpc error: code = NotFound desc = could not find 
container \"9af3295f31acd2a582c426fd406751a3b6bd4feaff9d8d9bc819368479541066\": container with ID starting with 9af3295f31acd2a582c426fd406751a3b6bd4feaff9d8d9bc819368479541066 not found: ID does not exist" Oct 14 13:23:58.315656 master-2 kubenswrapper[4762]: I1014 13:23:58.315647 4762 scope.go:117] "RemoveContainer" containerID="6c278f0bb3a105138295da1accc27fd02f8142c5b30433066f034a0761598653" Oct 14 13:23:58.315947 master-2 kubenswrapper[4762]: I1014 13:23:58.315912 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c278f0bb3a105138295da1accc27fd02f8142c5b30433066f034a0761598653"} err="failed to get container status \"6c278f0bb3a105138295da1accc27fd02f8142c5b30433066f034a0761598653\": rpc error: code = NotFound desc = could not find container \"6c278f0bb3a105138295da1accc27fd02f8142c5b30433066f034a0761598653\": container with ID starting with 6c278f0bb3a105138295da1accc27fd02f8142c5b30433066f034a0761598653 not found: ID does not exist" Oct 14 13:23:58.315947 master-2 kubenswrapper[4762]: I1014 13:23:58.315937 4762 scope.go:117] "RemoveContainer" containerID="44694c42e3f2d02364f1f7f5a782414ce6e55bb7d25db37e11db100003478ae5" Oct 14 13:23:58.316275 master-2 kubenswrapper[4762]: I1014 13:23:58.316245 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44694c42e3f2d02364f1f7f5a782414ce6e55bb7d25db37e11db100003478ae5"} err="failed to get container status \"44694c42e3f2d02364f1f7f5a782414ce6e55bb7d25db37e11db100003478ae5\": rpc error: code = NotFound desc = could not find container \"44694c42e3f2d02364f1f7f5a782414ce6e55bb7d25db37e11db100003478ae5\": container with ID starting with 44694c42e3f2d02364f1f7f5a782414ce6e55bb7d25db37e11db100003478ae5 not found: ID does not exist" Oct 14 13:23:58.394324 master-2 kubenswrapper[4762]: I1014 13:23:58.394251 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d-console-config\") pod \"7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d\" (UID: \"7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d\") " Oct 14 13:23:58.395316 master-2 kubenswrapper[4762]: I1014 13:23:58.394359 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d-oauth-serving-cert\") pod \"7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d\" (UID: \"7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d\") " Oct 14 13:23:58.395316 master-2 kubenswrapper[4762]: I1014 13:23:58.394398 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d-service-ca\") pod \"7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d\" (UID: \"7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d\") " Oct 14 13:23:58.395316 master-2 kubenswrapper[4762]: I1014 13:23:58.394475 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d-trusted-ca-bundle\") pod \"7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d\" (UID: \"7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d\") " Oct 14 13:23:58.395316 master-2 kubenswrapper[4762]: I1014 13:23:58.394515 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d-console-serving-cert\") pod 
\"7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d\" (UID: \"7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d\") " Oct 14 13:23:58.395316 master-2 kubenswrapper[4762]: I1014 13:23:58.394590 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d-console-oauth-config\") pod \"7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d\" (UID: \"7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d\") " Oct 14 13:23:58.395316 master-2 kubenswrapper[4762]: I1014 13:23:58.394637 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cd8x\" (UniqueName: \"kubernetes.io/projected/7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d-kube-api-access-4cd8x\") pod \"7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d\" (UID: \"7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d\") " Oct 14 13:23:58.395899 master-2 kubenswrapper[4762]: I1014 13:23:58.395419 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d-console-config" (OuterVolumeSpecName: "console-config") pod "7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d" (UID: "7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:23:58.395899 master-2 kubenswrapper[4762]: I1014 13:23:58.395508 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d" (UID: "7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:23:58.395899 master-2 kubenswrapper[4762]: I1014 13:23:58.395609 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d" (UID: "7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:23:58.396441 master-2 kubenswrapper[4762]: I1014 13:23:58.396377 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d-service-ca" (OuterVolumeSpecName: "service-ca") pod "7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d" (UID: "7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:23:58.398525 master-2 kubenswrapper[4762]: I1014 13:23:58.398463 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d" (UID: "7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:23:58.398857 master-2 kubenswrapper[4762]: I1014 13:23:58.398796 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d-kube-api-access-4cd8x" (OuterVolumeSpecName: "kube-api-access-4cd8x") pod "7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d" (UID: "7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d"). 
InnerVolumeSpecName "kube-api-access-4cd8x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:23:58.399748 master-2 kubenswrapper[4762]: I1014 13:23:58.399669 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d" (UID: "7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:23:58.488682 master-2 kubenswrapper[4762]: I1014 13:23:58.488483 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Oct 14 13:23:58.488994 master-2 kubenswrapper[4762]: E1014 13:23:58.488836 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3a1ec2c-6bca-4616-8c89-060a6d1593e6" containerName="alertmanager" Oct 14 13:23:58.488994 master-2 kubenswrapper[4762]: I1014 13:23:58.488867 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3a1ec2c-6bca-4616-8c89-060a6d1593e6" containerName="alertmanager" Oct 14 13:23:58.488994 master-2 kubenswrapper[4762]: E1014 13:23:58.488913 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3a1ec2c-6bca-4616-8c89-060a6d1593e6" containerName="config-reloader" Oct 14 13:23:58.488994 master-2 kubenswrapper[4762]: I1014 13:23:58.488929 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3a1ec2c-6bca-4616-8c89-060a6d1593e6" containerName="config-reloader" Oct 14 13:23:58.488994 master-2 kubenswrapper[4762]: E1014 13:23:58.488949 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3a1ec2c-6bca-4616-8c89-060a6d1593e6" containerName="kube-rbac-proxy" Oct 14 13:23:58.488994 master-2 kubenswrapper[4762]: I1014 13:23:58.488966 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3a1ec2c-6bca-4616-8c89-060a6d1593e6" containerName="kube-rbac-proxy" Oct 14 13:23:58.488994 master-2 kubenswrapper[4762]: E1014 13:23:58.488985 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3a1ec2c-6bca-4616-8c89-060a6d1593e6" containerName="kube-rbac-proxy-metric" Oct 14 13:23:58.488994 master-2 kubenswrapper[4762]: I1014 13:23:58.489001 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3a1ec2c-6bca-4616-8c89-060a6d1593e6" containerName="kube-rbac-proxy-metric" Oct 14 13:23:58.489764 master-2 kubenswrapper[4762]: E1014 13:23:58.489020 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3a1ec2c-6bca-4616-8c89-060a6d1593e6" containerName="prom-label-proxy" Oct 14 13:23:58.489764 master-2 kubenswrapper[4762]: I1014 13:23:58.489036 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3a1ec2c-6bca-4616-8c89-060a6d1593e6" containerName="prom-label-proxy" Oct 14 13:23:58.489764 master-2 kubenswrapper[4762]: E1014 13:23:58.489057 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3a1ec2c-6bca-4616-8c89-060a6d1593e6" containerName="kube-rbac-proxy-web" Oct 14 13:23:58.489764 master-2 kubenswrapper[4762]: I1014 13:23:58.489074 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3a1ec2c-6bca-4616-8c89-060a6d1593e6" containerName="kube-rbac-proxy-web" Oct 14 13:23:58.489764 master-2 kubenswrapper[4762]: E1014 13:23:58.489100 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d" containerName="console" Oct 14 13:23:58.489764 master-2 kubenswrapper[4762]: I1014 
13:23:58.489116 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d" containerName="console" Oct 14 13:23:58.489764 master-2 kubenswrapper[4762]: E1014 13:23:58.489191 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3a1ec2c-6bca-4616-8c89-060a6d1593e6" containerName="init-config-reloader" Oct 14 13:23:58.489764 master-2 kubenswrapper[4762]: I1014 13:23:58.489214 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3a1ec2c-6bca-4616-8c89-060a6d1593e6" containerName="init-config-reloader" Oct 14 13:23:58.489764 master-2 kubenswrapper[4762]: I1014 13:23:58.489420 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3a1ec2c-6bca-4616-8c89-060a6d1593e6" containerName="kube-rbac-proxy" Oct 14 13:23:58.489764 master-2 kubenswrapper[4762]: I1014 13:23:58.489450 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3a1ec2c-6bca-4616-8c89-060a6d1593e6" containerName="kube-rbac-proxy-metric" Oct 14 13:23:58.489764 master-2 kubenswrapper[4762]: I1014 13:23:58.489469 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d" containerName="console" Oct 14 13:23:58.489764 master-2 kubenswrapper[4762]: I1014 13:23:58.489486 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3a1ec2c-6bca-4616-8c89-060a6d1593e6" containerName="kube-rbac-proxy-web" Oct 14 13:23:58.489764 master-2 kubenswrapper[4762]: I1014 13:23:58.489515 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3a1ec2c-6bca-4616-8c89-060a6d1593e6" containerName="config-reloader" Oct 14 13:23:58.489764 master-2 kubenswrapper[4762]: I1014 13:23:58.489536 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3a1ec2c-6bca-4616-8c89-060a6d1593e6" containerName="prom-label-proxy" Oct 14 13:23:58.489764 master-2 kubenswrapper[4762]: I1014 13:23:58.489560 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3a1ec2c-6bca-4616-8c89-060a6d1593e6" containerName="alertmanager" Oct 14 13:23:58.492989 master-2 kubenswrapper[4762]: I1014 13:23:58.492929 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:23:58.496387 master-2 kubenswrapper[4762]: I1014 13:23:58.496324 4762 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d-console-oauth-config\") on node \"master-2\" DevicePath \"\"" Oct 14 13:23:58.496387 master-2 kubenswrapper[4762]: I1014 13:23:58.496377 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cd8x\" (UniqueName: \"kubernetes.io/projected/7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d-kube-api-access-4cd8x\") on node \"master-2\" DevicePath \"\"" Oct 14 13:23:58.496741 master-2 kubenswrapper[4762]: I1014 13:23:58.496405 4762 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d-console-config\") on node \"master-2\" DevicePath \"\"" Oct 14 13:23:58.496741 master-2 kubenswrapper[4762]: I1014 13:23:58.496430 4762 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d-oauth-serving-cert\") on node \"master-2\" DevicePath \"\"" Oct 14 13:23:58.496741 master-2 kubenswrapper[4762]: I1014 13:23:58.496454 4762 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d-service-ca\") on node \"master-2\" DevicePath \"\"" Oct 14 13:23:58.496741 master-2 kubenswrapper[4762]: I1014 13:23:58.496481 4762 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d-trusted-ca-bundle\") on node \"master-2\" DevicePath \"\"" Oct 14 13:23:58.496741 master-2 kubenswrapper[4762]: I1014 13:23:58.496506 4762 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d-console-serving-cert\") on node \"master-2\" DevicePath \"\"" Oct 14 13:23:58.497342 master-2 kubenswrapper[4762]: I1014 13:23:58.497286 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-dockercfg-z89cl" Oct 14 13:23:58.498104 master-2 kubenswrapper[4762]: I1014 13:23:58.498026 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0" Oct 14 13:23:58.498999 master-2 kubenswrapper[4762]: I1014 13:23:58.498939 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy" Oct 14 13:23:58.499128 master-2 kubenswrapper[4762]: I1014 13:23:58.499006 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric" Oct 14 13:23:58.499298 master-2 kubenswrapper[4762]: I1014 13:23:58.499137 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web" Oct 14 13:23:58.499298 master-2 kubenswrapper[4762]: I1014 13:23:58.499015 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated" Oct 14 13:23:58.499775 master-2 kubenswrapper[4762]: I1014 13:23:58.499723 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config" Oct 14 13:23:58.504093 master-2 kubenswrapper[4762]: I1014 
13:23:58.501588 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls" Oct 14 13:23:58.514066 master-2 kubenswrapper[4762]: I1014 13:23:58.513802 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle" Oct 14 13:23:58.597749 master-2 kubenswrapper[4762]: I1014 13:23:58.597632 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/294de25b-260e-4ac0-89f0-a08608e56eca-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"294de25b-260e-4ac0-89f0-a08608e56eca\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:23:58.597749 master-2 kubenswrapper[4762]: I1014 13:23:58.597761 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdlp6\" (UniqueName: \"kubernetes.io/projected/294de25b-260e-4ac0-89f0-a08608e56eca-kube-api-access-wdlp6\") pod \"alertmanager-main-0\" (UID: \"294de25b-260e-4ac0-89f0-a08608e56eca\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:23:58.597749 master-2 kubenswrapper[4762]: I1014 13:23:58.597839 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/294de25b-260e-4ac0-89f0-a08608e56eca-tls-assets\") pod \"alertmanager-main-0\" (UID: \"294de25b-260e-4ac0-89f0-a08608e56eca\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:23:58.597749 master-2 kubenswrapper[4762]: I1014 13:23:58.597890 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/294de25b-260e-4ac0-89f0-a08608e56eca-config-volume\") pod \"alertmanager-main-0\" (UID: \"294de25b-260e-4ac0-89f0-a08608e56eca\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:23:58.599521 master-2 kubenswrapper[4762]: I1014 13:23:58.598855 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/294de25b-260e-4ac0-89f0-a08608e56eca-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"294de25b-260e-4ac0-89f0-a08608e56eca\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:23:58.599521 master-2 kubenswrapper[4762]: I1014 13:23:58.599030 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/294de25b-260e-4ac0-89f0-a08608e56eca-web-config\") pod \"alertmanager-main-0\" (UID: \"294de25b-260e-4ac0-89f0-a08608e56eca\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:23:58.599521 master-2 kubenswrapper[4762]: I1014 13:23:58.599113 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/294de25b-260e-4ac0-89f0-a08608e56eca-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"294de25b-260e-4ac0-89f0-a08608e56eca\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:23:58.599521 master-2 kubenswrapper[4762]: I1014 13:23:58.599402 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: 
\"kubernetes.io/secret/294de25b-260e-4ac0-89f0-a08608e56eca-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"294de25b-260e-4ac0-89f0-a08608e56eca\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:23:58.600081 master-2 kubenswrapper[4762]: I1014 13:23:58.599998 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/294de25b-260e-4ac0-89f0-a08608e56eca-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"294de25b-260e-4ac0-89f0-a08608e56eca\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:23:58.600081 master-2 kubenswrapper[4762]: I1014 13:23:58.600044 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/294de25b-260e-4ac0-89f0-a08608e56eca-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"294de25b-260e-4ac0-89f0-a08608e56eca\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:23:58.600268 master-2 kubenswrapper[4762]: I1014 13:23:58.600128 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/294de25b-260e-4ac0-89f0-a08608e56eca-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"294de25b-260e-4ac0-89f0-a08608e56eca\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:23:58.600268 master-2 kubenswrapper[4762]: I1014 13:23:58.600247 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/294de25b-260e-4ac0-89f0-a08608e56eca-config-out\") pod \"alertmanager-main-0\" (UID: \"294de25b-260e-4ac0-89f0-a08608e56eca\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:23:58.689268 master-2 kubenswrapper[4762]: I1014 13:23:58.687558 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Oct 14 13:23:58.701195 master-2 kubenswrapper[4762]: I1014 13:23:58.701096 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/294de25b-260e-4ac0-89f0-a08608e56eca-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"294de25b-260e-4ac0-89f0-a08608e56eca\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:23:58.701195 master-2 kubenswrapper[4762]: I1014 13:23:58.701174 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/294de25b-260e-4ac0-89f0-a08608e56eca-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"294de25b-260e-4ac0-89f0-a08608e56eca\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:23:58.701771 master-2 kubenswrapper[4762]: I1014 13:23:58.701216 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/294de25b-260e-4ac0-89f0-a08608e56eca-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"294de25b-260e-4ac0-89f0-a08608e56eca\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:23:58.701771 master-2 kubenswrapper[4762]: I1014 
13:23:58.701257 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/294de25b-260e-4ac0-89f0-a08608e56eca-config-out\") pod \"alertmanager-main-0\" (UID: \"294de25b-260e-4ac0-89f0-a08608e56eca\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:23:58.701771 master-2 kubenswrapper[4762]: I1014 13:23:58.701285 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/294de25b-260e-4ac0-89f0-a08608e56eca-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"294de25b-260e-4ac0-89f0-a08608e56eca\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:23:58.701771 master-2 kubenswrapper[4762]: I1014 13:23:58.701323 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdlp6\" (UniqueName: \"kubernetes.io/projected/294de25b-260e-4ac0-89f0-a08608e56eca-kube-api-access-wdlp6\") pod \"alertmanager-main-0\" (UID: \"294de25b-260e-4ac0-89f0-a08608e56eca\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:23:58.701771 master-2 kubenswrapper[4762]: I1014 13:23:58.701354 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/294de25b-260e-4ac0-89f0-a08608e56eca-tls-assets\") pod \"alertmanager-main-0\" (UID: \"294de25b-260e-4ac0-89f0-a08608e56eca\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:23:58.701771 master-2 kubenswrapper[4762]: I1014 13:23:58.701375 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/294de25b-260e-4ac0-89f0-a08608e56eca-config-volume\") pod \"alertmanager-main-0\" (UID: \"294de25b-260e-4ac0-89f0-a08608e56eca\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:23:58.701771 master-2 kubenswrapper[4762]: I1014 13:23:58.701399 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/294de25b-260e-4ac0-89f0-a08608e56eca-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"294de25b-260e-4ac0-89f0-a08608e56eca\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:23:58.701771 master-2 kubenswrapper[4762]: I1014 13:23:58.701431 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/294de25b-260e-4ac0-89f0-a08608e56eca-web-config\") pod \"alertmanager-main-0\" (UID: \"294de25b-260e-4ac0-89f0-a08608e56eca\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:23:58.701771 master-2 kubenswrapper[4762]: I1014 13:23:58.701640 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/294de25b-260e-4ac0-89f0-a08608e56eca-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"294de25b-260e-4ac0-89f0-a08608e56eca\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:23:58.701771 master-2 kubenswrapper[4762]: I1014 13:23:58.701663 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/294de25b-260e-4ac0-89f0-a08608e56eca-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"294de25b-260e-4ac0-89f0-a08608e56eca\") " pod="openshift-monitoring/alertmanager-main-0" Oct 
14 13:23:58.703801 master-2 kubenswrapper[4762]: I1014 13:23:58.703721 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/294de25b-260e-4ac0-89f0-a08608e56eca-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"294de25b-260e-4ac0-89f0-a08608e56eca\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:23:58.705840 master-2 kubenswrapper[4762]: I1014 13:23:58.705749 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/294de25b-260e-4ac0-89f0-a08608e56eca-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"294de25b-260e-4ac0-89f0-a08608e56eca\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:23:58.707706 master-2 kubenswrapper[4762]: I1014 13:23:58.707595 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/294de25b-260e-4ac0-89f0-a08608e56eca-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"294de25b-260e-4ac0-89f0-a08608e56eca\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:23:58.708803 master-2 kubenswrapper[4762]: I1014 13:23:58.708752 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/294de25b-260e-4ac0-89f0-a08608e56eca-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"294de25b-260e-4ac0-89f0-a08608e56eca\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:23:58.715268 master-2 kubenswrapper[4762]: I1014 13:23:58.715111 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/294de25b-260e-4ac0-89f0-a08608e56eca-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"294de25b-260e-4ac0-89f0-a08608e56eca\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:23:58.715528 master-2 kubenswrapper[4762]: I1014 13:23:58.715219 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/294de25b-260e-4ac0-89f0-a08608e56eca-config-volume\") pod \"alertmanager-main-0\" (UID: \"294de25b-260e-4ac0-89f0-a08608e56eca\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:23:58.715688 master-2 kubenswrapper[4762]: I1014 13:23:58.715512 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/294de25b-260e-4ac0-89f0-a08608e56eca-tls-assets\") pod \"alertmanager-main-0\" (UID: \"294de25b-260e-4ac0-89f0-a08608e56eca\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:23:58.720521 master-2 kubenswrapper[4762]: I1014 13:23:58.716182 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/294de25b-260e-4ac0-89f0-a08608e56eca-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"294de25b-260e-4ac0-89f0-a08608e56eca\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:23:58.722901 master-2 kubenswrapper[4762]: I1014 13:23:58.721607 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/294de25b-260e-4ac0-89f0-a08608e56eca-config-out\") pod \"alertmanager-main-0\" (UID: 
\"294de25b-260e-4ac0-89f0-a08608e56eca\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:23:58.722901 master-2 kubenswrapper[4762]: I1014 13:23:58.722452 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/294de25b-260e-4ac0-89f0-a08608e56eca-web-config\") pod \"alertmanager-main-0\" (UID: \"294de25b-260e-4ac0-89f0-a08608e56eca\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:23:58.731259 master-2 kubenswrapper[4762]: I1014 13:23:58.726502 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/294de25b-260e-4ac0-89f0-a08608e56eca-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"294de25b-260e-4ac0-89f0-a08608e56eca\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:23:58.757612 master-2 kubenswrapper[4762]: I1014 13:23:58.757344 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdlp6\" (UniqueName: \"kubernetes.io/projected/294de25b-260e-4ac0-89f0-a08608e56eca-kube-api-access-wdlp6\") pod \"alertmanager-main-0\" (UID: \"294de25b-260e-4ac0-89f0-a08608e56eca\") " pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:23:58.818916 master-2 kubenswrapper[4762]: I1014 13:23:58.818815 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:23:59.195662 master-2 kubenswrapper[4762]: I1014 13:23:59.195605 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-554dc689f9-c5k9h_7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d/console/0.log" Oct 14 13:23:59.195884 master-2 kubenswrapper[4762]: I1014 13:23:59.195814 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-554dc689f9-c5k9h" Oct 14 13:23:59.195884 master-2 kubenswrapper[4762]: I1014 13:23:59.195803 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-554dc689f9-c5k9h" event={"ID":"7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d","Type":"ContainerDied","Data":"68699cf50ddb6de45bd7c8683a7ec2d177cb6f35bbe1c743077e0c63066f5761"} Oct 14 13:23:59.195953 master-2 kubenswrapper[4762]: I1014 13:23:59.195911 4762 scope.go:117] "RemoveContainer" containerID="9179376181c8cf0140daf02269d045ff57d8f106632284ef45b08ee2f82b226b" Oct 14 13:23:59.256527 master-2 kubenswrapper[4762]: I1014 13:23:59.256450 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-554dc689f9-c5k9h"] Oct 14 13:23:59.268410 master-2 kubenswrapper[4762]: I1014 13:23:59.268266 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-554dc689f9-c5k9h"] Oct 14 13:23:59.273074 master-2 kubenswrapper[4762]: I1014 13:23:59.273015 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Oct 14 13:23:59.279073 master-2 kubenswrapper[4762]: W1014 13:23:59.278994 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod294de25b_260e_4ac0_89f0_a08608e56eca.slice/crio-be2f036b961229e881f5623c5c5d1d286b06d55dad7dd9bcabbb78e0bc8d68af WatchSource:0}: Error finding container be2f036b961229e881f5623c5c5d1d286b06d55dad7dd9bcabbb78e0bc8d68af: Status 404 returned error can't find the container with id be2f036b961229e881f5623c5c5d1d286b06d55dad7dd9bcabbb78e0bc8d68af Oct 14 13:23:59.559930 master-2 kubenswrapper[4762]: I1014 13:23:59.559812 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d" path="/var/lib/kubelet/pods/7b40f213-fb3c-4692-ac1a-f6a8e3dd0f7d/volumes" Oct 14 13:23:59.561262 master-2 kubenswrapper[4762]: I1014 13:23:59.561189 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3a1ec2c-6bca-4616-8c89-060a6d1593e6" path="/var/lib/kubelet/pods/f3a1ec2c-6bca-4616-8c89-060a6d1593e6/volumes" Oct 14 13:24:00.209057 master-2 kubenswrapper[4762]: I1014 13:24:00.208990 4762 generic.go:334] "Generic (PLEG): container finished" podID="294de25b-260e-4ac0-89f0-a08608e56eca" containerID="307e5efe1839f56d3f84cf49f3ed59c4336f3eb8705f4e66e26a4506400ed866" exitCode=0 Oct 14 13:24:00.209966 master-2 kubenswrapper[4762]: I1014 13:24:00.209080 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"294de25b-260e-4ac0-89f0-a08608e56eca","Type":"ContainerDied","Data":"307e5efe1839f56d3f84cf49f3ed59c4336f3eb8705f4e66e26a4506400ed866"} Oct 14 13:24:00.209966 master-2 kubenswrapper[4762]: I1014 13:24:00.209120 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"294de25b-260e-4ac0-89f0-a08608e56eca","Type":"ContainerStarted","Data":"be2f036b961229e881f5623c5c5d1d286b06d55dad7dd9bcabbb78e0bc8d68af"} Oct 14 13:24:01.221661 master-2 kubenswrapper[4762]: I1014 13:24:01.221545 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"294de25b-260e-4ac0-89f0-a08608e56eca","Type":"ContainerStarted","Data":"12af42379aea6abfde11e08a700a407205adf6498f2c469f2c9b506c5ec86e07"} Oct 14 13:24:01.221661 master-2 kubenswrapper[4762]: I1014 13:24:01.221622 4762 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"294de25b-260e-4ac0-89f0-a08608e56eca","Type":"ContainerStarted","Data":"ab8c516952e02a344cc94ce5b0444090709bc014de9fef5e23c1759a631d0f7e"} Oct 14 13:24:01.221661 master-2 kubenswrapper[4762]: I1014 13:24:01.221652 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"294de25b-260e-4ac0-89f0-a08608e56eca","Type":"ContainerStarted","Data":"eb32bed61a49755378c34f4831e057010abb4a2b6f147ae274603aed9fcb7a39"} Oct 14 13:24:01.221661 master-2 kubenswrapper[4762]: I1014 13:24:01.221677 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"294de25b-260e-4ac0-89f0-a08608e56eca","Type":"ContainerStarted","Data":"559e511e4305afd6ff7bdf2be833ae1c5bea4aecb467c8b1e120134b2ba4d638"} Oct 14 13:24:01.222906 master-2 kubenswrapper[4762]: I1014 13:24:01.221702 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"294de25b-260e-4ac0-89f0-a08608e56eca","Type":"ContainerStarted","Data":"5250a785260c7dafbc218578cc376d13753908182fddbe68156d419cb34827ba"} Oct 14 13:24:01.222906 master-2 kubenswrapper[4762]: I1014 13:24:01.221725 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"294de25b-260e-4ac0-89f0-a08608e56eca","Type":"ContainerStarted","Data":"6f9e962012300bb72a43a4561761e47502cd2cf9073f12d712a22a7afed90501"} Oct 14 13:24:01.276563 master-2 kubenswrapper[4762]: I1014 13:24:01.276384 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=3.276366988 podStartE2EDuration="3.276366988s" podCreationTimestamp="2025-10-14 13:23:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:24:01.2742258 +0000 UTC m=+1070.518384989" watchObservedRunningTime="2025-10-14 13:24:01.276366988 +0000 UTC m=+1070.520526147" Oct 14 13:24:08.819661 master-2 kubenswrapper[4762]: I1014 13:24:08.819605 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:24:12.659149 master-2 kubenswrapper[4762]: I1014 13:24:12.659047 4762 scope.go:117] "RemoveContainer" containerID="f6b31a7fd130b6260cc195b3f7d19cba7465489352bc38f184c6b11a0414791d" Oct 14 13:24:28.864422 master-2 kubenswrapper[4762]: I1014 13:24:28.864241 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/alertmanager-main-0" Oct 14 13:24:44.383310 master-2 kubenswrapper[4762]: I1014 13:24:44.382024 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Oct 14 13:24:44.383310 master-2 kubenswrapper[4762]: I1014 13:24:44.382409 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="d766696a-7ad4-4921-b799-c65f51b60109" containerName="prometheus" containerID="cri-o://3bcfb13ff771400f754599e7faf9d389ed5a8a630307828d6aeb6fe4ccee077f" gracePeriod=600 Oct 14 13:24:44.383310 master-2 kubenswrapper[4762]: I1014 13:24:44.382434 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="d766696a-7ad4-4921-b799-c65f51b60109" containerName="kube-rbac-proxy" 
containerID="cri-o://8cb6e7f57317f98acf1f86c1f4624231ddf922c2321b7ca8f5ee56a1e4c66274" gracePeriod=600 Oct 14 13:24:44.383310 master-2 kubenswrapper[4762]: I1014 13:24:44.382471 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="d766696a-7ad4-4921-b799-c65f51b60109" containerName="thanos-sidecar" containerID="cri-o://68586ae70faec91fa7469971b38394705160337f7f18cb0eff9978037eda6496" gracePeriod=600 Oct 14 13:24:44.383310 master-2 kubenswrapper[4762]: I1014 13:24:44.382559 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="d766696a-7ad4-4921-b799-c65f51b60109" containerName="kube-rbac-proxy-web" containerID="cri-o://9a307c52f472dbea0756ef79dc081e7c7f818a13fbf1f1667281a7a4465acb42" gracePeriod=600 Oct 14 13:24:44.383310 master-2 kubenswrapper[4762]: I1014 13:24:44.382598 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="d766696a-7ad4-4921-b799-c65f51b60109" containerName="config-reloader" containerID="cri-o://3b512fead26f68b391af64674e86d0e7db890a4f2e2def8b93b1f890a60f4978" gracePeriod=600 Oct 14 13:24:44.383310 master-2 kubenswrapper[4762]: I1014 13:24:44.382563 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="d766696a-7ad4-4921-b799-c65f51b60109" containerName="kube-rbac-proxy-thanos" containerID="cri-o://572aea723a01273b8c89bc2d0322d30c45f4601cb9a1fb063f1ae4afb4dfc654" gracePeriod=600 Oct 14 13:24:44.411599 master-2 kubenswrapper[4762]: I1014 13:24:44.410111 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-564c479f-7bglk"] Oct 14 13:24:44.411599 master-2 kubenswrapper[4762]: I1014 13:24:44.411267 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-564c479f-7bglk" Oct 14 13:24:44.420754 master-2 kubenswrapper[4762]: I1014 13:24:44.418606 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-r2r7j" Oct 14 13:24:44.420754 master-2 kubenswrapper[4762]: I1014 13:24:44.419270 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 14 13:24:44.420754 master-2 kubenswrapper[4762]: I1014 13:24:44.419528 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Oct 14 13:24:44.420754 master-2 kubenswrapper[4762]: I1014 13:24:44.419569 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Oct 14 13:24:44.420754 master-2 kubenswrapper[4762]: I1014 13:24:44.419696 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Oct 14 13:24:44.420754 master-2 kubenswrapper[4762]: I1014 13:24:44.419783 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 14 13:24:44.427323 master-2 kubenswrapper[4762]: I1014 13:24:44.427291 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 14 13:24:44.454978 master-2 kubenswrapper[4762]: I1014 13:24:44.454938 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-564c479f-7bglk"] Oct 14 13:24:44.526617 master-2 kubenswrapper[4762]: I1014 13:24:44.526574 4762 generic.go:334] "Generic (PLEG): container finished" podID="d766696a-7ad4-4921-b799-c65f51b60109" containerID="9a307c52f472dbea0756ef79dc081e7c7f818a13fbf1f1667281a7a4465acb42" exitCode=0 Oct 14 13:24:44.526617 master-2 kubenswrapper[4762]: I1014 13:24:44.526611 4762 generic.go:334] "Generic (PLEG): container finished" podID="d766696a-7ad4-4921-b799-c65f51b60109" containerID="68586ae70faec91fa7469971b38394705160337f7f18cb0eff9978037eda6496" exitCode=0 Oct 14 13:24:44.526738 master-2 kubenswrapper[4762]: I1014 13:24:44.526610 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d766696a-7ad4-4921-b799-c65f51b60109","Type":"ContainerDied","Data":"9a307c52f472dbea0756ef79dc081e7c7f818a13fbf1f1667281a7a4465acb42"} Oct 14 13:24:44.526738 master-2 kubenswrapper[4762]: I1014 13:24:44.526656 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d766696a-7ad4-4921-b799-c65f51b60109","Type":"ContainerDied","Data":"68586ae70faec91fa7469971b38394705160337f7f18cb0eff9978037eda6496"} Oct 14 13:24:44.571627 master-2 kubenswrapper[4762]: I1014 13:24:44.571563 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9fd6eab3-bc4f-437c-ab20-8db15e2ec157-console-serving-cert\") pod \"console-564c479f-7bglk\" (UID: \"9fd6eab3-bc4f-437c-ab20-8db15e2ec157\") " pod="openshift-console/console-564c479f-7bglk" Oct 14 13:24:44.571724 master-2 kubenswrapper[4762]: I1014 13:24:44.571651 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9fd6eab3-bc4f-437c-ab20-8db15e2ec157-console-config\") pod \"console-564c479f-7bglk\" (UID: \"9fd6eab3-bc4f-437c-ab20-8db15e2ec157\") " 
pod="openshift-console/console-564c479f-7bglk" Oct 14 13:24:44.571724 master-2 kubenswrapper[4762]: I1014 13:24:44.571679 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9fd6eab3-bc4f-437c-ab20-8db15e2ec157-service-ca\") pod \"console-564c479f-7bglk\" (UID: \"9fd6eab3-bc4f-437c-ab20-8db15e2ec157\") " pod="openshift-console/console-564c479f-7bglk" Oct 14 13:24:44.571793 master-2 kubenswrapper[4762]: I1014 13:24:44.571749 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9fd6eab3-bc4f-437c-ab20-8db15e2ec157-trusted-ca-bundle\") pod \"console-564c479f-7bglk\" (UID: \"9fd6eab3-bc4f-437c-ab20-8db15e2ec157\") " pod="openshift-console/console-564c479f-7bglk" Oct 14 13:24:44.571793 master-2 kubenswrapper[4762]: I1014 13:24:44.571771 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9fd6eab3-bc4f-437c-ab20-8db15e2ec157-console-oauth-config\") pod \"console-564c479f-7bglk\" (UID: \"9fd6eab3-bc4f-437c-ab20-8db15e2ec157\") " pod="openshift-console/console-564c479f-7bglk" Oct 14 13:24:44.571861 master-2 kubenswrapper[4762]: I1014 13:24:44.571822 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9fd6eab3-bc4f-437c-ab20-8db15e2ec157-oauth-serving-cert\") pod \"console-564c479f-7bglk\" (UID: \"9fd6eab3-bc4f-437c-ab20-8db15e2ec157\") " pod="openshift-console/console-564c479f-7bglk" Oct 14 13:24:44.571861 master-2 kubenswrapper[4762]: I1014 13:24:44.571841 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcwq2\" (UniqueName: \"kubernetes.io/projected/9fd6eab3-bc4f-437c-ab20-8db15e2ec157-kube-api-access-dcwq2\") pod \"console-564c479f-7bglk\" (UID: \"9fd6eab3-bc4f-437c-ab20-8db15e2ec157\") " pod="openshift-console/console-564c479f-7bglk" Oct 14 13:24:44.675029 master-2 kubenswrapper[4762]: I1014 13:24:44.674965 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9fd6eab3-bc4f-437c-ab20-8db15e2ec157-trusted-ca-bundle\") pod \"console-564c479f-7bglk\" (UID: \"9fd6eab3-bc4f-437c-ab20-8db15e2ec157\") " pod="openshift-console/console-564c479f-7bglk" Oct 14 13:24:44.675029 master-2 kubenswrapper[4762]: I1014 13:24:44.675031 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9fd6eab3-bc4f-437c-ab20-8db15e2ec157-console-oauth-config\") pod \"console-564c479f-7bglk\" (UID: \"9fd6eab3-bc4f-437c-ab20-8db15e2ec157\") " pod="openshift-console/console-564c479f-7bglk" Oct 14 13:24:44.675347 master-2 kubenswrapper[4762]: I1014 13:24:44.675078 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9fd6eab3-bc4f-437c-ab20-8db15e2ec157-oauth-serving-cert\") pod \"console-564c479f-7bglk\" (UID: \"9fd6eab3-bc4f-437c-ab20-8db15e2ec157\") " pod="openshift-console/console-564c479f-7bglk" Oct 14 13:24:44.675347 master-2 kubenswrapper[4762]: I1014 13:24:44.675107 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcwq2\" 
(UniqueName: \"kubernetes.io/projected/9fd6eab3-bc4f-437c-ab20-8db15e2ec157-kube-api-access-dcwq2\") pod \"console-564c479f-7bglk\" (UID: \"9fd6eab3-bc4f-437c-ab20-8db15e2ec157\") " pod="openshift-console/console-564c479f-7bglk" Oct 14 13:24:44.675347 master-2 kubenswrapper[4762]: I1014 13:24:44.675174 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9fd6eab3-bc4f-437c-ab20-8db15e2ec157-console-serving-cert\") pod \"console-564c479f-7bglk\" (UID: \"9fd6eab3-bc4f-437c-ab20-8db15e2ec157\") " pod="openshift-console/console-564c479f-7bglk" Oct 14 13:24:44.675347 master-2 kubenswrapper[4762]: I1014 13:24:44.675201 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9fd6eab3-bc4f-437c-ab20-8db15e2ec157-console-config\") pod \"console-564c479f-7bglk\" (UID: \"9fd6eab3-bc4f-437c-ab20-8db15e2ec157\") " pod="openshift-console/console-564c479f-7bglk" Oct 14 13:24:44.675347 master-2 kubenswrapper[4762]: I1014 13:24:44.675238 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9fd6eab3-bc4f-437c-ab20-8db15e2ec157-service-ca\") pod \"console-564c479f-7bglk\" (UID: \"9fd6eab3-bc4f-437c-ab20-8db15e2ec157\") " pod="openshift-console/console-564c479f-7bglk" Oct 14 13:24:44.676233 master-2 kubenswrapper[4762]: I1014 13:24:44.676210 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9fd6eab3-bc4f-437c-ab20-8db15e2ec157-service-ca\") pod \"console-564c479f-7bglk\" (UID: \"9fd6eab3-bc4f-437c-ab20-8db15e2ec157\") " pod="openshift-console/console-564c479f-7bglk" Oct 14 13:24:44.676292 master-2 kubenswrapper[4762]: I1014 13:24:44.676224 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9fd6eab3-bc4f-437c-ab20-8db15e2ec157-oauth-serving-cert\") pod \"console-564c479f-7bglk\" (UID: \"9fd6eab3-bc4f-437c-ab20-8db15e2ec157\") " pod="openshift-console/console-564c479f-7bglk" Oct 14 13:24:44.676429 master-2 kubenswrapper[4762]: I1014 13:24:44.676388 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9fd6eab3-bc4f-437c-ab20-8db15e2ec157-console-config\") pod \"console-564c479f-7bglk\" (UID: \"9fd6eab3-bc4f-437c-ab20-8db15e2ec157\") " pod="openshift-console/console-564c479f-7bglk" Oct 14 13:24:44.677397 master-2 kubenswrapper[4762]: I1014 13:24:44.677372 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9fd6eab3-bc4f-437c-ab20-8db15e2ec157-trusted-ca-bundle\") pod \"console-564c479f-7bglk\" (UID: \"9fd6eab3-bc4f-437c-ab20-8db15e2ec157\") " pod="openshift-console/console-564c479f-7bglk" Oct 14 13:24:44.686485 master-2 kubenswrapper[4762]: I1014 13:24:44.686440 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9fd6eab3-bc4f-437c-ab20-8db15e2ec157-console-oauth-config\") pod \"console-564c479f-7bglk\" (UID: \"9fd6eab3-bc4f-437c-ab20-8db15e2ec157\") " pod="openshift-console/console-564c479f-7bglk" Oct 14 13:24:44.686774 master-2 kubenswrapper[4762]: I1014 13:24:44.686742 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/9fd6eab3-bc4f-437c-ab20-8db15e2ec157-console-serving-cert\") pod \"console-564c479f-7bglk\" (UID: \"9fd6eab3-bc4f-437c-ab20-8db15e2ec157\") " pod="openshift-console/console-564c479f-7bglk" Oct 14 13:24:44.695906 master-2 kubenswrapper[4762]: I1014 13:24:44.695870 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcwq2\" (UniqueName: \"kubernetes.io/projected/9fd6eab3-bc4f-437c-ab20-8db15e2ec157-kube-api-access-dcwq2\") pod \"console-564c479f-7bglk\" (UID: \"9fd6eab3-bc4f-437c-ab20-8db15e2ec157\") " pod="openshift-console/console-564c479f-7bglk" Oct 14 13:24:44.735819 master-2 kubenswrapper[4762]: I1014 13:24:44.735759 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-564c479f-7bglk" Oct 14 13:24:45.148680 master-2 kubenswrapper[4762]: I1014 13:24:45.147451 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-564c479f-7bglk"] Oct 14 13:24:45.542127 master-2 kubenswrapper[4762]: I1014 13:24:45.542052 4762 generic.go:334] "Generic (PLEG): container finished" podID="d766696a-7ad4-4921-b799-c65f51b60109" containerID="572aea723a01273b8c89bc2d0322d30c45f4601cb9a1fb063f1ae4afb4dfc654" exitCode=0 Oct 14 13:24:45.542127 master-2 kubenswrapper[4762]: I1014 13:24:45.542095 4762 generic.go:334] "Generic (PLEG): container finished" podID="d766696a-7ad4-4921-b799-c65f51b60109" containerID="8cb6e7f57317f98acf1f86c1f4624231ddf922c2321b7ca8f5ee56a1e4c66274" exitCode=0 Oct 14 13:24:45.542127 master-2 kubenswrapper[4762]: I1014 13:24:45.542104 4762 generic.go:334] "Generic (PLEG): container finished" podID="d766696a-7ad4-4921-b799-c65f51b60109" containerID="3b512fead26f68b391af64674e86d0e7db890a4f2e2def8b93b1f890a60f4978" exitCode=0 Oct 14 13:24:45.542127 master-2 kubenswrapper[4762]: I1014 13:24:45.542116 4762 generic.go:334] "Generic (PLEG): container finished" podID="d766696a-7ad4-4921-b799-c65f51b60109" containerID="3bcfb13ff771400f754599e7faf9d389ed5a8a630307828d6aeb6fe4ccee077f" exitCode=0 Oct 14 13:24:45.543172 master-2 kubenswrapper[4762]: I1014 13:24:45.542213 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d766696a-7ad4-4921-b799-c65f51b60109","Type":"ContainerDied","Data":"572aea723a01273b8c89bc2d0322d30c45f4601cb9a1fb063f1ae4afb4dfc654"} Oct 14 13:24:45.543172 master-2 kubenswrapper[4762]: I1014 13:24:45.542248 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d766696a-7ad4-4921-b799-c65f51b60109","Type":"ContainerDied","Data":"8cb6e7f57317f98acf1f86c1f4624231ddf922c2321b7ca8f5ee56a1e4c66274"} Oct 14 13:24:45.543172 master-2 kubenswrapper[4762]: I1014 13:24:45.542263 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d766696a-7ad4-4921-b799-c65f51b60109","Type":"ContainerDied","Data":"3b512fead26f68b391af64674e86d0e7db890a4f2e2def8b93b1f890a60f4978"} Oct 14 13:24:45.543172 master-2 kubenswrapper[4762]: I1014 13:24:45.542280 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d766696a-7ad4-4921-b799-c65f51b60109","Type":"ContainerDied","Data":"3bcfb13ff771400f754599e7faf9d389ed5a8a630307828d6aeb6fe4ccee077f"} Oct 14 13:24:45.544627 master-2 kubenswrapper[4762]: I1014 13:24:45.544580 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-564c479f-7bglk" 
event={"ID":"9fd6eab3-bc4f-437c-ab20-8db15e2ec157","Type":"ContainerStarted","Data":"398c6089491af1099fcbbae03fb7e441a6ad3c7a03213abbb48b07dd3ca883b9"} Oct 14 13:24:45.544627 master-2 kubenswrapper[4762]: I1014 13:24:45.544616 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-564c479f-7bglk" event={"ID":"9fd6eab3-bc4f-437c-ab20-8db15e2ec157","Type":"ContainerStarted","Data":"641c438d98ddbe736332955ba68acf2c69e0edf0cb6be1aaf160c54a6e90e65e"} Oct 14 13:24:45.599551 master-2 kubenswrapper[4762]: I1014 13:24:45.599450 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-564c479f-7bglk" podStartSLOduration=1.599401709 podStartE2EDuration="1.599401709s" podCreationTimestamp="2025-10-14 13:24:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:24:45.59629844 +0000 UTC m=+1114.840457599" watchObservedRunningTime="2025-10-14 13:24:45.599401709 +0000 UTC m=+1114.843560868" Oct 14 13:24:45.698288 master-2 kubenswrapper[4762]: E1014 13:24:45.698118 4762 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3bcfb13ff771400f754599e7faf9d389ed5a8a630307828d6aeb6fe4ccee077f is running failed: container process not found" containerID="3bcfb13ff771400f754599e7faf9d389ed5a8a630307828d6aeb6fe4ccee077f" cmd=["sh","-c","if [ -x \"$(command -v curl)\" ]; then exec curl --fail http://localhost:9090/-/ready; elif [ -x \"$(command -v wget)\" ]; then exec wget -q -O /dev/null http://localhost:9090/-/ready; else exit 1; fi"] Oct 14 13:24:45.698717 master-2 kubenswrapper[4762]: E1014 13:24:45.698684 4762 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3bcfb13ff771400f754599e7faf9d389ed5a8a630307828d6aeb6fe4ccee077f is running failed: container process not found" containerID="3bcfb13ff771400f754599e7faf9d389ed5a8a630307828d6aeb6fe4ccee077f" cmd=["sh","-c","if [ -x \"$(command -v curl)\" ]; then exec curl --fail http://localhost:9090/-/ready; elif [ -x \"$(command -v wget)\" ]; then exec wget -q -O /dev/null http://localhost:9090/-/ready; else exit 1; fi"] Oct 14 13:24:45.699135 master-2 kubenswrapper[4762]: E1014 13:24:45.699064 4762 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3bcfb13ff771400f754599e7faf9d389ed5a8a630307828d6aeb6fe4ccee077f is running failed: container process not found" containerID="3bcfb13ff771400f754599e7faf9d389ed5a8a630307828d6aeb6fe4ccee077f" cmd=["sh","-c","if [ -x \"$(command -v curl)\" ]; then exec curl --fail http://localhost:9090/-/ready; elif [ -x \"$(command -v wget)\" ]; then exec wget -q -O /dev/null http://localhost:9090/-/ready; else exit 1; fi"] Oct 14 13:24:45.699217 master-2 kubenswrapper[4762]: E1014 13:24:45.699173 4762 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3bcfb13ff771400f754599e7faf9d389ed5a8a630307828d6aeb6fe4ccee077f is running failed: container process not found" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="d766696a-7ad4-4921-b799-c65f51b60109" containerName="prometheus" Oct 14 13:24:45.845822 master-2 kubenswrapper[4762]: I1014 13:24:45.845772 4762 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:24:45.901185 master-2 kubenswrapper[4762]: I1014 13:24:45.901116 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/d766696a-7ad4-4921-b799-c65f51b60109-prometheus-k8s-db\") pod \"d766696a-7ad4-4921-b799-c65f51b60109\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " Oct 14 13:24:45.906701 master-2 kubenswrapper[4762]: I1014 13:24:45.906609 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d766696a-7ad4-4921-b799-c65f51b60109-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "d766696a-7ad4-4921-b799-c65f51b60109" (UID: "d766696a-7ad4-4921-b799-c65f51b60109"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:24:46.002453 master-2 kubenswrapper[4762]: I1014 13:24:46.002330 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d766696a-7ad4-4921-b799-c65f51b60109-configmap-kubelet-serving-ca-bundle\") pod \"d766696a-7ad4-4921-b799-c65f51b60109\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " Oct 14 13:24:46.002453 master-2 kubenswrapper[4762]: I1014 13:24:46.002381 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d766696a-7ad4-4921-b799-c65f51b60109-thanos-prometheus-http-client-file\") pod \"d766696a-7ad4-4921-b799-c65f51b60109\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " Oct 14 13:24:46.002453 master-2 kubenswrapper[4762]: I1014 13:24:46.002436 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d766696a-7ad4-4921-b799-c65f51b60109-prometheus-k8s-rulefiles-0\") pod \"d766696a-7ad4-4921-b799-c65f51b60109\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " Oct 14 13:24:46.002726 master-2 kubenswrapper[4762]: I1014 13:24:46.002467 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9xgk\" (UniqueName: \"kubernetes.io/projected/d766696a-7ad4-4921-b799-c65f51b60109-kube-api-access-c9xgk\") pod \"d766696a-7ad4-4921-b799-c65f51b60109\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " Oct 14 13:24:46.002726 master-2 kubenswrapper[4762]: I1014 13:24:46.002502 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d766696a-7ad4-4921-b799-c65f51b60109-secret-grpc-tls\") pod \"d766696a-7ad4-4921-b799-c65f51b60109\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " Oct 14 13:24:46.002726 master-2 kubenswrapper[4762]: I1014 13:24:46.002524 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d766696a-7ad4-4921-b799-c65f51b60109-configmap-metrics-client-ca\") pod \"d766696a-7ad4-4921-b799-c65f51b60109\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " Oct 14 13:24:46.002726 master-2 kubenswrapper[4762]: I1014 13:24:46.002549 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: 
\"kubernetes.io/secret/d766696a-7ad4-4921-b799-c65f51b60109-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"d766696a-7ad4-4921-b799-c65f51b60109\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " Oct 14 13:24:46.002726 master-2 kubenswrapper[4762]: I1014 13:24:46.002573 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/d766696a-7ad4-4921-b799-c65f51b60109-secret-metrics-client-certs\") pod \"d766696a-7ad4-4921-b799-c65f51b60109\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " Oct 14 13:24:46.002726 master-2 kubenswrapper[4762]: I1014 13:24:46.002595 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d766696a-7ad4-4921-b799-c65f51b60109-tls-assets\") pod \"d766696a-7ad4-4921-b799-c65f51b60109\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " Oct 14 13:24:46.002726 master-2 kubenswrapper[4762]: I1014 13:24:46.002623 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d766696a-7ad4-4921-b799-c65f51b60109-prometheus-trusted-ca-bundle\") pod \"d766696a-7ad4-4921-b799-c65f51b60109\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " Oct 14 13:24:46.002726 master-2 kubenswrapper[4762]: I1014 13:24:46.002649 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d766696a-7ad4-4921-b799-c65f51b60109-secret-kube-rbac-proxy\") pod \"d766696a-7ad4-4921-b799-c65f51b60109\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " Oct 14 13:24:46.002726 master-2 kubenswrapper[4762]: I1014 13:24:46.002673 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/d766696a-7ad4-4921-b799-c65f51b60109-secret-prometheus-k8s-tls\") pod \"d766696a-7ad4-4921-b799-c65f51b60109\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " Oct 14 13:24:46.002726 master-2 kubenswrapper[4762]: I1014 13:24:46.002694 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d766696a-7ad4-4921-b799-c65f51b60109-config\") pod \"d766696a-7ad4-4921-b799-c65f51b60109\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " Oct 14 13:24:46.002726 master-2 kubenswrapper[4762]: I1014 13:24:46.002718 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d766696a-7ad4-4921-b799-c65f51b60109-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"d766696a-7ad4-4921-b799-c65f51b60109\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " Oct 14 13:24:46.003664 master-2 kubenswrapper[4762]: I1014 13:24:46.002741 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d766696a-7ad4-4921-b799-c65f51b60109-config-out\") pod \"d766696a-7ad4-4921-b799-c65f51b60109\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " Oct 14 13:24:46.003664 master-2 kubenswrapper[4762]: I1014 13:24:46.002762 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d766696a-7ad4-4921-b799-c65f51b60109-web-config\") pod \"d766696a-7ad4-4921-b799-c65f51b60109\" (UID: 
\"d766696a-7ad4-4921-b799-c65f51b60109\") " Oct 14 13:24:46.003664 master-2 kubenswrapper[4762]: I1014 13:24:46.002788 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d766696a-7ad4-4921-b799-c65f51b60109-configmap-serving-certs-ca-bundle\") pod \"d766696a-7ad4-4921-b799-c65f51b60109\" (UID: \"d766696a-7ad4-4921-b799-c65f51b60109\") " Oct 14 13:24:46.003664 master-2 kubenswrapper[4762]: I1014 13:24:46.003036 4762 reconciler_common.go:293] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/d766696a-7ad4-4921-b799-c65f51b60109-prometheus-k8s-db\") on node \"master-2\" DevicePath \"\"" Oct 14 13:24:46.003664 master-2 kubenswrapper[4762]: I1014 13:24:46.003318 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d766696a-7ad4-4921-b799-c65f51b60109-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "d766696a-7ad4-4921-b799-c65f51b60109" (UID: "d766696a-7ad4-4921-b799-c65f51b60109"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:24:46.004083 master-2 kubenswrapper[4762]: I1014 13:24:46.004065 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d766696a-7ad4-4921-b799-c65f51b60109-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "d766696a-7ad4-4921-b799-c65f51b60109" (UID: "d766696a-7ad4-4921-b799-c65f51b60109"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:24:46.005339 master-2 kubenswrapper[4762]: I1014 13:24:46.005243 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d766696a-7ad4-4921-b799-c65f51b60109-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "d766696a-7ad4-4921-b799-c65f51b60109" (UID: "d766696a-7ad4-4921-b799-c65f51b60109"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:24:46.005804 master-2 kubenswrapper[4762]: I1014 13:24:46.005746 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d766696a-7ad4-4921-b799-c65f51b60109-kube-api-access-c9xgk" (OuterVolumeSpecName: "kube-api-access-c9xgk") pod "d766696a-7ad4-4921-b799-c65f51b60109" (UID: "d766696a-7ad4-4921-b799-c65f51b60109"). InnerVolumeSpecName "kube-api-access-c9xgk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:24:46.005915 master-2 kubenswrapper[4762]: I1014 13:24:46.005845 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d766696a-7ad4-4921-b799-c65f51b60109-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "d766696a-7ad4-4921-b799-c65f51b60109" (UID: "d766696a-7ad4-4921-b799-c65f51b60109"). InnerVolumeSpecName "thanos-prometheus-http-client-file". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:24:46.006709 master-2 kubenswrapper[4762]: I1014 13:24:46.006668 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d766696a-7ad4-4921-b799-c65f51b60109-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "d766696a-7ad4-4921-b799-c65f51b60109" (UID: "d766696a-7ad4-4921-b799-c65f51b60109"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:24:46.007033 master-2 kubenswrapper[4762]: I1014 13:24:46.006949 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d766696a-7ad4-4921-b799-c65f51b60109-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "d766696a-7ad4-4921-b799-c65f51b60109" (UID: "d766696a-7ad4-4921-b799-c65f51b60109"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:24:46.007090 master-2 kubenswrapper[4762]: I1014 13:24:46.007064 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d766696a-7ad4-4921-b799-c65f51b60109-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "d766696a-7ad4-4921-b799-c65f51b60109" (UID: "d766696a-7ad4-4921-b799-c65f51b60109"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:24:46.007384 master-2 kubenswrapper[4762]: I1014 13:24:46.007355 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d766696a-7ad4-4921-b799-c65f51b60109-config-out" (OuterVolumeSpecName: "config-out") pod "d766696a-7ad4-4921-b799-c65f51b60109" (UID: "d766696a-7ad4-4921-b799-c65f51b60109"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:24:46.007593 master-2 kubenswrapper[4762]: I1014 13:24:46.007573 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d766696a-7ad4-4921-b799-c65f51b60109-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "d766696a-7ad4-4921-b799-c65f51b60109" (UID: "d766696a-7ad4-4921-b799-c65f51b60109"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:24:46.007857 master-2 kubenswrapper[4762]: I1014 13:24:46.007833 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d766696a-7ad4-4921-b799-c65f51b60109-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "d766696a-7ad4-4921-b799-c65f51b60109" (UID: "d766696a-7ad4-4921-b799-c65f51b60109"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:24:46.007978 master-2 kubenswrapper[4762]: I1014 13:24:46.007873 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d766696a-7ad4-4921-b799-c65f51b60109-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "d766696a-7ad4-4921-b799-c65f51b60109" (UID: "d766696a-7ad4-4921-b799-c65f51b60109"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:24:46.008133 master-2 kubenswrapper[4762]: I1014 13:24:46.008102 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d766696a-7ad4-4921-b799-c65f51b60109-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "d766696a-7ad4-4921-b799-c65f51b60109" (UID: "d766696a-7ad4-4921-b799-c65f51b60109"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:24:46.008639 master-2 kubenswrapper[4762]: I1014 13:24:46.008610 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d766696a-7ad4-4921-b799-c65f51b60109-config" (OuterVolumeSpecName: "config") pod "d766696a-7ad4-4921-b799-c65f51b60109" (UID: "d766696a-7ad4-4921-b799-c65f51b60109"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:24:46.008899 master-2 kubenswrapper[4762]: I1014 13:24:46.008862 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d766696a-7ad4-4921-b799-c65f51b60109-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "d766696a-7ad4-4921-b799-c65f51b60109" (UID: "d766696a-7ad4-4921-b799-c65f51b60109"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:24:46.014889 master-2 kubenswrapper[4762]: I1014 13:24:46.014292 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d766696a-7ad4-4921-b799-c65f51b60109-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "d766696a-7ad4-4921-b799-c65f51b60109" (UID: "d766696a-7ad4-4921-b799-c65f51b60109"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:24:46.048738 master-2 kubenswrapper[4762]: I1014 13:24:46.048656 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d766696a-7ad4-4921-b799-c65f51b60109-web-config" (OuterVolumeSpecName: "web-config") pod "d766696a-7ad4-4921-b799-c65f51b60109" (UID: "d766696a-7ad4-4921-b799-c65f51b60109"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:24:46.104123 master-2 kubenswrapper[4762]: I1014 13:24:46.104047 4762 reconciler_common.go:293] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/d766696a-7ad4-4921-b799-c65f51b60109-secret-prometheus-k8s-tls\") on node \"master-2\" DevicePath \"\"" Oct 14 13:24:46.104123 master-2 kubenswrapper[4762]: I1014 13:24:46.104103 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/d766696a-7ad4-4921-b799-c65f51b60109-config\") on node \"master-2\" DevicePath \"\"" Oct 14 13:24:46.104123 master-2 kubenswrapper[4762]: I1014 13:24:46.104117 4762 reconciler_common.go:293] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d766696a-7ad4-4921-b799-c65f51b60109-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"master-2\" DevicePath \"\"" Oct 14 13:24:46.104123 master-2 kubenswrapper[4762]: I1014 13:24:46.104136 4762 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d766696a-7ad4-4921-b799-c65f51b60109-config-out\") on node \"master-2\" DevicePath \"\"" Oct 14 13:24:46.104502 master-2 kubenswrapper[4762]: I1014 13:24:46.104150 4762 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d766696a-7ad4-4921-b799-c65f51b60109-web-config\") on node \"master-2\" DevicePath \"\"" Oct 14 13:24:46.104502 master-2 kubenswrapper[4762]: I1014 13:24:46.104180 4762 reconciler_common.go:293] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d766696a-7ad4-4921-b799-c65f51b60109-configmap-serving-certs-ca-bundle\") on node \"master-2\" DevicePath \"\"" Oct 14 13:24:46.104502 master-2 kubenswrapper[4762]: I1014 13:24:46.104194 4762 reconciler_common.go:293] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d766696a-7ad4-4921-b799-c65f51b60109-configmap-kubelet-serving-ca-bundle\") on node \"master-2\" DevicePath \"\"" Oct 14 13:24:46.104502 master-2 kubenswrapper[4762]: I1014 13:24:46.104207 4762 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d766696a-7ad4-4921-b799-c65f51b60109-thanos-prometheus-http-client-file\") on node \"master-2\" DevicePath \"\"" Oct 14 13:24:46.104502 master-2 kubenswrapper[4762]: I1014 13:24:46.104221 4762 reconciler_common.go:293] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d766696a-7ad4-4921-b799-c65f51b60109-prometheus-k8s-rulefiles-0\") on node \"master-2\" DevicePath \"\"" Oct 14 13:24:46.104502 master-2 kubenswrapper[4762]: I1014 13:24:46.104232 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9xgk\" (UniqueName: \"kubernetes.io/projected/d766696a-7ad4-4921-b799-c65f51b60109-kube-api-access-c9xgk\") on node \"master-2\" DevicePath \"\"" Oct 14 13:24:46.104502 master-2 kubenswrapper[4762]: I1014 13:24:46.104244 4762 reconciler_common.go:293] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d766696a-7ad4-4921-b799-c65f51b60109-secret-grpc-tls\") on node \"master-2\" DevicePath \"\"" Oct 14 13:24:46.104502 master-2 kubenswrapper[4762]: I1014 13:24:46.104255 4762 reconciler_common.go:293] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/d766696a-7ad4-4921-b799-c65f51b60109-configmap-metrics-client-ca\") on node \"master-2\" DevicePath \"\"" Oct 14 13:24:46.104502 master-2 kubenswrapper[4762]: I1014 13:24:46.104271 4762 reconciler_common.go:293] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/d766696a-7ad4-4921-b799-c65f51b60109-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"master-2\" DevicePath \"\"" Oct 14 13:24:46.104502 master-2 kubenswrapper[4762]: I1014 13:24:46.104283 4762 reconciler_common.go:293] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/d766696a-7ad4-4921-b799-c65f51b60109-secret-metrics-client-certs\") on node \"master-2\" DevicePath \"\"" Oct 14 13:24:46.104502 master-2 kubenswrapper[4762]: I1014 13:24:46.104294 4762 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d766696a-7ad4-4921-b799-c65f51b60109-tls-assets\") on node \"master-2\" DevicePath \"\"" Oct 14 13:24:46.104502 master-2 kubenswrapper[4762]: I1014 13:24:46.104307 4762 reconciler_common.go:293] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d766696a-7ad4-4921-b799-c65f51b60109-prometheus-trusted-ca-bundle\") on node \"master-2\" DevicePath \"\"" Oct 14 13:24:46.104502 master-2 kubenswrapper[4762]: I1014 13:24:46.104322 4762 reconciler_common.go:293] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d766696a-7ad4-4921-b799-c65f51b60109-secret-kube-rbac-proxy\") on node \"master-2\" DevicePath \"\"" Oct 14 13:24:46.559608 master-2 kubenswrapper[4762]: I1014 13:24:46.559495 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d766696a-7ad4-4921-b799-c65f51b60109","Type":"ContainerDied","Data":"5ae154f5dd833b785fba37d75186a43bfdf1782fe12b291a92538dd025614985"} Oct 14 13:24:46.559608 master-2 kubenswrapper[4762]: I1014 13:24:46.559591 4762 scope.go:117] "RemoveContainer" containerID="572aea723a01273b8c89bc2d0322d30c45f4601cb9a1fb063f1ae4afb4dfc654" Oct 14 13:24:46.560661 master-2 kubenswrapper[4762]: I1014 13:24:46.559597 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:24:46.583654 master-2 kubenswrapper[4762]: I1014 13:24:46.583612 4762 scope.go:117] "RemoveContainer" containerID="8cb6e7f57317f98acf1f86c1f4624231ddf922c2321b7ca8f5ee56a1e4c66274" Oct 14 13:24:46.602977 master-2 kubenswrapper[4762]: I1014 13:24:46.602928 4762 scope.go:117] "RemoveContainer" containerID="9a307c52f472dbea0756ef79dc081e7c7f818a13fbf1f1667281a7a4465acb42" Oct 14 13:24:46.620554 master-2 kubenswrapper[4762]: I1014 13:24:46.620510 4762 scope.go:117] "RemoveContainer" containerID="68586ae70faec91fa7469971b38394705160337f7f18cb0eff9978037eda6496" Oct 14 13:24:46.638107 master-2 kubenswrapper[4762]: I1014 13:24:46.637260 4762 scope.go:117] "RemoveContainer" containerID="3b512fead26f68b391af64674e86d0e7db890a4f2e2def8b93b1f890a60f4978" Oct 14 13:24:46.654024 master-2 kubenswrapper[4762]: I1014 13:24:46.653908 4762 scope.go:117] "RemoveContainer" containerID="3bcfb13ff771400f754599e7faf9d389ed5a8a630307828d6aeb6fe4ccee077f" Oct 14 13:24:46.673985 master-2 kubenswrapper[4762]: I1014 13:24:46.673924 4762 scope.go:117] "RemoveContainer" containerID="ae7cfe6e99c05c0f56eb6f0718dd8851e4401afd6099b55e40eb235f29c7b3ce" Oct 14 13:24:46.849755 master-2 kubenswrapper[4762]: I1014 13:24:46.849595 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Oct 14 13:24:47.007788 master-2 kubenswrapper[4762]: I1014 13:24:47.007675 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Oct 14 13:24:47.075263 master-2 kubenswrapper[4762]: I1014 13:24:47.075201 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Oct 14 13:24:47.075475 master-2 kubenswrapper[4762]: E1014 13:24:47.075438 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d766696a-7ad4-4921-b799-c65f51b60109" containerName="init-config-reloader" Oct 14 13:24:47.075475 master-2 kubenswrapper[4762]: I1014 13:24:47.075454 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="d766696a-7ad4-4921-b799-c65f51b60109" containerName="init-config-reloader" Oct 14 13:24:47.075475 master-2 kubenswrapper[4762]: E1014 13:24:47.075468 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d766696a-7ad4-4921-b799-c65f51b60109" containerName="thanos-sidecar" Oct 14 13:24:47.075475 master-2 kubenswrapper[4762]: I1014 13:24:47.075478 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="d766696a-7ad4-4921-b799-c65f51b60109" containerName="thanos-sidecar" Oct 14 13:24:47.075625 master-2 kubenswrapper[4762]: E1014 13:24:47.075494 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d766696a-7ad4-4921-b799-c65f51b60109" containerName="prometheus" Oct 14 13:24:47.075625 master-2 kubenswrapper[4762]: I1014 13:24:47.075506 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="d766696a-7ad4-4921-b799-c65f51b60109" containerName="prometheus" Oct 14 13:24:47.075625 master-2 kubenswrapper[4762]: E1014 13:24:47.075518 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d766696a-7ad4-4921-b799-c65f51b60109" containerName="kube-rbac-proxy-thanos" Oct 14 13:24:47.075625 master-2 kubenswrapper[4762]: I1014 13:24:47.075526 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="d766696a-7ad4-4921-b799-c65f51b60109" containerName="kube-rbac-proxy-thanos" Oct 14 13:24:47.075625 master-2 kubenswrapper[4762]: E1014 13:24:47.075537 4762 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="d766696a-7ad4-4921-b799-c65f51b60109" containerName="kube-rbac-proxy-web" Oct 14 13:24:47.075625 master-2 kubenswrapper[4762]: I1014 13:24:47.075545 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="d766696a-7ad4-4921-b799-c65f51b60109" containerName="kube-rbac-proxy-web" Oct 14 13:24:47.075625 master-2 kubenswrapper[4762]: E1014 13:24:47.075558 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d766696a-7ad4-4921-b799-c65f51b60109" containerName="kube-rbac-proxy" Oct 14 13:24:47.075625 master-2 kubenswrapper[4762]: I1014 13:24:47.075566 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="d766696a-7ad4-4921-b799-c65f51b60109" containerName="kube-rbac-proxy" Oct 14 13:24:47.075625 master-2 kubenswrapper[4762]: E1014 13:24:47.075584 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d766696a-7ad4-4921-b799-c65f51b60109" containerName="config-reloader" Oct 14 13:24:47.075625 master-2 kubenswrapper[4762]: I1014 13:24:47.075592 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="d766696a-7ad4-4921-b799-c65f51b60109" containerName="config-reloader" Oct 14 13:24:47.075897 master-2 kubenswrapper[4762]: I1014 13:24:47.075692 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="d766696a-7ad4-4921-b799-c65f51b60109" containerName="thanos-sidecar" Oct 14 13:24:47.075897 master-2 kubenswrapper[4762]: I1014 13:24:47.075707 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="d766696a-7ad4-4921-b799-c65f51b60109" containerName="kube-rbac-proxy-thanos" Oct 14 13:24:47.075897 master-2 kubenswrapper[4762]: I1014 13:24:47.075724 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="d766696a-7ad4-4921-b799-c65f51b60109" containerName="prometheus" Oct 14 13:24:47.075897 master-2 kubenswrapper[4762]: I1014 13:24:47.075738 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="d766696a-7ad4-4921-b799-c65f51b60109" containerName="config-reloader" Oct 14 13:24:47.075897 master-2 kubenswrapper[4762]: I1014 13:24:47.075751 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="d766696a-7ad4-4921-b799-c65f51b60109" containerName="kube-rbac-proxy-web" Oct 14 13:24:47.075897 master-2 kubenswrapper[4762]: I1014 13:24:47.075762 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="d766696a-7ad4-4921-b799-c65f51b60109" containerName="kube-rbac-proxy" Oct 14 13:24:47.077560 master-2 kubenswrapper[4762]: I1014 13:24:47.077524 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:24:47.081898 master-2 kubenswrapper[4762]: I1014 13:24:47.081854 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls" Oct 14 13:24:47.084192 master-2 kubenswrapper[4762]: I1014 13:24:47.084117 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0" Oct 14 13:24:47.084510 master-2 kubenswrapper[4762]: I1014 13:24:47.084327 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls" Oct 14 13:24:47.084510 master-2 kubenswrapper[4762]: I1014 13:24:47.084444 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy" Oct 14 13:24:47.085255 master-2 kubenswrapper[4762]: I1014 13:24:47.084634 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-8klgi7r2728qp" Oct 14 13:24:47.085255 master-2 kubenswrapper[4762]: I1014 13:24:47.084756 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config" Oct 14 13:24:47.085730 master-2 kubenswrapper[4762]: I1014 13:24:47.085378 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle" Oct 14 13:24:47.085730 master-2 kubenswrapper[4762]: I1014 13:24:47.085602 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-dockercfg-dzg65" Oct 14 13:24:47.085791 master-2 kubenswrapper[4762]: I1014 13:24:47.085744 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s" Oct 14 13:24:47.085955 master-2 kubenswrapper[4762]: I1014 13:24:47.085915 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file" Oct 14 13:24:47.086062 master-2 kubenswrapper[4762]: I1014 13:24:47.086039 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web" Oct 14 13:24:47.089690 master-2 kubenswrapper[4762]: I1014 13:24:47.089329 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0" Oct 14 13:24:47.090947 master-2 kubenswrapper[4762]: I1014 13:24:47.090926 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle" Oct 14 13:24:47.217250 master-2 kubenswrapper[4762]: I1014 13:24:47.217200 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80-config-out\") pod \"prometheus-k8s-0\" (UID: \"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:24:47.217250 master-2 kubenswrapper[4762]: I1014 13:24:47.217253 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:24:47.217470 master-2 kubenswrapper[4762]: I1014 13:24:47.217292 4762 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:24:47.217470 master-2 kubenswrapper[4762]: I1014 13:24:47.217422 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:24:47.217541 master-2 kubenswrapper[4762]: I1014 13:24:47.217467 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:24:47.217541 master-2 kubenswrapper[4762]: I1014 13:24:47.217499 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:24:47.217541 master-2 kubenswrapper[4762]: I1014 13:24:47.217525 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80-web-config\") pod \"prometheus-k8s-0\" (UID: \"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:24:47.217842 master-2 kubenswrapper[4762]: I1014 13:24:47.217550 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:24:47.217887 master-2 kubenswrapper[4762]: I1014 13:24:47.217687 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:24:47.217956 master-2 kubenswrapper[4762]: I1014 13:24:47.217934 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:24:47.217993 master-2 kubenswrapper[4762]: I1014 13:24:47.217976 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" 
(UniqueName: \"kubernetes.io/secret/9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:24:47.218023 master-2 kubenswrapper[4762]: I1014 13:24:47.218006 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:24:47.218098 master-2 kubenswrapper[4762]: I1014 13:24:47.218072 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:24:47.218140 master-2 kubenswrapper[4762]: I1014 13:24:47.218116 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tm9j9\" (UniqueName: \"kubernetes.io/projected/9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80-kube-api-access-tm9j9\") pod \"prometheus-k8s-0\" (UID: \"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:24:47.218326 master-2 kubenswrapper[4762]: I1014 13:24:47.218141 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:24:47.218421 master-2 kubenswrapper[4762]: I1014 13:24:47.218399 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80-config\") pod \"prometheus-k8s-0\" (UID: \"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:24:47.218471 master-2 kubenswrapper[4762]: I1014 13:24:47.218437 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:24:47.218508 master-2 kubenswrapper[4762]: I1014 13:24:47.218485 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:24:47.225753 master-2 kubenswrapper[4762]: I1014 13:24:47.225714 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Oct 14 13:24:47.319558 master-2 kubenswrapper[4762]: I1014 13:24:47.319439 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: 
\"kubernetes.io/secret/9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:24:47.319558 master-2 kubenswrapper[4762]: I1014 13:24:47.319534 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:24:47.319558 master-2 kubenswrapper[4762]: I1014 13:24:47.319562 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:24:47.319954 master-2 kubenswrapper[4762]: I1014 13:24:47.319590 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:24:47.319954 master-2 kubenswrapper[4762]: I1014 13:24:47.319621 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80-web-config\") pod \"prometheus-k8s-0\" (UID: \"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:24:47.319954 master-2 kubenswrapper[4762]: I1014 13:24:47.319647 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:24:47.319954 master-2 kubenswrapper[4762]: I1014 13:24:47.319704 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:24:47.319954 master-2 kubenswrapper[4762]: I1014 13:24:47.319733 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:24:47.319954 master-2 kubenswrapper[4762]: I1014 13:24:47.319757 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:24:47.319954 master-2 kubenswrapper[4762]: I1014 13:24:47.319780 4762 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:24:47.319954 master-2 kubenswrapper[4762]: I1014 13:24:47.319808 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:24:47.319954 master-2 kubenswrapper[4762]: I1014 13:24:47.319832 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tm9j9\" (UniqueName: \"kubernetes.io/projected/9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80-kube-api-access-tm9j9\") pod \"prometheus-k8s-0\" (UID: \"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:24:47.319954 master-2 kubenswrapper[4762]: I1014 13:24:47.319861 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:24:47.319954 master-2 kubenswrapper[4762]: I1014 13:24:47.319908 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80-config\") pod \"prometheus-k8s-0\" (UID: \"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:24:47.319954 master-2 kubenswrapper[4762]: I1014 13:24:47.319930 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:24:47.319954 master-2 kubenswrapper[4762]: I1014 13:24:47.319955 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:24:47.320608 master-2 kubenswrapper[4762]: I1014 13:24:47.319983 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80-config-out\") pod \"prometheus-k8s-0\" (UID: \"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:24:47.320608 master-2 kubenswrapper[4762]: I1014 13:24:47.320007 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80\") " 
pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:24:47.322543 master-2 kubenswrapper[4762]: I1014 13:24:47.322488 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:24:47.322764 master-2 kubenswrapper[4762]: I1014 13:24:47.322555 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:24:47.322764 master-2 kubenswrapper[4762]: I1014 13:24:47.322555 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:24:47.322881 master-2 kubenswrapper[4762]: I1014 13:24:47.322757 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:24:47.323119 master-2 kubenswrapper[4762]: I1014 13:24:47.323082 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:24:47.325166 master-2 kubenswrapper[4762]: I1014 13:24:47.325096 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:24:47.325701 master-2 kubenswrapper[4762]: I1014 13:24:47.325457 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80-config\") pod \"prometheus-k8s-0\" (UID: \"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:24:47.325701 master-2 kubenswrapper[4762]: I1014 13:24:47.325474 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80-web-config\") pod \"prometheus-k8s-0\" (UID: \"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:24:47.326437 master-2 kubenswrapper[4762]: I1014 13:24:47.326408 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: 
\"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:24:47.326863 master-2 kubenswrapper[4762]: I1014 13:24:47.326757 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:24:47.327469 master-2 kubenswrapper[4762]: I1014 13:24:47.327432 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:24:47.327664 master-2 kubenswrapper[4762]: I1014 13:24:47.327562 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:24:47.327799 master-2 kubenswrapper[4762]: I1014 13:24:47.327756 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:24:47.327995 master-2 kubenswrapper[4762]: I1014 13:24:47.327955 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80-config-out\") pod \"prometheus-k8s-0\" (UID: \"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:24:47.328512 master-2 kubenswrapper[4762]: I1014 13:24:47.328474 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:24:47.330906 master-2 kubenswrapper[4762]: I1014 13:24:47.330849 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:24:47.332398 master-2 kubenswrapper[4762]: I1014 13:24:47.332357 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:24:47.487670 master-2 kubenswrapper[4762]: I1014 13:24:47.487453 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tm9j9\" (UniqueName: \"kubernetes.io/projected/9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80-kube-api-access-tm9j9\") pod 
\"prometheus-k8s-0\" (UID: \"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80\") " pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:24:47.557735 master-2 kubenswrapper[4762]: I1014 13:24:47.557656 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d766696a-7ad4-4921-b799-c65f51b60109" path="/var/lib/kubelet/pods/d766696a-7ad4-4921-b799-c65f51b60109/volumes" Oct 14 13:24:47.697691 master-2 kubenswrapper[4762]: I1014 13:24:47.697625 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:24:48.130469 master-2 kubenswrapper[4762]: I1014 13:24:48.130410 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Oct 14 13:24:48.135134 master-2 kubenswrapper[4762]: W1014 13:24:48.135082 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c6adb4c_bde4_41b5_a5e7_225aa1d7ef80.slice/crio-b14d0fbc15f2033664fb5f556d514ffd634de3c0ba3739157746a5e7f64fa68a WatchSource:0}: Error finding container b14d0fbc15f2033664fb5f556d514ffd634de3c0ba3739157746a5e7f64fa68a: Status 404 returned error can't find the container with id b14d0fbc15f2033664fb5f556d514ffd634de3c0ba3739157746a5e7f64fa68a Oct 14 13:24:48.585886 master-2 kubenswrapper[4762]: I1014 13:24:48.585822 4762 generic.go:334] "Generic (PLEG): container finished" podID="9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80" containerID="15557e4de5f004be938a16716e51a202af85301bb51be6821c3bdd3a1e9fc2ea" exitCode=0 Oct 14 13:24:48.585886 master-2 kubenswrapper[4762]: I1014 13:24:48.585890 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80","Type":"ContainerDied","Data":"15557e4de5f004be938a16716e51a202af85301bb51be6821c3bdd3a1e9fc2ea"} Oct 14 13:24:48.586362 master-2 kubenswrapper[4762]: I1014 13:24:48.585933 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80","Type":"ContainerStarted","Data":"b14d0fbc15f2033664fb5f556d514ffd634de3c0ba3739157746a5e7f64fa68a"} Oct 14 13:24:49.595020 master-2 kubenswrapper[4762]: I1014 13:24:49.594826 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80","Type":"ContainerStarted","Data":"eecfae9019ee16b5582f4ad04b34ec47c6645f24cc37b3a0bb50f1bb4cd04c86"} Oct 14 13:24:49.595020 master-2 kubenswrapper[4762]: I1014 13:24:49.594935 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80","Type":"ContainerStarted","Data":"8b1e9bec3e4ac77f097c1bbee0f230a09140a30851b16c1f6ffe7acf7d56326c"} Oct 14 13:24:49.595020 master-2 kubenswrapper[4762]: I1014 13:24:49.594961 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80","Type":"ContainerStarted","Data":"ed45b0b19823f5891fc9ca7d2b94ca345da5e8f2b8d6cacec6457e32edc289dc"} Oct 14 13:24:49.595020 master-2 kubenswrapper[4762]: I1014 13:24:49.594982 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80","Type":"ContainerStarted","Data":"30d70c0de0d2c43ec4c76db3bf734b618af896954b1edbeb52570749b6ca0d61"} Oct 14 13:24:50.610906 master-2 
kubenswrapper[4762]: I1014 13:24:50.610769 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80","Type":"ContainerStarted","Data":"dab8c453c6a871662f7a309bb2e7d7a000757170fb77e4f777b2e7f1c3d641bf"} Oct 14 13:24:50.610906 master-2 kubenswrapper[4762]: I1014 13:24:50.610839 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80","Type":"ContainerStarted","Data":"f68c78eac407936446625f044b539475fa49fba16d3537ecdb82d5af085e19de"} Oct 14 13:24:50.678468 master-2 kubenswrapper[4762]: I1014 13:24:50.676041 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=3.6760108860000003 podStartE2EDuration="3.676010886s" podCreationTimestamp="2025-10-14 13:24:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:24:50.670681305 +0000 UTC m=+1119.914840524" watchObservedRunningTime="2025-10-14 13:24:50.676010886 +0000 UTC m=+1119.920170085" Oct 14 13:24:52.698906 master-2 kubenswrapper[4762]: I1014 13:24:52.698784 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:24:54.737649 master-2 kubenswrapper[4762]: I1014 13:24:54.737395 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-564c479f-7bglk" Oct 14 13:24:54.737649 master-2 kubenswrapper[4762]: I1014 13:24:54.737469 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-564c479f-7bglk" Oct 14 13:24:54.742080 master-2 kubenswrapper[4762]: I1014 13:24:54.742033 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-564c479f-7bglk" Oct 14 13:24:55.660200 master-2 kubenswrapper[4762]: I1014 13:24:55.660115 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-564c479f-7bglk" Oct 14 13:25:01.218752 master-2 kubenswrapper[4762]: I1014 13:25:01.218654 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-6-master-2"] Oct 14 13:25:01.221037 master-2 kubenswrapper[4762]: I1014 13:25:01.220580 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-6-master-2" Oct 14 13:25:01.225267 master-2 kubenswrapper[4762]: I1014 13:25:01.225213 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-sdwrm" Oct 14 13:25:01.234875 master-2 kubenswrapper[4762]: I1014 13:25:01.234760 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-6-master-2"] Oct 14 13:25:01.330273 master-2 kubenswrapper[4762]: I1014 13:25:01.327007 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3df39991-f6e7-40dd-b303-eee090ebced0-kubelet-dir\") pod \"installer-6-master-2\" (UID: \"3df39991-f6e7-40dd-b303-eee090ebced0\") " pod="openshift-kube-controller-manager/installer-6-master-2" Oct 14 13:25:01.330273 master-2 kubenswrapper[4762]: I1014 13:25:01.327095 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3df39991-f6e7-40dd-b303-eee090ebced0-var-lock\") pod \"installer-6-master-2\" (UID: \"3df39991-f6e7-40dd-b303-eee090ebced0\") " pod="openshift-kube-controller-manager/installer-6-master-2" Oct 14 13:25:01.330273 master-2 kubenswrapper[4762]: I1014 13:25:01.327396 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3df39991-f6e7-40dd-b303-eee090ebced0-kube-api-access\") pod \"installer-6-master-2\" (UID: \"3df39991-f6e7-40dd-b303-eee090ebced0\") " pod="openshift-kube-controller-manager/installer-6-master-2" Oct 14 13:25:01.429186 master-2 kubenswrapper[4762]: I1014 13:25:01.429045 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3df39991-f6e7-40dd-b303-eee090ebced0-kubelet-dir\") pod \"installer-6-master-2\" (UID: \"3df39991-f6e7-40dd-b303-eee090ebced0\") " pod="openshift-kube-controller-manager/installer-6-master-2" Oct 14 13:25:01.429186 master-2 kubenswrapper[4762]: I1014 13:25:01.429184 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3df39991-f6e7-40dd-b303-eee090ebced0-var-lock\") pod \"installer-6-master-2\" (UID: \"3df39991-f6e7-40dd-b303-eee090ebced0\") " pod="openshift-kube-controller-manager/installer-6-master-2" Oct 14 13:25:01.429571 master-2 kubenswrapper[4762]: I1014 13:25:01.429292 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3df39991-f6e7-40dd-b303-eee090ebced0-kube-api-access\") pod \"installer-6-master-2\" (UID: \"3df39991-f6e7-40dd-b303-eee090ebced0\") " pod="openshift-kube-controller-manager/installer-6-master-2" Oct 14 13:25:01.429652 master-2 kubenswrapper[4762]: I1014 13:25:01.429606 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3df39991-f6e7-40dd-b303-eee090ebced0-kubelet-dir\") pod \"installer-6-master-2\" (UID: \"3df39991-f6e7-40dd-b303-eee090ebced0\") " pod="openshift-kube-controller-manager/installer-6-master-2" Oct 14 13:25:01.429723 master-2 kubenswrapper[4762]: I1014 13:25:01.429695 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/3df39991-f6e7-40dd-b303-eee090ebced0-var-lock\") pod \"installer-6-master-2\" (UID: \"3df39991-f6e7-40dd-b303-eee090ebced0\") " pod="openshift-kube-controller-manager/installer-6-master-2" Oct 14 13:25:01.464181 master-2 kubenswrapper[4762]: I1014 13:25:01.464119 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3df39991-f6e7-40dd-b303-eee090ebced0-kube-api-access\") pod \"installer-6-master-2\" (UID: \"3df39991-f6e7-40dd-b303-eee090ebced0\") " pod="openshift-kube-controller-manager/installer-6-master-2" Oct 14 13:25:01.550462 master-2 kubenswrapper[4762]: I1014 13:25:01.550317 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-6-master-2" Oct 14 13:25:02.026465 master-2 kubenswrapper[4762]: W1014 13:25:02.026359 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod3df39991_f6e7_40dd_b303_eee090ebced0.slice/crio-9b2defa1a2c4e23213486e4327691c8e1fce459fbaf4d0a220aa1547bbe1655d WatchSource:0}: Error finding container 9b2defa1a2c4e23213486e4327691c8e1fce459fbaf4d0a220aa1547bbe1655d: Status 404 returned error can't find the container with id 9b2defa1a2c4e23213486e4327691c8e1fce459fbaf4d0a220aa1547bbe1655d Oct 14 13:25:02.027859 master-2 kubenswrapper[4762]: I1014 13:25:02.027828 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-6-master-2"] Oct 14 13:25:02.712945 master-2 kubenswrapper[4762]: I1014 13:25:02.712786 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-6-master-2" event={"ID":"3df39991-f6e7-40dd-b303-eee090ebced0","Type":"ContainerStarted","Data":"d197fda0cf57ebb7aca86efc3270955fa9297771fc31fa627b4aa26ef69a6af9"} Oct 14 13:25:02.712945 master-2 kubenswrapper[4762]: I1014 13:25:02.712941 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-6-master-2" event={"ID":"3df39991-f6e7-40dd-b303-eee090ebced0","Type":"ContainerStarted","Data":"9b2defa1a2c4e23213486e4327691c8e1fce459fbaf4d0a220aa1547bbe1655d"} Oct 14 13:25:02.735858 master-2 kubenswrapper[4762]: I1014 13:25:02.735786 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-6-master-2" podStartSLOduration=1.7357659189999999 podStartE2EDuration="1.735765919s" podCreationTimestamp="2025-10-14 13:25:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:25:02.733759865 +0000 UTC m=+1131.977919034" watchObservedRunningTime="2025-10-14 13:25:02.735765919 +0000 UTC m=+1131.979925078" Oct 14 13:25:08.556066 master-2 kubenswrapper[4762]: I1014 13:25:08.555975 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-6-master-2"] Oct 14 13:25:08.557728 master-2 kubenswrapper[4762]: I1014 13:25:08.557037 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-6-master-2" Oct 14 13:25:08.560803 master-2 kubenswrapper[4762]: I1014 13:25:08.560715 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-p7d8w" Oct 14 13:25:08.578514 master-2 kubenswrapper[4762]: I1014 13:25:08.578440 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-6-master-2"] Oct 14 13:25:08.744433 master-2 kubenswrapper[4762]: I1014 13:25:08.744339 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6c8aa71b-6051-407f-9cfa-b25f931ed568-kube-api-access\") pod \"installer-6-master-2\" (UID: \"6c8aa71b-6051-407f-9cfa-b25f931ed568\") " pod="openshift-kube-apiserver/installer-6-master-2" Oct 14 13:25:08.744716 master-2 kubenswrapper[4762]: I1014 13:25:08.744471 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6c8aa71b-6051-407f-9cfa-b25f931ed568-var-lock\") pod \"installer-6-master-2\" (UID: \"6c8aa71b-6051-407f-9cfa-b25f931ed568\") " pod="openshift-kube-apiserver/installer-6-master-2" Oct 14 13:25:08.744951 master-2 kubenswrapper[4762]: I1014 13:25:08.744867 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6c8aa71b-6051-407f-9cfa-b25f931ed568-kubelet-dir\") pod \"installer-6-master-2\" (UID: \"6c8aa71b-6051-407f-9cfa-b25f931ed568\") " pod="openshift-kube-apiserver/installer-6-master-2" Oct 14 13:25:08.848635 master-2 kubenswrapper[4762]: I1014 13:25:08.848416 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6c8aa71b-6051-407f-9cfa-b25f931ed568-var-lock\") pod \"installer-6-master-2\" (UID: \"6c8aa71b-6051-407f-9cfa-b25f931ed568\") " pod="openshift-kube-apiserver/installer-6-master-2" Oct 14 13:25:08.848635 master-2 kubenswrapper[4762]: I1014 13:25:08.848548 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6c8aa71b-6051-407f-9cfa-b25f931ed568-kubelet-dir\") pod \"installer-6-master-2\" (UID: \"6c8aa71b-6051-407f-9cfa-b25f931ed568\") " pod="openshift-kube-apiserver/installer-6-master-2" Oct 14 13:25:08.849037 master-2 kubenswrapper[4762]: I1014 13:25:08.848635 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6c8aa71b-6051-407f-9cfa-b25f931ed568-var-lock\") pod \"installer-6-master-2\" (UID: \"6c8aa71b-6051-407f-9cfa-b25f931ed568\") " pod="openshift-kube-apiserver/installer-6-master-2" Oct 14 13:25:08.849037 master-2 kubenswrapper[4762]: I1014 13:25:08.848715 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6c8aa71b-6051-407f-9cfa-b25f931ed568-kube-api-access\") pod \"installer-6-master-2\" (UID: \"6c8aa71b-6051-407f-9cfa-b25f931ed568\") " pod="openshift-kube-apiserver/installer-6-master-2" Oct 14 13:25:08.849037 master-2 kubenswrapper[4762]: I1014 13:25:08.848822 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6c8aa71b-6051-407f-9cfa-b25f931ed568-kubelet-dir\") pod \"installer-6-master-2\" (UID: 
\"6c8aa71b-6051-407f-9cfa-b25f931ed568\") " pod="openshift-kube-apiserver/installer-6-master-2" Oct 14 13:25:08.884500 master-2 kubenswrapper[4762]: I1014 13:25:08.884418 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6c8aa71b-6051-407f-9cfa-b25f931ed568-kube-api-access\") pod \"installer-6-master-2\" (UID: \"6c8aa71b-6051-407f-9cfa-b25f931ed568\") " pod="openshift-kube-apiserver/installer-6-master-2" Oct 14 13:25:08.925646 master-2 kubenswrapper[4762]: I1014 13:25:08.925494 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-6-master-2" Oct 14 13:25:09.367007 master-2 kubenswrapper[4762]: I1014 13:25:09.366953 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-6-master-2"] Oct 14 13:25:09.369969 master-2 kubenswrapper[4762]: W1014 13:25:09.369900 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod6c8aa71b_6051_407f_9cfa_b25f931ed568.slice/crio-bb3e9d761a3920b380100da3892704783b6c4fd04048664e495453b42cdf67a0 WatchSource:0}: Error finding container bb3e9d761a3920b380100da3892704783b6c4fd04048664e495453b42cdf67a0: Status 404 returned error can't find the container with id bb3e9d761a3920b380100da3892704783b6c4fd04048664e495453b42cdf67a0 Oct 14 13:25:09.757793 master-2 kubenswrapper[4762]: I1014 13:25:09.757700 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-6-master-2" event={"ID":"6c8aa71b-6051-407f-9cfa-b25f931ed568","Type":"ContainerStarted","Data":"bb3e9d761a3920b380100da3892704783b6c4fd04048664e495453b42cdf67a0"} Oct 14 13:25:10.767959 master-2 kubenswrapper[4762]: I1014 13:25:10.767743 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-6-master-2" event={"ID":"6c8aa71b-6051-407f-9cfa-b25f931ed568","Type":"ContainerStarted","Data":"2f922305b172b9f374e6b418b0a76088fac2726a159c6e0ea739594613f99e23"} Oct 14 13:25:10.794301 master-2 kubenswrapper[4762]: I1014 13:25:10.794123 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-6-master-2" podStartSLOduration=2.794092181 podStartE2EDuration="2.794092181s" podCreationTimestamp="2025-10-14 13:25:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:25:10.793233745 +0000 UTC m=+1140.037392904" watchObservedRunningTime="2025-10-14 13:25:10.794092181 +0000 UTC m=+1140.038251380" Oct 14 13:25:35.494807 master-2 kubenswrapper[4762]: I1014 13:25:35.494700 4762 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-2"] Oct 14 13:25:35.495632 master-2 kubenswrapper[4762]: I1014 13:25:35.495132 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" podUID="d9e75646502e68dc8cb077ea618d4d9d" containerName="kube-controller-manager" containerID="cri-o://594ef0c259c88ddca5ddad13d819171fe7c0fc7d2413db1ecce8c4324b3ac47a" gracePeriod=30 Oct 14 13:25:35.495632 master-2 kubenswrapper[4762]: I1014 13:25:35.495316 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" podUID="d9e75646502e68dc8cb077ea618d4d9d" 
containerName="cluster-policy-controller" containerID="cri-o://dd26e7b8f23f0484780c30a5413c0a14f4efd95fd1a64f8e659ce1f8a0e33d5f" gracePeriod=30 Oct 14 13:25:35.495632 master-2 kubenswrapper[4762]: I1014 13:25:35.495317 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" podUID="d9e75646502e68dc8cb077ea618d4d9d" containerName="kube-controller-manager-cert-syncer" containerID="cri-o://84661648010a88ef48bd3be91f2082ee60333a193279601551124734c4156c2a" gracePeriod=30 Oct 14 13:25:35.496320 master-2 kubenswrapper[4762]: I1014 13:25:35.495933 4762 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-2"] Oct 14 13:25:35.496372 master-2 kubenswrapper[4762]: E1014 13:25:35.496315 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9e75646502e68dc8cb077ea618d4d9d" containerName="kube-controller-manager" Oct 14 13:25:35.496372 master-2 kubenswrapper[4762]: I1014 13:25:35.496340 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9e75646502e68dc8cb077ea618d4d9d" containerName="kube-controller-manager" Oct 14 13:25:35.496372 master-2 kubenswrapper[4762]: E1014 13:25:35.496364 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9e75646502e68dc8cb077ea618d4d9d" containerName="kube-controller-manager-cert-syncer" Oct 14 13:25:35.496509 master-2 kubenswrapper[4762]: I1014 13:25:35.496375 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9e75646502e68dc8cb077ea618d4d9d" containerName="kube-controller-manager-cert-syncer" Oct 14 13:25:35.496509 master-2 kubenswrapper[4762]: E1014 13:25:35.496396 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9e75646502e68dc8cb077ea618d4d9d" containerName="cluster-policy-controller" Oct 14 13:25:35.496509 master-2 kubenswrapper[4762]: I1014 13:25:35.496406 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9e75646502e68dc8cb077ea618d4d9d" containerName="cluster-policy-controller" Oct 14 13:25:35.496509 master-2 kubenswrapper[4762]: E1014 13:25:35.496419 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9e75646502e68dc8cb077ea618d4d9d" containerName="kube-controller-manager-recovery-controller" Oct 14 13:25:35.496509 master-2 kubenswrapper[4762]: I1014 13:25:35.496427 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9e75646502e68dc8cb077ea618d4d9d" containerName="kube-controller-manager-recovery-controller" Oct 14 13:25:35.496717 master-2 kubenswrapper[4762]: I1014 13:25:35.496558 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9e75646502e68dc8cb077ea618d4d9d" containerName="cluster-policy-controller" Oct 14 13:25:35.496717 master-2 kubenswrapper[4762]: I1014 13:25:35.496575 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9e75646502e68dc8cb077ea618d4d9d" containerName="kube-controller-manager" Oct 14 13:25:35.496717 master-2 kubenswrapper[4762]: I1014 13:25:35.496599 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9e75646502e68dc8cb077ea618d4d9d" containerName="kube-controller-manager-recovery-controller" Oct 14 13:25:35.496717 master-2 kubenswrapper[4762]: I1014 13:25:35.496614 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9e75646502e68dc8cb077ea618d4d9d" containerName="kube-controller-manager-cert-syncer" Oct 14 13:25:35.497375 master-2 kubenswrapper[4762]: I1014 13:25:35.497256 4762 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" podUID="d9e75646502e68dc8cb077ea618d4d9d" containerName="kube-controller-manager-recovery-controller" containerID="cri-o://73d790cae663d066b0748ccdf6e566727eb277211eb0f8e2f2dd1d6e3b29222f" gracePeriod=30 Oct 14 13:25:35.636574 master-2 kubenswrapper[4762]: I1014 13:25:35.636490 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/e90af64e4889c9dc18456b228c1a089c-cert-dir\") pod \"kube-controller-manager-master-2\" (UID: \"e90af64e4889c9dc18456b228c1a089c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-2" Oct 14 13:25:35.636759 master-2 kubenswrapper[4762]: I1014 13:25:35.636587 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/e90af64e4889c9dc18456b228c1a089c-resource-dir\") pod \"kube-controller-manager-master-2\" (UID: \"e90af64e4889c9dc18456b228c1a089c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-2" Oct 14 13:25:35.672682 master-2 kubenswrapper[4762]: I1014 13:25:35.672631 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-2_d9e75646502e68dc8cb077ea618d4d9d/kube-controller-manager-cert-syncer/0.log" Oct 14 13:25:35.673751 master-2 kubenswrapper[4762]: I1014 13:25:35.673711 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" Oct 14 13:25:35.679745 master-2 kubenswrapper[4762]: I1014 13:25:35.679695 4762 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" oldPodUID="d9e75646502e68dc8cb077ea618d4d9d" podUID="e90af64e4889c9dc18456b228c1a089c" Oct 14 13:25:35.738358 master-2 kubenswrapper[4762]: I1014 13:25:35.738277 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/e90af64e4889c9dc18456b228c1a089c-cert-dir\") pod \"kube-controller-manager-master-2\" (UID: \"e90af64e4889c9dc18456b228c1a089c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-2" Oct 14 13:25:35.738358 master-2 kubenswrapper[4762]: I1014 13:25:35.738331 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/e90af64e4889c9dc18456b228c1a089c-resource-dir\") pod \"kube-controller-manager-master-2\" (UID: \"e90af64e4889c9dc18456b228c1a089c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-2" Oct 14 13:25:35.738743 master-2 kubenswrapper[4762]: I1014 13:25:35.738483 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/e90af64e4889c9dc18456b228c1a089c-cert-dir\") pod \"kube-controller-manager-master-2\" (UID: \"e90af64e4889c9dc18456b228c1a089c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-2" Oct 14 13:25:35.738743 master-2 kubenswrapper[4762]: I1014 13:25:35.738524 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/e90af64e4889c9dc18456b228c1a089c-resource-dir\") pod 
\"kube-controller-manager-master-2\" (UID: \"e90af64e4889c9dc18456b228c1a089c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-2" Oct 14 13:25:35.839866 master-2 kubenswrapper[4762]: I1014 13:25:35.839709 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/d9e75646502e68dc8cb077ea618d4d9d-resource-dir\") pod \"d9e75646502e68dc8cb077ea618d4d9d\" (UID: \"d9e75646502e68dc8cb077ea618d4d9d\") " Oct 14 13:25:35.840216 master-2 kubenswrapper[4762]: I1014 13:25:35.839929 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/d9e75646502e68dc8cb077ea618d4d9d-cert-dir\") pod \"d9e75646502e68dc8cb077ea618d4d9d\" (UID: \"d9e75646502e68dc8cb077ea618d4d9d\") " Oct 14 13:25:35.840417 master-2 kubenswrapper[4762]: I1014 13:25:35.840389 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d9e75646502e68dc8cb077ea618d4d9d-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "d9e75646502e68dc8cb077ea618d4d9d" (UID: "d9e75646502e68dc8cb077ea618d4d9d"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:25:35.840473 master-2 kubenswrapper[4762]: I1014 13:25:35.840395 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d9e75646502e68dc8cb077ea618d4d9d-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "d9e75646502e68dc8cb077ea618d4d9d" (UID: "d9e75646502e68dc8cb077ea618d4d9d"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:25:35.941337 master-2 kubenswrapper[4762]: I1014 13:25:35.941274 4762 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/d9e75646502e68dc8cb077ea618d4d9d-cert-dir\") on node \"master-2\" DevicePath \"\"" Oct 14 13:25:35.941337 master-2 kubenswrapper[4762]: I1014 13:25:35.941325 4762 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/d9e75646502e68dc8cb077ea618d4d9d-resource-dir\") on node \"master-2\" DevicePath \"\"" Oct 14 13:25:35.944739 master-2 kubenswrapper[4762]: I1014 13:25:35.944707 4762 generic.go:334] "Generic (PLEG): container finished" podID="3df39991-f6e7-40dd-b303-eee090ebced0" containerID="d197fda0cf57ebb7aca86efc3270955fa9297771fc31fa627b4aa26ef69a6af9" exitCode=0 Oct 14 13:25:35.944848 master-2 kubenswrapper[4762]: I1014 13:25:35.944782 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-6-master-2" event={"ID":"3df39991-f6e7-40dd-b303-eee090ebced0","Type":"ContainerDied","Data":"d197fda0cf57ebb7aca86efc3270955fa9297771fc31fa627b4aa26ef69a6af9"} Oct 14 13:25:35.947355 master-2 kubenswrapper[4762]: I1014 13:25:35.947182 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-2_d9e75646502e68dc8cb077ea618d4d9d/kube-controller-manager-cert-syncer/0.log" Oct 14 13:25:35.948909 master-2 kubenswrapper[4762]: I1014 13:25:35.948870 4762 generic.go:334] "Generic (PLEG): container finished" podID="d9e75646502e68dc8cb077ea618d4d9d" containerID="73d790cae663d066b0748ccdf6e566727eb277211eb0f8e2f2dd1d6e3b29222f" exitCode=0 Oct 14 13:25:35.948909 master-2 kubenswrapper[4762]: I1014 13:25:35.948895 4762 generic.go:334] "Generic (PLEG): container finished" 
podID="d9e75646502e68dc8cb077ea618d4d9d" containerID="84661648010a88ef48bd3be91f2082ee60333a193279601551124734c4156c2a" exitCode=2 Oct 14 13:25:35.948909 master-2 kubenswrapper[4762]: I1014 13:25:35.948909 4762 generic.go:334] "Generic (PLEG): container finished" podID="d9e75646502e68dc8cb077ea618d4d9d" containerID="dd26e7b8f23f0484780c30a5413c0a14f4efd95fd1a64f8e659ce1f8a0e33d5f" exitCode=0 Oct 14 13:25:35.949036 master-2 kubenswrapper[4762]: I1014 13:25:35.948913 4762 scope.go:117] "RemoveContainer" containerID="73d790cae663d066b0748ccdf6e566727eb277211eb0f8e2f2dd1d6e3b29222f" Oct 14 13:25:35.949036 master-2 kubenswrapper[4762]: I1014 13:25:35.948920 4762 generic.go:334] "Generic (PLEG): container finished" podID="d9e75646502e68dc8cb077ea618d4d9d" containerID="594ef0c259c88ddca5ddad13d819171fe7c0fc7d2413db1ecce8c4324b3ac47a" exitCode=0 Oct 14 13:25:35.949843 master-2 kubenswrapper[4762]: I1014 13:25:35.949795 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" Oct 14 13:25:35.976078 master-2 kubenswrapper[4762]: I1014 13:25:35.974413 4762 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" oldPodUID="d9e75646502e68dc8cb077ea618d4d9d" podUID="e90af64e4889c9dc18456b228c1a089c" Oct 14 13:25:35.977345 master-2 kubenswrapper[4762]: I1014 13:25:35.976627 4762 scope.go:117] "RemoveContainer" containerID="84661648010a88ef48bd3be91f2082ee60333a193279601551124734c4156c2a" Oct 14 13:25:35.984314 master-2 kubenswrapper[4762]: I1014 13:25:35.984257 4762 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" oldPodUID="d9e75646502e68dc8cb077ea618d4d9d" podUID="e90af64e4889c9dc18456b228c1a089c" Oct 14 13:25:35.998950 master-2 kubenswrapper[4762]: I1014 13:25:35.998907 4762 scope.go:117] "RemoveContainer" containerID="dd26e7b8f23f0484780c30a5413c0a14f4efd95fd1a64f8e659ce1f8a0e33d5f" Oct 14 13:25:36.013493 master-2 kubenswrapper[4762]: I1014 13:25:36.013450 4762 scope.go:117] "RemoveContainer" containerID="594ef0c259c88ddca5ddad13d819171fe7c0fc7d2413db1ecce8c4324b3ac47a" Oct 14 13:25:36.026337 master-2 kubenswrapper[4762]: I1014 13:25:36.026292 4762 scope.go:117] "RemoveContainer" containerID="73d790cae663d066b0748ccdf6e566727eb277211eb0f8e2f2dd1d6e3b29222f" Oct 14 13:25:36.026787 master-2 kubenswrapper[4762]: E1014 13:25:36.026748 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73d790cae663d066b0748ccdf6e566727eb277211eb0f8e2f2dd1d6e3b29222f\": container with ID starting with 73d790cae663d066b0748ccdf6e566727eb277211eb0f8e2f2dd1d6e3b29222f not found: ID does not exist" containerID="73d790cae663d066b0748ccdf6e566727eb277211eb0f8e2f2dd1d6e3b29222f" Oct 14 13:25:36.026839 master-2 kubenswrapper[4762]: I1014 13:25:36.026788 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73d790cae663d066b0748ccdf6e566727eb277211eb0f8e2f2dd1d6e3b29222f"} err="failed to get container status \"73d790cae663d066b0748ccdf6e566727eb277211eb0f8e2f2dd1d6e3b29222f\": rpc error: code = NotFound desc = could not find container \"73d790cae663d066b0748ccdf6e566727eb277211eb0f8e2f2dd1d6e3b29222f\": container with ID starting with 73d790cae663d066b0748ccdf6e566727eb277211eb0f8e2f2dd1d6e3b29222f not found: 
ID does not exist" Oct 14 13:25:36.026839 master-2 kubenswrapper[4762]: I1014 13:25:36.026819 4762 scope.go:117] "RemoveContainer" containerID="84661648010a88ef48bd3be91f2082ee60333a193279601551124734c4156c2a" Oct 14 13:25:36.027264 master-2 kubenswrapper[4762]: E1014 13:25:36.027227 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84661648010a88ef48bd3be91f2082ee60333a193279601551124734c4156c2a\": container with ID starting with 84661648010a88ef48bd3be91f2082ee60333a193279601551124734c4156c2a not found: ID does not exist" containerID="84661648010a88ef48bd3be91f2082ee60333a193279601551124734c4156c2a" Oct 14 13:25:36.027313 master-2 kubenswrapper[4762]: I1014 13:25:36.027266 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84661648010a88ef48bd3be91f2082ee60333a193279601551124734c4156c2a"} err="failed to get container status \"84661648010a88ef48bd3be91f2082ee60333a193279601551124734c4156c2a\": rpc error: code = NotFound desc = could not find container \"84661648010a88ef48bd3be91f2082ee60333a193279601551124734c4156c2a\": container with ID starting with 84661648010a88ef48bd3be91f2082ee60333a193279601551124734c4156c2a not found: ID does not exist" Oct 14 13:25:36.027313 master-2 kubenswrapper[4762]: I1014 13:25:36.027289 4762 scope.go:117] "RemoveContainer" containerID="dd26e7b8f23f0484780c30a5413c0a14f4efd95fd1a64f8e659ce1f8a0e33d5f" Oct 14 13:25:36.027605 master-2 kubenswrapper[4762]: E1014 13:25:36.027565 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd26e7b8f23f0484780c30a5413c0a14f4efd95fd1a64f8e659ce1f8a0e33d5f\": container with ID starting with dd26e7b8f23f0484780c30a5413c0a14f4efd95fd1a64f8e659ce1f8a0e33d5f not found: ID does not exist" containerID="dd26e7b8f23f0484780c30a5413c0a14f4efd95fd1a64f8e659ce1f8a0e33d5f" Oct 14 13:25:36.027651 master-2 kubenswrapper[4762]: I1014 13:25:36.027598 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd26e7b8f23f0484780c30a5413c0a14f4efd95fd1a64f8e659ce1f8a0e33d5f"} err="failed to get container status \"dd26e7b8f23f0484780c30a5413c0a14f4efd95fd1a64f8e659ce1f8a0e33d5f\": rpc error: code = NotFound desc = could not find container \"dd26e7b8f23f0484780c30a5413c0a14f4efd95fd1a64f8e659ce1f8a0e33d5f\": container with ID starting with dd26e7b8f23f0484780c30a5413c0a14f4efd95fd1a64f8e659ce1f8a0e33d5f not found: ID does not exist" Oct 14 13:25:36.027651 master-2 kubenswrapper[4762]: I1014 13:25:36.027618 4762 scope.go:117] "RemoveContainer" containerID="594ef0c259c88ddca5ddad13d819171fe7c0fc7d2413db1ecce8c4324b3ac47a" Oct 14 13:25:36.028096 master-2 kubenswrapper[4762]: E1014 13:25:36.028050 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"594ef0c259c88ddca5ddad13d819171fe7c0fc7d2413db1ecce8c4324b3ac47a\": container with ID starting with 594ef0c259c88ddca5ddad13d819171fe7c0fc7d2413db1ecce8c4324b3ac47a not found: ID does not exist" containerID="594ef0c259c88ddca5ddad13d819171fe7c0fc7d2413db1ecce8c4324b3ac47a" Oct 14 13:25:36.028145 master-2 kubenswrapper[4762]: I1014 13:25:36.028107 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"594ef0c259c88ddca5ddad13d819171fe7c0fc7d2413db1ecce8c4324b3ac47a"} err="failed to get container status 
\"594ef0c259c88ddca5ddad13d819171fe7c0fc7d2413db1ecce8c4324b3ac47a\": rpc error: code = NotFound desc = could not find container \"594ef0c259c88ddca5ddad13d819171fe7c0fc7d2413db1ecce8c4324b3ac47a\": container with ID starting with 594ef0c259c88ddca5ddad13d819171fe7c0fc7d2413db1ecce8c4324b3ac47a not found: ID does not exist" Oct 14 13:25:36.028241 master-2 kubenswrapper[4762]: I1014 13:25:36.028145 4762 scope.go:117] "RemoveContainer" containerID="73d790cae663d066b0748ccdf6e566727eb277211eb0f8e2f2dd1d6e3b29222f" Oct 14 13:25:36.028936 master-2 kubenswrapper[4762]: I1014 13:25:36.028881 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73d790cae663d066b0748ccdf6e566727eb277211eb0f8e2f2dd1d6e3b29222f"} err="failed to get container status \"73d790cae663d066b0748ccdf6e566727eb277211eb0f8e2f2dd1d6e3b29222f\": rpc error: code = NotFound desc = could not find container \"73d790cae663d066b0748ccdf6e566727eb277211eb0f8e2f2dd1d6e3b29222f\": container with ID starting with 73d790cae663d066b0748ccdf6e566727eb277211eb0f8e2f2dd1d6e3b29222f not found: ID does not exist" Oct 14 13:25:36.028936 master-2 kubenswrapper[4762]: I1014 13:25:36.028933 4762 scope.go:117] "RemoveContainer" containerID="84661648010a88ef48bd3be91f2082ee60333a193279601551124734c4156c2a" Oct 14 13:25:36.029273 master-2 kubenswrapper[4762]: I1014 13:25:36.029233 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84661648010a88ef48bd3be91f2082ee60333a193279601551124734c4156c2a"} err="failed to get container status \"84661648010a88ef48bd3be91f2082ee60333a193279601551124734c4156c2a\": rpc error: code = NotFound desc = could not find container \"84661648010a88ef48bd3be91f2082ee60333a193279601551124734c4156c2a\": container with ID starting with 84661648010a88ef48bd3be91f2082ee60333a193279601551124734c4156c2a not found: ID does not exist" Oct 14 13:25:36.029273 master-2 kubenswrapper[4762]: I1014 13:25:36.029262 4762 scope.go:117] "RemoveContainer" containerID="dd26e7b8f23f0484780c30a5413c0a14f4efd95fd1a64f8e659ce1f8a0e33d5f" Oct 14 13:25:36.029586 master-2 kubenswrapper[4762]: I1014 13:25:36.029543 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd26e7b8f23f0484780c30a5413c0a14f4efd95fd1a64f8e659ce1f8a0e33d5f"} err="failed to get container status \"dd26e7b8f23f0484780c30a5413c0a14f4efd95fd1a64f8e659ce1f8a0e33d5f\": rpc error: code = NotFound desc = could not find container \"dd26e7b8f23f0484780c30a5413c0a14f4efd95fd1a64f8e659ce1f8a0e33d5f\": container with ID starting with dd26e7b8f23f0484780c30a5413c0a14f4efd95fd1a64f8e659ce1f8a0e33d5f not found: ID does not exist" Oct 14 13:25:36.029586 master-2 kubenswrapper[4762]: I1014 13:25:36.029578 4762 scope.go:117] "RemoveContainer" containerID="594ef0c259c88ddca5ddad13d819171fe7c0fc7d2413db1ecce8c4324b3ac47a" Oct 14 13:25:36.029907 master-2 kubenswrapper[4762]: I1014 13:25:36.029848 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"594ef0c259c88ddca5ddad13d819171fe7c0fc7d2413db1ecce8c4324b3ac47a"} err="failed to get container status \"594ef0c259c88ddca5ddad13d819171fe7c0fc7d2413db1ecce8c4324b3ac47a\": rpc error: code = NotFound desc = could not find container \"594ef0c259c88ddca5ddad13d819171fe7c0fc7d2413db1ecce8c4324b3ac47a\": container with ID starting with 594ef0c259c88ddca5ddad13d819171fe7c0fc7d2413db1ecce8c4324b3ac47a not found: ID does not exist" Oct 14 13:25:36.029907 master-2 
kubenswrapper[4762]: I1014 13:25:36.029906 4762 scope.go:117] "RemoveContainer" containerID="73d790cae663d066b0748ccdf6e566727eb277211eb0f8e2f2dd1d6e3b29222f" Oct 14 13:25:36.030403 master-2 kubenswrapper[4762]: I1014 13:25:36.030359 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73d790cae663d066b0748ccdf6e566727eb277211eb0f8e2f2dd1d6e3b29222f"} err="failed to get container status \"73d790cae663d066b0748ccdf6e566727eb277211eb0f8e2f2dd1d6e3b29222f\": rpc error: code = NotFound desc = could not find container \"73d790cae663d066b0748ccdf6e566727eb277211eb0f8e2f2dd1d6e3b29222f\": container with ID starting with 73d790cae663d066b0748ccdf6e566727eb277211eb0f8e2f2dd1d6e3b29222f not found: ID does not exist" Oct 14 13:25:36.030403 master-2 kubenswrapper[4762]: I1014 13:25:36.030392 4762 scope.go:117] "RemoveContainer" containerID="84661648010a88ef48bd3be91f2082ee60333a193279601551124734c4156c2a" Oct 14 13:25:36.030844 master-2 kubenswrapper[4762]: I1014 13:25:36.030800 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84661648010a88ef48bd3be91f2082ee60333a193279601551124734c4156c2a"} err="failed to get container status \"84661648010a88ef48bd3be91f2082ee60333a193279601551124734c4156c2a\": rpc error: code = NotFound desc = could not find container \"84661648010a88ef48bd3be91f2082ee60333a193279601551124734c4156c2a\": container with ID starting with 84661648010a88ef48bd3be91f2082ee60333a193279601551124734c4156c2a not found: ID does not exist" Oct 14 13:25:36.030844 master-2 kubenswrapper[4762]: I1014 13:25:36.030834 4762 scope.go:117] "RemoveContainer" containerID="dd26e7b8f23f0484780c30a5413c0a14f4efd95fd1a64f8e659ce1f8a0e33d5f" Oct 14 13:25:36.031187 master-2 kubenswrapper[4762]: I1014 13:25:36.031140 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd26e7b8f23f0484780c30a5413c0a14f4efd95fd1a64f8e659ce1f8a0e33d5f"} err="failed to get container status \"dd26e7b8f23f0484780c30a5413c0a14f4efd95fd1a64f8e659ce1f8a0e33d5f\": rpc error: code = NotFound desc = could not find container \"dd26e7b8f23f0484780c30a5413c0a14f4efd95fd1a64f8e659ce1f8a0e33d5f\": container with ID starting with dd26e7b8f23f0484780c30a5413c0a14f4efd95fd1a64f8e659ce1f8a0e33d5f not found: ID does not exist" Oct 14 13:25:36.031187 master-2 kubenswrapper[4762]: I1014 13:25:36.031183 4762 scope.go:117] "RemoveContainer" containerID="594ef0c259c88ddca5ddad13d819171fe7c0fc7d2413db1ecce8c4324b3ac47a" Oct 14 13:25:36.031513 master-2 kubenswrapper[4762]: I1014 13:25:36.031471 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"594ef0c259c88ddca5ddad13d819171fe7c0fc7d2413db1ecce8c4324b3ac47a"} err="failed to get container status \"594ef0c259c88ddca5ddad13d819171fe7c0fc7d2413db1ecce8c4324b3ac47a\": rpc error: code = NotFound desc = could not find container \"594ef0c259c88ddca5ddad13d819171fe7c0fc7d2413db1ecce8c4324b3ac47a\": container with ID starting with 594ef0c259c88ddca5ddad13d819171fe7c0fc7d2413db1ecce8c4324b3ac47a not found: ID does not exist" Oct 14 13:25:36.031513 master-2 kubenswrapper[4762]: I1014 13:25:36.031502 4762 scope.go:117] "RemoveContainer" containerID="73d790cae663d066b0748ccdf6e566727eb277211eb0f8e2f2dd1d6e3b29222f" Oct 14 13:25:36.031801 master-2 kubenswrapper[4762]: I1014 13:25:36.031753 4762 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"73d790cae663d066b0748ccdf6e566727eb277211eb0f8e2f2dd1d6e3b29222f"} err="failed to get container status \"73d790cae663d066b0748ccdf6e566727eb277211eb0f8e2f2dd1d6e3b29222f\": rpc error: code = NotFound desc = could not find container \"73d790cae663d066b0748ccdf6e566727eb277211eb0f8e2f2dd1d6e3b29222f\": container with ID starting with 73d790cae663d066b0748ccdf6e566727eb277211eb0f8e2f2dd1d6e3b29222f not found: ID does not exist" Oct 14 13:25:36.031801 master-2 kubenswrapper[4762]: I1014 13:25:36.031795 4762 scope.go:117] "RemoveContainer" containerID="84661648010a88ef48bd3be91f2082ee60333a193279601551124734c4156c2a" Oct 14 13:25:36.032095 master-2 kubenswrapper[4762]: I1014 13:25:36.032051 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84661648010a88ef48bd3be91f2082ee60333a193279601551124734c4156c2a"} err="failed to get container status \"84661648010a88ef48bd3be91f2082ee60333a193279601551124734c4156c2a\": rpc error: code = NotFound desc = could not find container \"84661648010a88ef48bd3be91f2082ee60333a193279601551124734c4156c2a\": container with ID starting with 84661648010a88ef48bd3be91f2082ee60333a193279601551124734c4156c2a not found: ID does not exist" Oct 14 13:25:36.032095 master-2 kubenswrapper[4762]: I1014 13:25:36.032084 4762 scope.go:117] "RemoveContainer" containerID="dd26e7b8f23f0484780c30a5413c0a14f4efd95fd1a64f8e659ce1f8a0e33d5f" Oct 14 13:25:36.032442 master-2 kubenswrapper[4762]: I1014 13:25:36.032403 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd26e7b8f23f0484780c30a5413c0a14f4efd95fd1a64f8e659ce1f8a0e33d5f"} err="failed to get container status \"dd26e7b8f23f0484780c30a5413c0a14f4efd95fd1a64f8e659ce1f8a0e33d5f\": rpc error: code = NotFound desc = could not find container \"dd26e7b8f23f0484780c30a5413c0a14f4efd95fd1a64f8e659ce1f8a0e33d5f\": container with ID starting with dd26e7b8f23f0484780c30a5413c0a14f4efd95fd1a64f8e659ce1f8a0e33d5f not found: ID does not exist" Oct 14 13:25:36.032442 master-2 kubenswrapper[4762]: I1014 13:25:36.032431 4762 scope.go:117] "RemoveContainer" containerID="594ef0c259c88ddca5ddad13d819171fe7c0fc7d2413db1ecce8c4324b3ac47a" Oct 14 13:25:36.032893 master-2 kubenswrapper[4762]: I1014 13:25:36.032854 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"594ef0c259c88ddca5ddad13d819171fe7c0fc7d2413db1ecce8c4324b3ac47a"} err="failed to get container status \"594ef0c259c88ddca5ddad13d819171fe7c0fc7d2413db1ecce8c4324b3ac47a\": rpc error: code = NotFound desc = could not find container \"594ef0c259c88ddca5ddad13d819171fe7c0fc7d2413db1ecce8c4324b3ac47a\": container with ID starting with 594ef0c259c88ddca5ddad13d819171fe7c0fc7d2413db1ecce8c4324b3ac47a not found: ID does not exist" Oct 14 13:25:37.413008 master-2 kubenswrapper[4762]: I1014 13:25:37.412946 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-6-master-2" Oct 14 13:25:37.562435 master-2 kubenswrapper[4762]: I1014 13:25:37.562254 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9e75646502e68dc8cb077ea618d4d9d" path="/var/lib/kubelet/pods/d9e75646502e68dc8cb077ea618d4d9d/volumes" Oct 14 13:25:37.563004 master-2 kubenswrapper[4762]: I1014 13:25:37.562959 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3df39991-f6e7-40dd-b303-eee090ebced0-var-lock\") pod \"3df39991-f6e7-40dd-b303-eee090ebced0\" (UID: \"3df39991-f6e7-40dd-b303-eee090ebced0\") " Oct 14 13:25:37.563108 master-2 kubenswrapper[4762]: I1014 13:25:37.563076 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3df39991-f6e7-40dd-b303-eee090ebced0-kube-api-access\") pod \"3df39991-f6e7-40dd-b303-eee090ebced0\" (UID: \"3df39991-f6e7-40dd-b303-eee090ebced0\") " Oct 14 13:25:37.563243 master-2 kubenswrapper[4762]: I1014 13:25:37.563206 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3df39991-f6e7-40dd-b303-eee090ebced0-kubelet-dir\") pod \"3df39991-f6e7-40dd-b303-eee090ebced0\" (UID: \"3df39991-f6e7-40dd-b303-eee090ebced0\") " Oct 14 13:25:37.563417 master-2 kubenswrapper[4762]: I1014 13:25:37.563375 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3df39991-f6e7-40dd-b303-eee090ebced0-var-lock" (OuterVolumeSpecName: "var-lock") pod "3df39991-f6e7-40dd-b303-eee090ebced0" (UID: "3df39991-f6e7-40dd-b303-eee090ebced0"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:25:37.563600 master-2 kubenswrapper[4762]: I1014 13:25:37.563476 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3df39991-f6e7-40dd-b303-eee090ebced0-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "3df39991-f6e7-40dd-b303-eee090ebced0" (UID: "3df39991-f6e7-40dd-b303-eee090ebced0"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:25:37.567467 master-2 kubenswrapper[4762]: I1014 13:25:37.567388 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3df39991-f6e7-40dd-b303-eee090ebced0-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "3df39991-f6e7-40dd-b303-eee090ebced0" (UID: "3df39991-f6e7-40dd-b303-eee090ebced0"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:25:37.665047 master-2 kubenswrapper[4762]: I1014 13:25:37.664985 4762 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3df39991-f6e7-40dd-b303-eee090ebced0-var-lock\") on node \"master-2\" DevicePath \"\"" Oct 14 13:25:37.665047 master-2 kubenswrapper[4762]: I1014 13:25:37.665036 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3df39991-f6e7-40dd-b303-eee090ebced0-kube-api-access\") on node \"master-2\" DevicePath \"\"" Oct 14 13:25:37.665047 master-2 kubenswrapper[4762]: I1014 13:25:37.665048 4762 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3df39991-f6e7-40dd-b303-eee090ebced0-kubelet-dir\") on node \"master-2\" DevicePath \"\"" Oct 14 13:25:37.968687 master-2 kubenswrapper[4762]: I1014 13:25:37.968555 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-6-master-2" event={"ID":"3df39991-f6e7-40dd-b303-eee090ebced0","Type":"ContainerDied","Data":"9b2defa1a2c4e23213486e4327691c8e1fce459fbaf4d0a220aa1547bbe1655d"} Oct 14 13:25:37.968935 master-2 kubenswrapper[4762]: I1014 13:25:37.968916 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b2defa1a2c4e23213486e4327691c8e1fce459fbaf4d0a220aa1547bbe1655d" Oct 14 13:25:37.969026 master-2 kubenswrapper[4762]: I1014 13:25:37.968664 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-6-master-2" Oct 14 13:25:38.847499 master-2 kubenswrapper[4762]: I1014 13:25:38.847436 4762 patch_prober.go:28] interesting pod/kube-controller-manager-guard-master-2 container/guard namespace/openshift-kube-controller-manager: Readiness probe status=failure output="Get \"https://192.168.34.12:10257/healthz\": dial tcp 192.168.34.12:10257: connect: connection refused" start-of-body= Oct 14 13:25:38.848144 master-2 kubenswrapper[4762]: I1014 13:25:38.847505 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-controller-manager/kube-controller-manager-guard-master-2" podUID="d43a34b7-69dd-43b0-8465-4e44cb687285" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.12:10257/healthz\": dial tcp 192.168.34.12:10257: connect: connection refused" Oct 14 13:25:43.847242 master-2 kubenswrapper[4762]: I1014 13:25:43.847128 4762 patch_prober.go:28] interesting pod/kube-controller-manager-guard-master-2 container/guard namespace/openshift-kube-controller-manager: Readiness probe status=failure output="Get \"https://192.168.34.12:10257/healthz\": dial tcp 192.168.34.12:10257: connect: connection refused" start-of-body= Oct 14 13:25:43.848005 master-2 kubenswrapper[4762]: I1014 13:25:43.847279 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-controller-manager/kube-controller-manager-guard-master-2" podUID="d43a34b7-69dd-43b0-8465-4e44cb687285" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.12:10257/healthz\": dial tcp 192.168.34.12:10257: connect: connection refused" Oct 14 13:25:47.549911 master-2 kubenswrapper[4762]: I1014 13:25:47.549814 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" Oct 14 13:25:47.570732 master-2 kubenswrapper[4762]: I1014 13:25:47.570668 4762 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" podUID="1f6576bd-39aa-4525-9b23-692ea010d814" Oct 14 13:25:47.570732 master-2 kubenswrapper[4762]: I1014 13:25:47.570719 4762 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" podUID="1f6576bd-39aa-4525-9b23-692ea010d814" Oct 14 13:25:47.599589 master-2 kubenswrapper[4762]: I1014 13:25:47.599491 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-2"] Oct 14 13:25:47.650614 master-2 kubenswrapper[4762]: I1014 13:25:47.650559 4762 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" Oct 14 13:25:47.651322 master-2 kubenswrapper[4762]: I1014 13:25:47.651233 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-2"] Oct 14 13:25:47.682883 master-2 kubenswrapper[4762]: I1014 13:25:47.682800 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" Oct 14 13:25:47.690706 master-2 kubenswrapper[4762]: I1014 13:25:47.690598 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-2"] Oct 14 13:25:47.698669 master-2 kubenswrapper[4762]: I1014 13:25:47.698590 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:25:47.714179 master-2 kubenswrapper[4762]: W1014 13:25:47.714045 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode90af64e4889c9dc18456b228c1a089c.slice/crio-0f9f86f3cc2eaeed72cc8c11fc677f8da0a68f81f6fadbb51a0fc2fbbbd698ec WatchSource:0}: Error finding container 0f9f86f3cc2eaeed72cc8c11fc677f8da0a68f81f6fadbb51a0fc2fbbbd698ec: Status 404 returned error can't find the container with id 0f9f86f3cc2eaeed72cc8c11fc677f8da0a68f81f6fadbb51a0fc2fbbbd698ec Oct 14 13:25:47.732368 master-2 kubenswrapper[4762]: I1014 13:25:47.732324 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:25:47.882359 master-2 kubenswrapper[4762]: I1014 13:25:47.882282 4762 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-2"] Oct 14 13:25:47.882934 master-2 kubenswrapper[4762]: I1014 13:25:47.882842 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-2" podUID="9041570beb5002e8da158e70e12f0c16" containerName="kube-apiserver" containerID="cri-o://69e03e306b4fc0599756da1bcaf608658ab22550cdb495df2099fa070d1f1e5a" gracePeriod=135 Oct 14 13:25:47.883126 master-2 kubenswrapper[4762]: I1014 13:25:47.882955 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-2" podUID="9041570beb5002e8da158e70e12f0c16" containerName="kube-apiserver-check-endpoints" containerID="cri-o://ed0652c68175fc794e2ffa2234dbeed42e01255f6cb9d2ddbb282664b4f7a5ee" gracePeriod=135 Oct 14 13:25:47.883280 
master-2 kubenswrapper[4762]: I1014 13:25:47.882989 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-2" podUID="9041570beb5002e8da158e70e12f0c16" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://39786ac6e5f768ebcc690a6e3de90249c4787a641f1d1315d8f81f3458a6927d" gracePeriod=135 Oct 14 13:25:47.883457 master-2 kubenswrapper[4762]: I1014 13:25:47.883023 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-2" podUID="9041570beb5002e8da158e70e12f0c16" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://582963d0d550051a59e2bcba881a40e767b0e4ef68aed930b180274688a29410" gracePeriod=135 Oct 14 13:25:47.883457 master-2 kubenswrapper[4762]: I1014 13:25:47.883050 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-2" podUID="9041570beb5002e8da158e70e12f0c16" containerName="kube-apiserver-cert-syncer" containerID="cri-o://ce2586126d7884ea6910fe1b15e6aa80b4bc5e5c292cc36d1df8a8573ef2b88a" gracePeriod=135 Oct 14 13:25:47.884931 master-2 kubenswrapper[4762]: I1014 13:25:47.884768 4762 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-2"] Oct 14 13:25:47.885095 master-2 kubenswrapper[4762]: E1014 13:25:47.885061 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9041570beb5002e8da158e70e12f0c16" containerName="kube-apiserver" Oct 14 13:25:47.885095 master-2 kubenswrapper[4762]: I1014 13:25:47.885081 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="9041570beb5002e8da158e70e12f0c16" containerName="kube-apiserver" Oct 14 13:25:47.885348 master-2 kubenswrapper[4762]: E1014 13:25:47.885117 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9041570beb5002e8da158e70e12f0c16" containerName="setup" Oct 14 13:25:47.885348 master-2 kubenswrapper[4762]: I1014 13:25:47.885126 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="9041570beb5002e8da158e70e12f0c16" containerName="setup" Oct 14 13:25:47.885348 master-2 kubenswrapper[4762]: E1014 13:25:47.885137 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9041570beb5002e8da158e70e12f0c16" containerName="kube-apiserver-cert-syncer" Oct 14 13:25:47.885348 master-2 kubenswrapper[4762]: I1014 13:25:47.885145 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="9041570beb5002e8da158e70e12f0c16" containerName="kube-apiserver-cert-syncer" Oct 14 13:25:47.885348 master-2 kubenswrapper[4762]: E1014 13:25:47.885182 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3df39991-f6e7-40dd-b303-eee090ebced0" containerName="installer" Oct 14 13:25:47.885348 master-2 kubenswrapper[4762]: I1014 13:25:47.885192 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="3df39991-f6e7-40dd-b303-eee090ebced0" containerName="installer" Oct 14 13:25:47.885348 master-2 kubenswrapper[4762]: E1014 13:25:47.885211 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9041570beb5002e8da158e70e12f0c16" containerName="kube-apiserver-check-endpoints" Oct 14 13:25:47.885348 master-2 kubenswrapper[4762]: I1014 13:25:47.885218 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="9041570beb5002e8da158e70e12f0c16" containerName="kube-apiserver-check-endpoints" Oct 14 13:25:47.885348 master-2 kubenswrapper[4762]: E1014 13:25:47.885232 4762 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="9041570beb5002e8da158e70e12f0c16" containerName="kube-apiserver-cert-regeneration-controller" Oct 14 13:25:47.885348 master-2 kubenswrapper[4762]: I1014 13:25:47.885265 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="9041570beb5002e8da158e70e12f0c16" containerName="kube-apiserver-cert-regeneration-controller" Oct 14 13:25:47.885348 master-2 kubenswrapper[4762]: E1014 13:25:47.885278 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9041570beb5002e8da158e70e12f0c16" containerName="kube-apiserver-insecure-readyz" Oct 14 13:25:47.885348 master-2 kubenswrapper[4762]: I1014 13:25:47.885285 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="9041570beb5002e8da158e70e12f0c16" containerName="kube-apiserver-insecure-readyz" Oct 14 13:25:47.886230 master-2 kubenswrapper[4762]: I1014 13:25:47.885429 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="9041570beb5002e8da158e70e12f0c16" containerName="kube-apiserver-cert-regeneration-controller" Oct 14 13:25:47.886230 master-2 kubenswrapper[4762]: I1014 13:25:47.885444 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="3df39991-f6e7-40dd-b303-eee090ebced0" containerName="installer" Oct 14 13:25:47.886230 master-2 kubenswrapper[4762]: I1014 13:25:47.885451 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="9041570beb5002e8da158e70e12f0c16" containerName="kube-apiserver-insecure-readyz" Oct 14 13:25:47.886230 master-2 kubenswrapper[4762]: I1014 13:25:47.885462 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="9041570beb5002e8da158e70e12f0c16" containerName="kube-apiserver" Oct 14 13:25:47.886230 master-2 kubenswrapper[4762]: I1014 13:25:47.885471 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="9041570beb5002e8da158e70e12f0c16" containerName="kube-apiserver-check-endpoints" Oct 14 13:25:47.886230 master-2 kubenswrapper[4762]: I1014 13:25:47.885479 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="9041570beb5002e8da158e70e12f0c16" containerName="kube-apiserver-cert-syncer" Oct 14 13:25:48.020961 master-2 kubenswrapper[4762]: I1014 13:25:48.020906 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/978811670a28b21932e323b181b31435-cert-dir\") pod \"kube-apiserver-master-2\" (UID: \"978811670a28b21932e323b181b31435\") " pod="openshift-kube-apiserver/kube-apiserver-master-2" Oct 14 13:25:48.021077 master-2 kubenswrapper[4762]: I1014 13:25:48.020976 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/978811670a28b21932e323b181b31435-resource-dir\") pod \"kube-apiserver-master-2\" (UID: \"978811670a28b21932e323b181b31435\") " pod="openshift-kube-apiserver/kube-apiserver-master-2" Oct 14 13:25:48.021077 master-2 kubenswrapper[4762]: I1014 13:25:48.021051 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/978811670a28b21932e323b181b31435-audit-dir\") pod \"kube-apiserver-master-2\" (UID: \"978811670a28b21932e323b181b31435\") " pod="openshift-kube-apiserver/kube-apiserver-master-2" Oct 14 13:25:48.043114 master-2 kubenswrapper[4762]: I1014 13:25:48.043051 4762 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-2_9041570beb5002e8da158e70e12f0c16/kube-apiserver-cert-syncer/0.log" Oct 14 13:25:48.044131 master-2 kubenswrapper[4762]: I1014 13:25:48.044096 4762 generic.go:334] "Generic (PLEG): container finished" podID="9041570beb5002e8da158e70e12f0c16" containerID="ed0652c68175fc794e2ffa2234dbeed42e01255f6cb9d2ddbb282664b4f7a5ee" exitCode=0 Oct 14 13:25:48.044131 master-2 kubenswrapper[4762]: I1014 13:25:48.044126 4762 generic.go:334] "Generic (PLEG): container finished" podID="9041570beb5002e8da158e70e12f0c16" containerID="39786ac6e5f768ebcc690a6e3de90249c4787a641f1d1315d8f81f3458a6927d" exitCode=0 Oct 14 13:25:48.044258 master-2 kubenswrapper[4762]: I1014 13:25:48.044135 4762 generic.go:334] "Generic (PLEG): container finished" podID="9041570beb5002e8da158e70e12f0c16" containerID="582963d0d550051a59e2bcba881a40e767b0e4ef68aed930b180274688a29410" exitCode=0 Oct 14 13:25:48.044258 master-2 kubenswrapper[4762]: I1014 13:25:48.044145 4762 generic.go:334] "Generic (PLEG): container finished" podID="9041570beb5002e8da158e70e12f0c16" containerID="ce2586126d7884ea6910fe1b15e6aa80b4bc5e5c292cc36d1df8a8573ef2b88a" exitCode=2 Oct 14 13:25:48.046066 master-2 kubenswrapper[4762]: I1014 13:25:48.045995 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" event={"ID":"e90af64e4889c9dc18456b228c1a089c","Type":"ContainerStarted","Data":"dcd38313ddcf7fceb996e4afc00ec7fa720da78cc85ce4bb335519fab0f46af0"} Oct 14 13:25:48.046169 master-2 kubenswrapper[4762]: I1014 13:25:48.046072 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" event={"ID":"e90af64e4889c9dc18456b228c1a089c","Type":"ContainerStarted","Data":"0f9f86f3cc2eaeed72cc8c11fc677f8da0a68f81f6fadbb51a0fc2fbbbd698ec"} Oct 14 13:25:48.081367 master-2 kubenswrapper[4762]: I1014 13:25:48.081306 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Oct 14 13:25:48.122212 master-2 kubenswrapper[4762]: I1014 13:25:48.122096 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/978811670a28b21932e323b181b31435-audit-dir\") pod \"kube-apiserver-master-2\" (UID: \"978811670a28b21932e323b181b31435\") " pod="openshift-kube-apiserver/kube-apiserver-master-2" Oct 14 13:25:48.122212 master-2 kubenswrapper[4762]: I1014 13:25:48.122183 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/978811670a28b21932e323b181b31435-cert-dir\") pod \"kube-apiserver-master-2\" (UID: \"978811670a28b21932e323b181b31435\") " pod="openshift-kube-apiserver/kube-apiserver-master-2" Oct 14 13:25:48.122451 master-2 kubenswrapper[4762]: I1014 13:25:48.122230 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/978811670a28b21932e323b181b31435-resource-dir\") pod \"kube-apiserver-master-2\" (UID: \"978811670a28b21932e323b181b31435\") " pod="openshift-kube-apiserver/kube-apiserver-master-2" Oct 14 13:25:48.122451 master-2 kubenswrapper[4762]: I1014 13:25:48.122240 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/978811670a28b21932e323b181b31435-audit-dir\") pod \"kube-apiserver-master-2\" (UID: 
\"978811670a28b21932e323b181b31435\") " pod="openshift-kube-apiserver/kube-apiserver-master-2" Oct 14 13:25:48.122451 master-2 kubenswrapper[4762]: I1014 13:25:48.122297 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/978811670a28b21932e323b181b31435-resource-dir\") pod \"kube-apiserver-master-2\" (UID: \"978811670a28b21932e323b181b31435\") " pod="openshift-kube-apiserver/kube-apiserver-master-2" Oct 14 13:25:48.122451 master-2 kubenswrapper[4762]: I1014 13:25:48.122328 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/978811670a28b21932e323b181b31435-cert-dir\") pod \"kube-apiserver-master-2\" (UID: \"978811670a28b21932e323b181b31435\") " pod="openshift-kube-apiserver/kube-apiserver-master-2" Oct 14 13:25:48.124058 master-2 kubenswrapper[4762]: I1014 13:25:48.124027 4762 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-2" oldPodUID="9041570beb5002e8da158e70e12f0c16" podUID="978811670a28b21932e323b181b31435" Oct 14 13:25:48.847102 master-2 kubenswrapper[4762]: I1014 13:25:48.847042 4762 patch_prober.go:28] interesting pod/kube-controller-manager-guard-master-2 container/guard namespace/openshift-kube-controller-manager: Readiness probe status=failure output="Get \"https://192.168.34.12:10257/healthz\": dial tcp 192.168.34.12:10257: connect: connection refused" start-of-body= Oct 14 13:25:48.847102 master-2 kubenswrapper[4762]: I1014 13:25:48.847099 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-controller-manager/kube-controller-manager-guard-master-2" podUID="d43a34b7-69dd-43b0-8465-4e44cb687285" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.12:10257/healthz\": dial tcp 192.168.34.12:10257: connect: connection refused" Oct 14 13:25:48.847848 master-2 kubenswrapper[4762]: I1014 13:25:48.847180 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-guard-master-2" Oct 14 13:25:48.849805 master-2 kubenswrapper[4762]: I1014 13:25:48.848234 4762 patch_prober.go:28] interesting pod/kube-controller-manager-guard-master-2 container/guard namespace/openshift-kube-controller-manager: Readiness probe status=failure output="Get \"https://192.168.34.12:10257/healthz\": dial tcp 192.168.34.12:10257: connect: connection refused" start-of-body= Oct 14 13:25:48.849805 master-2 kubenswrapper[4762]: I1014 13:25:48.848422 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-controller-manager/kube-controller-manager-guard-master-2" podUID="d43a34b7-69dd-43b0-8465-4e44cb687285" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.12:10257/healthz\": dial tcp 192.168.34.12:10257: connect: connection refused" Oct 14 13:25:48.860444 master-2 kubenswrapper[4762]: I1014 13:25:48.860366 4762 patch_prober.go:28] interesting pod/kube-apiserver-guard-master-2 container/guard namespace/openshift-kube-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:25:48.860444 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:25:48.860444 master-2 kubenswrapper[4762]: [+]api-openshift-apiserver-available ok Oct 14 13:25:48.860444 master-2 kubenswrapper[4762]: [+]api-openshift-oauth-apiserver-available ok Oct 14 13:25:48.860444 master-2 
kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:25:48.860444 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:25:48.860444 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-api-request-count-filter ok Oct 14 13:25:48.860444 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startkubeinformers ok Oct 14 13:25:48.860444 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Oct 14 13:25:48.860444 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Oct 14 13:25:48.860444 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:25:48.860444 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:25:48.860444 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-consumer ok Oct 14 13:25:48.860444 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-filter ok Oct 14 13:25:48.860444 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:25:48.860444 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiextensions-informers ok Oct 14 13:25:48.860444 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiextensions-controllers ok Oct 14 13:25:48.860444 master-2 kubenswrapper[4762]: [+]poststarthook/crd-informer-synced ok Oct 14 13:25:48.860444 master-2 kubenswrapper[4762]: [+]poststarthook/start-system-namespaces-controller ok Oct 14 13:25:48.860444 master-2 kubenswrapper[4762]: [+]poststarthook/start-cluster-authentication-info-controller ok Oct 14 13:25:48.860444 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Oct 14 13:25:48.860444 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Oct 14 13:25:48.860444 master-2 kubenswrapper[4762]: [+]poststarthook/start-legacy-token-tracking-controller ok Oct 14 13:25:48.860444 master-2 kubenswrapper[4762]: [+]poststarthook/start-service-ip-repair-controllers ok Oct 14 13:25:48.860444 master-2 kubenswrapper[4762]: [+]poststarthook/rbac/bootstrap-roles ok Oct 14 13:25:48.860444 master-2 kubenswrapper[4762]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Oct 14 13:25:48.860444 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-producer ok Oct 14 13:25:48.860444 master-2 kubenswrapper[4762]: [+]poststarthook/bootstrap-controller ok Oct 14 13:25:48.860444 master-2 kubenswrapper[4762]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Oct 14 13:25:48.860444 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-aggregator-informers ok Oct 14 13:25:48.860444 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-status-local-available-controller ok Oct 14 13:25:48.860444 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-status-remote-available-controller ok Oct 14 13:25:48.860444 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-registration-controller ok Oct 14 13:25:48.860444 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-wait-for-first-sync ok Oct 14 13:25:48.860444 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-discovery-controller ok Oct 14 13:25:48.860444 master-2 kubenswrapper[4762]: [+]poststarthook/kube-apiserver-autoregistration ok Oct 14 13:25:48.860444 master-2 kubenswrapper[4762]: [+]autoregister-completion ok Oct 14 13:25:48.860444 master-2 
kubenswrapper[4762]: [+]poststarthook/apiservice-openapi-controller ok Oct 14 13:25:48.860444 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-openapiv3-controller ok Oct 14 13:25:48.860444 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:25:48.860444 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:25:48.862453 master-2 kubenswrapper[4762]: I1014 13:25:48.860456 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-guard-master-2" podUID="a1d6199c-769e-4363-8439-75d433c50528" containerName="guard" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:25:49.059621 master-2 kubenswrapper[4762]: I1014 13:25:49.059461 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" event={"ID":"e90af64e4889c9dc18456b228c1a089c","Type":"ContainerStarted","Data":"776a743a8180442aafe8ad2965521b864cf573e1c2effc728b217189fb74f26b"} Oct 14 13:25:49.059621 master-2 kubenswrapper[4762]: I1014 13:25:49.059534 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" event={"ID":"e90af64e4889c9dc18456b228c1a089c","Type":"ContainerStarted","Data":"5b459cb33b03cbd13d8adaf31d86b85aae7a7bd9ee7d1f8b342520a2ff5155c3"} Oct 14 13:25:49.059621 master-2 kubenswrapper[4762]: I1014 13:25:49.059555 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" event={"ID":"e90af64e4889c9dc18456b228c1a089c","Type":"ContainerStarted","Data":"9b896a17ad8c2e3e91493877956f61f9277e1b81fa4596da9fa10ceb88bf2d00"} Oct 14 13:25:49.064025 master-2 kubenswrapper[4762]: I1014 13:25:49.063978 4762 generic.go:334] "Generic (PLEG): container finished" podID="6c8aa71b-6051-407f-9cfa-b25f931ed568" containerID="2f922305b172b9f374e6b418b0a76088fac2726a159c6e0ea739594613f99e23" exitCode=0 Oct 14 13:25:49.064266 master-2 kubenswrapper[4762]: I1014 13:25:49.064040 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-6-master-2" event={"ID":"6c8aa71b-6051-407f-9cfa-b25f931ed568","Type":"ContainerDied","Data":"2f922305b172b9f374e6b418b0a76088fac2726a159c6e0ea739594613f99e23"} Oct 14 13:25:49.095855 master-2 kubenswrapper[4762]: I1014 13:25:49.095790 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" podStartSLOduration=2.095770189 podStartE2EDuration="2.095770189s" podCreationTimestamp="2025-10-14 13:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:25:49.090670646 +0000 UTC m=+1178.334829815" watchObservedRunningTime="2025-10-14 13:25:49.095770189 +0000 UTC m=+1178.339929348" Oct 14 13:25:50.485121 master-2 kubenswrapper[4762]: I1014 13:25:50.485057 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-6-master-2" Oct 14 13:25:50.662494 master-2 kubenswrapper[4762]: I1014 13:25:50.662414 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6c8aa71b-6051-407f-9cfa-b25f931ed568-kube-api-access\") pod \"6c8aa71b-6051-407f-9cfa-b25f931ed568\" (UID: \"6c8aa71b-6051-407f-9cfa-b25f931ed568\") " Oct 14 13:25:50.662796 master-2 kubenswrapper[4762]: I1014 13:25:50.662646 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6c8aa71b-6051-407f-9cfa-b25f931ed568-var-lock\") pod \"6c8aa71b-6051-407f-9cfa-b25f931ed568\" (UID: \"6c8aa71b-6051-407f-9cfa-b25f931ed568\") " Oct 14 13:25:50.662796 master-2 kubenswrapper[4762]: I1014 13:25:50.662680 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6c8aa71b-6051-407f-9cfa-b25f931ed568-kubelet-dir\") pod \"6c8aa71b-6051-407f-9cfa-b25f931ed568\" (UID: \"6c8aa71b-6051-407f-9cfa-b25f931ed568\") " Oct 14 13:25:50.662796 master-2 kubenswrapper[4762]: I1014 13:25:50.662721 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6c8aa71b-6051-407f-9cfa-b25f931ed568-var-lock" (OuterVolumeSpecName: "var-lock") pod "6c8aa71b-6051-407f-9cfa-b25f931ed568" (UID: "6c8aa71b-6051-407f-9cfa-b25f931ed568"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:25:50.663007 master-2 kubenswrapper[4762]: I1014 13:25:50.662924 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6c8aa71b-6051-407f-9cfa-b25f931ed568-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6c8aa71b-6051-407f-9cfa-b25f931ed568" (UID: "6c8aa71b-6051-407f-9cfa-b25f931ed568"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:25:50.663451 master-2 kubenswrapper[4762]: I1014 13:25:50.663208 4762 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6c8aa71b-6051-407f-9cfa-b25f931ed568-var-lock\") on node \"master-2\" DevicePath \"\"" Oct 14 13:25:50.663451 master-2 kubenswrapper[4762]: I1014 13:25:50.663238 4762 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6c8aa71b-6051-407f-9cfa-b25f931ed568-kubelet-dir\") on node \"master-2\" DevicePath \"\"" Oct 14 13:25:50.666497 master-2 kubenswrapper[4762]: I1014 13:25:50.666443 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c8aa71b-6051-407f-9cfa-b25f931ed568-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6c8aa71b-6051-407f-9cfa-b25f931ed568" (UID: "6c8aa71b-6051-407f-9cfa-b25f931ed568"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:25:50.764125 master-2 kubenswrapper[4762]: I1014 13:25:50.764022 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6c8aa71b-6051-407f-9cfa-b25f931ed568-kube-api-access\") on node \"master-2\" DevicePath \"\"" Oct 14 13:25:51.079565 master-2 kubenswrapper[4762]: I1014 13:25:51.079403 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-6-master-2" event={"ID":"6c8aa71b-6051-407f-9cfa-b25f931ed568","Type":"ContainerDied","Data":"bb3e9d761a3920b380100da3892704783b6c4fd04048664e495453b42cdf67a0"} Oct 14 13:25:51.079565 master-2 kubenswrapper[4762]: I1014 13:25:51.079449 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb3e9d761a3920b380100da3892704783b6c4fd04048664e495453b42cdf67a0" Oct 14 13:25:51.079565 master-2 kubenswrapper[4762]: I1014 13:25:51.079517 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-6-master-2" Oct 14 13:25:53.847329 master-2 kubenswrapper[4762]: I1014 13:25:53.847266 4762 patch_prober.go:28] interesting pod/kube-controller-manager-guard-master-2 container/guard namespace/openshift-kube-controller-manager: Readiness probe status=failure output="Get \"https://192.168.34.12:10257/healthz\": dial tcp 192.168.34.12:10257: connect: connection refused" start-of-body= Oct 14 13:25:53.847927 master-2 kubenswrapper[4762]: I1014 13:25:53.847332 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-controller-manager/kube-controller-manager-guard-master-2" podUID="d43a34b7-69dd-43b0-8465-4e44cb687285" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.12:10257/healthz\": dial tcp 192.168.34.12:10257: connect: connection refused" Oct 14 13:25:53.860106 master-2 kubenswrapper[4762]: I1014 13:25:53.859755 4762 patch_prober.go:28] interesting pod/kube-apiserver-guard-master-2 container/guard namespace/openshift-kube-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:25:53.860106 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:25:53.860106 master-2 kubenswrapper[4762]: [+]api-openshift-apiserver-available ok Oct 14 13:25:53.860106 master-2 kubenswrapper[4762]: [+]api-openshift-oauth-apiserver-available ok Oct 14 13:25:53.860106 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:25:53.860106 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:25:53.860106 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-api-request-count-filter ok Oct 14 13:25:53.860106 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startkubeinformers ok Oct 14 13:25:53.860106 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Oct 14 13:25:53.860106 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Oct 14 13:25:53.860106 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:25:53.860106 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:25:53.860106 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-consumer ok Oct 14 13:25:53.860106 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-filter ok Oct 14 13:25:53.860106 master-2 
kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:25:53.860106 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiextensions-informers ok Oct 14 13:25:53.860106 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiextensions-controllers ok Oct 14 13:25:53.860106 master-2 kubenswrapper[4762]: [+]poststarthook/crd-informer-synced ok Oct 14 13:25:53.860106 master-2 kubenswrapper[4762]: [+]poststarthook/start-system-namespaces-controller ok Oct 14 13:25:53.860106 master-2 kubenswrapper[4762]: [+]poststarthook/start-cluster-authentication-info-controller ok Oct 14 13:25:53.860106 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Oct 14 13:25:53.860106 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Oct 14 13:25:53.860106 master-2 kubenswrapper[4762]: [+]poststarthook/start-legacy-token-tracking-controller ok Oct 14 13:25:53.860106 master-2 kubenswrapper[4762]: [+]poststarthook/start-service-ip-repair-controllers ok Oct 14 13:25:53.860106 master-2 kubenswrapper[4762]: [+]poststarthook/rbac/bootstrap-roles ok Oct 14 13:25:53.860106 master-2 kubenswrapper[4762]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Oct 14 13:25:53.860106 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-producer ok Oct 14 13:25:53.860106 master-2 kubenswrapper[4762]: [+]poststarthook/bootstrap-controller ok Oct 14 13:25:53.860106 master-2 kubenswrapper[4762]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Oct 14 13:25:53.860106 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-aggregator-informers ok Oct 14 13:25:53.860106 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-status-local-available-controller ok Oct 14 13:25:53.860106 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-status-remote-available-controller ok Oct 14 13:25:53.860106 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-registration-controller ok Oct 14 13:25:53.860106 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-wait-for-first-sync ok Oct 14 13:25:53.860106 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-discovery-controller ok Oct 14 13:25:53.860106 master-2 kubenswrapper[4762]: [+]poststarthook/kube-apiserver-autoregistration ok Oct 14 13:25:53.860106 master-2 kubenswrapper[4762]: [+]autoregister-completion ok Oct 14 13:25:53.860106 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-openapi-controller ok Oct 14 13:25:53.860106 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-openapiv3-controller ok Oct 14 13:25:53.860106 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:25:53.860106 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:25:53.860106 master-2 kubenswrapper[4762]: I1014 13:25:53.859848 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-guard-master-2" podUID="a1d6199c-769e-4363-8439-75d433c50528" containerName="guard" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:25:57.684505 master-2 kubenswrapper[4762]: I1014 13:25:57.684426 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" Oct 14 13:25:57.684505 master-2 kubenswrapper[4762]: I1014 13:25:57.684508 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-kube-controller-manager/kube-controller-manager-master-2" Oct 14 13:25:57.684505 master-2 kubenswrapper[4762]: I1014 13:25:57.684530 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" Oct 14 13:25:57.685645 master-2 kubenswrapper[4762]: I1014 13:25:57.684551 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" Oct 14 13:25:57.685645 master-2 kubenswrapper[4762]: I1014 13:25:57.684811 4762 patch_prober.go:28] interesting pod/kube-controller-manager-master-2 container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.34.12:10257/healthz\": dial tcp 192.168.34.12:10257: connect: connection refused" start-of-body= Oct 14 13:25:57.685645 master-2 kubenswrapper[4762]: I1014 13:25:57.684892 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" podUID="e90af64e4889c9dc18456b228c1a089c" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.34.12:10257/healthz\": dial tcp 192.168.34.12:10257: connect: connection refused" Oct 14 13:25:57.692899 master-2 kubenswrapper[4762]: I1014 13:25:57.692821 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" Oct 14 13:25:58.147068 master-2 kubenswrapper[4762]: I1014 13:25:58.146911 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" Oct 14 13:25:58.847431 master-2 kubenswrapper[4762]: I1014 13:25:58.847362 4762 patch_prober.go:28] interesting pod/kube-controller-manager-guard-master-2 container/guard namespace/openshift-kube-controller-manager: Readiness probe status=failure output="Get \"https://192.168.34.12:10257/healthz\": dial tcp 192.168.34.12:10257: connect: connection refused" start-of-body= Oct 14 13:25:58.847913 master-2 kubenswrapper[4762]: I1014 13:25:58.847446 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-controller-manager/kube-controller-manager-guard-master-2" podUID="d43a34b7-69dd-43b0-8465-4e44cb687285" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.12:10257/healthz\": dial tcp 192.168.34.12:10257: connect: connection refused" Oct 14 13:25:58.857323 master-2 kubenswrapper[4762]: I1014 13:25:58.857254 4762 patch_prober.go:28] interesting pod/kube-apiserver-guard-master-2 container/guard namespace/openshift-kube-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:25:58.857323 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:25:58.857323 master-2 kubenswrapper[4762]: [+]api-openshift-apiserver-available ok Oct 14 13:25:58.857323 master-2 kubenswrapper[4762]: [+]api-openshift-oauth-apiserver-available ok Oct 14 13:25:58.857323 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:25:58.857323 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:25:58.857323 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-api-request-count-filter ok Oct 14 13:25:58.857323 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startkubeinformers ok Oct 14 13:25:58.857323 master-2 kubenswrapper[4762]: 
[+]poststarthook/openshift.io-openshift-apiserver-reachable ok Oct 14 13:25:58.857323 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Oct 14 13:25:58.857323 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:25:58.857323 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:25:58.857323 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-consumer ok Oct 14 13:25:58.857323 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-filter ok Oct 14 13:25:58.857323 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:25:58.857323 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiextensions-informers ok Oct 14 13:25:58.857323 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiextensions-controllers ok Oct 14 13:25:58.857323 master-2 kubenswrapper[4762]: [+]poststarthook/crd-informer-synced ok Oct 14 13:25:58.857323 master-2 kubenswrapper[4762]: [+]poststarthook/start-system-namespaces-controller ok Oct 14 13:25:58.857323 master-2 kubenswrapper[4762]: [+]poststarthook/start-cluster-authentication-info-controller ok Oct 14 13:25:58.857323 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Oct 14 13:25:58.857323 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Oct 14 13:25:58.857323 master-2 kubenswrapper[4762]: [+]poststarthook/start-legacy-token-tracking-controller ok Oct 14 13:25:58.857323 master-2 kubenswrapper[4762]: [+]poststarthook/start-service-ip-repair-controllers ok Oct 14 13:25:58.857323 master-2 kubenswrapper[4762]: [+]poststarthook/rbac/bootstrap-roles ok Oct 14 13:25:58.857323 master-2 kubenswrapper[4762]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Oct 14 13:25:58.857323 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-producer ok Oct 14 13:25:58.857323 master-2 kubenswrapper[4762]: [+]poststarthook/bootstrap-controller ok Oct 14 13:25:58.857323 master-2 kubenswrapper[4762]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Oct 14 13:25:58.857323 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-aggregator-informers ok Oct 14 13:25:58.857323 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-status-local-available-controller ok Oct 14 13:25:58.857323 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-status-remote-available-controller ok Oct 14 13:25:58.857323 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-registration-controller ok Oct 14 13:25:58.857323 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-wait-for-first-sync ok Oct 14 13:25:58.857323 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-discovery-controller ok Oct 14 13:25:58.857323 master-2 kubenswrapper[4762]: [+]poststarthook/kube-apiserver-autoregistration ok Oct 14 13:25:58.857323 master-2 kubenswrapper[4762]: [+]autoregister-completion ok Oct 14 13:25:58.857323 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-openapi-controller ok Oct 14 13:25:58.857323 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-openapiv3-controller ok Oct 14 13:25:58.857323 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:25:58.857323 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:25:58.858577 master-2 kubenswrapper[4762]: I1014 13:25:58.857335 4762 prober.go:107] "Probe 
failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-guard-master-2" podUID="a1d6199c-769e-4363-8439-75d433c50528" containerName="guard" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:25:58.858577 master-2 kubenswrapper[4762]: I1014 13:25:58.857518 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-guard-master-2" Oct 14 13:25:58.865456 master-2 kubenswrapper[4762]: I1014 13:25:58.865389 4762 patch_prober.go:28] interesting pod/kube-apiserver-guard-master-2 container/guard namespace/openshift-kube-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:25:58.865456 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:25:58.865456 master-2 kubenswrapper[4762]: [+]api-openshift-apiserver-available ok Oct 14 13:25:58.865456 master-2 kubenswrapper[4762]: [+]api-openshift-oauth-apiserver-available ok Oct 14 13:25:58.865456 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:25:58.865456 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:25:58.865456 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-api-request-count-filter ok Oct 14 13:25:58.865456 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startkubeinformers ok Oct 14 13:25:58.865456 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Oct 14 13:25:58.865456 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Oct 14 13:25:58.865456 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:25:58.865456 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:25:58.865456 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-consumer ok Oct 14 13:25:58.865456 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-filter ok Oct 14 13:25:58.865456 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:25:58.865456 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiextensions-informers ok Oct 14 13:25:58.865456 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiextensions-controllers ok Oct 14 13:25:58.865456 master-2 kubenswrapper[4762]: [+]poststarthook/crd-informer-synced ok Oct 14 13:25:58.865456 master-2 kubenswrapper[4762]: [+]poststarthook/start-system-namespaces-controller ok Oct 14 13:25:58.865456 master-2 kubenswrapper[4762]: [+]poststarthook/start-cluster-authentication-info-controller ok Oct 14 13:25:58.865456 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Oct 14 13:25:58.865456 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Oct 14 13:25:58.865456 master-2 kubenswrapper[4762]: [+]poststarthook/start-legacy-token-tracking-controller ok Oct 14 13:25:58.865456 master-2 kubenswrapper[4762]: [+]poststarthook/start-service-ip-repair-controllers ok Oct 14 13:25:58.865456 master-2 kubenswrapper[4762]: [+]poststarthook/rbac/bootstrap-roles ok Oct 14 13:25:58.865456 master-2 kubenswrapper[4762]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Oct 14 13:25:58.865456 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-producer ok Oct 14 13:25:58.865456 master-2 kubenswrapper[4762]: 
[+]poststarthook/bootstrap-controller ok Oct 14 13:25:58.865456 master-2 kubenswrapper[4762]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Oct 14 13:25:58.865456 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-aggregator-informers ok Oct 14 13:25:58.865456 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-status-local-available-controller ok Oct 14 13:25:58.865456 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-status-remote-available-controller ok Oct 14 13:25:58.865456 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-registration-controller ok Oct 14 13:25:58.865456 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-wait-for-first-sync ok Oct 14 13:25:58.865456 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-discovery-controller ok Oct 14 13:25:58.865456 master-2 kubenswrapper[4762]: [+]poststarthook/kube-apiserver-autoregistration ok Oct 14 13:25:58.865456 master-2 kubenswrapper[4762]: [+]autoregister-completion ok Oct 14 13:25:58.865456 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-openapi-controller ok Oct 14 13:25:58.865456 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-openapiv3-controller ok Oct 14 13:25:58.865456 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:25:58.865456 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:25:58.866645 master-2 kubenswrapper[4762]: I1014 13:25:58.865479 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-guard-master-2" podUID="a1d6199c-769e-4363-8439-75d433c50528" containerName="guard" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:26:03.848146 master-2 kubenswrapper[4762]: I1014 13:26:03.848069 4762 patch_prober.go:28] interesting pod/kube-controller-manager-guard-master-2 container/guard namespace/openshift-kube-controller-manager: Readiness probe status=failure output="Get \"https://192.168.34.12:10257/healthz\": dial tcp 192.168.34.12:10257: connect: connection refused" start-of-body= Oct 14 13:26:03.848973 master-2 kubenswrapper[4762]: I1014 13:26:03.848197 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-controller-manager/kube-controller-manager-guard-master-2" podUID="d43a34b7-69dd-43b0-8465-4e44cb687285" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.12:10257/healthz\": dial tcp 192.168.34.12:10257: connect: connection refused" Oct 14 13:26:03.858426 master-2 kubenswrapper[4762]: I1014 13:26:03.858061 4762 patch_prober.go:28] interesting pod/kube-apiserver-guard-master-2 container/guard namespace/openshift-kube-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:26:03.858426 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:26:03.858426 master-2 kubenswrapper[4762]: [+]api-openshift-apiserver-available ok Oct 14 13:26:03.858426 master-2 kubenswrapper[4762]: [+]api-openshift-oauth-apiserver-available ok Oct 14 13:26:03.858426 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:26:03.858426 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:26:03.858426 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-api-request-count-filter ok Oct 14 13:26:03.858426 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startkubeinformers ok Oct 14 13:26:03.858426 master-2 kubenswrapper[4762]: 
[+]poststarthook/openshift.io-openshift-apiserver-reachable ok Oct 14 13:26:03.858426 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Oct 14 13:26:03.858426 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:26:03.858426 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:26:03.858426 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-consumer ok Oct 14 13:26:03.858426 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-filter ok Oct 14 13:26:03.858426 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:26:03.858426 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiextensions-informers ok Oct 14 13:26:03.858426 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiextensions-controllers ok Oct 14 13:26:03.858426 master-2 kubenswrapper[4762]: [+]poststarthook/crd-informer-synced ok Oct 14 13:26:03.858426 master-2 kubenswrapper[4762]: [+]poststarthook/start-system-namespaces-controller ok Oct 14 13:26:03.858426 master-2 kubenswrapper[4762]: [+]poststarthook/start-cluster-authentication-info-controller ok Oct 14 13:26:03.858426 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Oct 14 13:26:03.858426 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Oct 14 13:26:03.858426 master-2 kubenswrapper[4762]: [+]poststarthook/start-legacy-token-tracking-controller ok Oct 14 13:26:03.858426 master-2 kubenswrapper[4762]: [+]poststarthook/start-service-ip-repair-controllers ok Oct 14 13:26:03.858426 master-2 kubenswrapper[4762]: [+]poststarthook/rbac/bootstrap-roles ok Oct 14 13:26:03.858426 master-2 kubenswrapper[4762]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Oct 14 13:26:03.858426 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-producer ok Oct 14 13:26:03.858426 master-2 kubenswrapper[4762]: [+]poststarthook/bootstrap-controller ok Oct 14 13:26:03.858426 master-2 kubenswrapper[4762]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Oct 14 13:26:03.858426 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-aggregator-informers ok Oct 14 13:26:03.858426 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-status-local-available-controller ok Oct 14 13:26:03.858426 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-status-remote-available-controller ok Oct 14 13:26:03.858426 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-registration-controller ok Oct 14 13:26:03.858426 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-wait-for-first-sync ok Oct 14 13:26:03.858426 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-discovery-controller ok Oct 14 13:26:03.858426 master-2 kubenswrapper[4762]: [+]poststarthook/kube-apiserver-autoregistration ok Oct 14 13:26:03.858426 master-2 kubenswrapper[4762]: [+]autoregister-completion ok Oct 14 13:26:03.858426 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-openapi-controller ok Oct 14 13:26:03.858426 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-openapiv3-controller ok Oct 14 13:26:03.858426 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:26:03.858426 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:26:03.858426 master-2 kubenswrapper[4762]: I1014 13:26:03.858133 4762 prober.go:107] "Probe 
failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-guard-master-2" podUID="a1d6199c-769e-4363-8439-75d433c50528" containerName="guard" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:26:07.691033 master-2 kubenswrapper[4762]: I1014 13:26:07.690931 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" Oct 14 13:26:07.699610 master-2 kubenswrapper[4762]: I1014 13:26:07.699508 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-2" Oct 14 13:26:08.853197 master-2 kubenswrapper[4762]: I1014 13:26:08.853049 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-guard-master-2" Oct 14 13:26:08.858490 master-2 kubenswrapper[4762]: I1014 13:26:08.858411 4762 patch_prober.go:28] interesting pod/kube-apiserver-guard-master-2 container/guard namespace/openshift-kube-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:26:08.858490 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:26:08.858490 master-2 kubenswrapper[4762]: [+]api-openshift-apiserver-available ok Oct 14 13:26:08.858490 master-2 kubenswrapper[4762]: [+]api-openshift-oauth-apiserver-available ok Oct 14 13:26:08.858490 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:26:08.858490 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:26:08.858490 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-api-request-count-filter ok Oct 14 13:26:08.858490 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startkubeinformers ok Oct 14 13:26:08.858490 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Oct 14 13:26:08.858490 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Oct 14 13:26:08.858490 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:26:08.858490 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:26:08.858490 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-consumer ok Oct 14 13:26:08.858490 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-filter ok Oct 14 13:26:08.858490 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:26:08.858490 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiextensions-informers ok Oct 14 13:26:08.858490 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiextensions-controllers ok Oct 14 13:26:08.858490 master-2 kubenswrapper[4762]: [+]poststarthook/crd-informer-synced ok Oct 14 13:26:08.858490 master-2 kubenswrapper[4762]: [+]poststarthook/start-system-namespaces-controller ok Oct 14 13:26:08.858490 master-2 kubenswrapper[4762]: [+]poststarthook/start-cluster-authentication-info-controller ok Oct 14 13:26:08.858490 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Oct 14 13:26:08.858490 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Oct 14 13:26:08.858490 master-2 kubenswrapper[4762]: [+]poststarthook/start-legacy-token-tracking-controller ok Oct 14 13:26:08.858490 master-2 
kubenswrapper[4762]: [+]poststarthook/start-service-ip-repair-controllers ok Oct 14 13:26:08.858490 master-2 kubenswrapper[4762]: [+]poststarthook/rbac/bootstrap-roles ok Oct 14 13:26:08.858490 master-2 kubenswrapper[4762]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Oct 14 13:26:08.858490 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-producer ok Oct 14 13:26:08.858490 master-2 kubenswrapper[4762]: [+]poststarthook/bootstrap-controller ok Oct 14 13:26:08.858490 master-2 kubenswrapper[4762]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Oct 14 13:26:08.858490 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-aggregator-informers ok Oct 14 13:26:08.858490 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-status-local-available-controller ok Oct 14 13:26:08.858490 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-status-remote-available-controller ok Oct 14 13:26:08.858490 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-registration-controller ok Oct 14 13:26:08.858490 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-wait-for-first-sync ok Oct 14 13:26:08.858490 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-discovery-controller ok Oct 14 13:26:08.858490 master-2 kubenswrapper[4762]: [+]poststarthook/kube-apiserver-autoregistration ok Oct 14 13:26:08.858490 master-2 kubenswrapper[4762]: [+]autoregister-completion ok Oct 14 13:26:08.858490 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-openapi-controller ok Oct 14 13:26:08.858490 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-openapiv3-controller ok Oct 14 13:26:08.858490 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:26:08.858490 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:26:08.860701 master-2 kubenswrapper[4762]: I1014 13:26:08.858488 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-guard-master-2" podUID="a1d6199c-769e-4363-8439-75d433c50528" containerName="guard" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:26:13.860347 master-2 kubenswrapper[4762]: I1014 13:26:13.860258 4762 patch_prober.go:28] interesting pod/kube-apiserver-guard-master-2 container/guard namespace/openshift-kube-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:26:13.860347 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:26:13.860347 master-2 kubenswrapper[4762]: [+]api-openshift-apiserver-available ok Oct 14 13:26:13.860347 master-2 kubenswrapper[4762]: [+]api-openshift-oauth-apiserver-available ok Oct 14 13:26:13.860347 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:26:13.860347 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:26:13.860347 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-api-request-count-filter ok Oct 14 13:26:13.860347 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startkubeinformers ok Oct 14 13:26:13.860347 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Oct 14 13:26:13.860347 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Oct 14 13:26:13.860347 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:26:13.860347 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok 
Oct 14 13:26:13.860347 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-consumer ok Oct 14 13:26:13.860347 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-filter ok Oct 14 13:26:13.860347 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:26:13.860347 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiextensions-informers ok Oct 14 13:26:13.860347 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiextensions-controllers ok Oct 14 13:26:13.860347 master-2 kubenswrapper[4762]: [+]poststarthook/crd-informer-synced ok Oct 14 13:26:13.860347 master-2 kubenswrapper[4762]: [+]poststarthook/start-system-namespaces-controller ok Oct 14 13:26:13.860347 master-2 kubenswrapper[4762]: [+]poststarthook/start-cluster-authentication-info-controller ok Oct 14 13:26:13.860347 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Oct 14 13:26:13.860347 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Oct 14 13:26:13.860347 master-2 kubenswrapper[4762]: [+]poststarthook/start-legacy-token-tracking-controller ok Oct 14 13:26:13.860347 master-2 kubenswrapper[4762]: [+]poststarthook/start-service-ip-repair-controllers ok Oct 14 13:26:13.860347 master-2 kubenswrapper[4762]: [+]poststarthook/rbac/bootstrap-roles ok Oct 14 13:26:13.860347 master-2 kubenswrapper[4762]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Oct 14 13:26:13.860347 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-producer ok Oct 14 13:26:13.860347 master-2 kubenswrapper[4762]: [+]poststarthook/bootstrap-controller ok Oct 14 13:26:13.860347 master-2 kubenswrapper[4762]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Oct 14 13:26:13.860347 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-aggregator-informers ok Oct 14 13:26:13.860347 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-status-local-available-controller ok Oct 14 13:26:13.860347 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-status-remote-available-controller ok Oct 14 13:26:13.860347 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-registration-controller ok Oct 14 13:26:13.860347 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-wait-for-first-sync ok Oct 14 13:26:13.860347 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-discovery-controller ok Oct 14 13:26:13.860347 master-2 kubenswrapper[4762]: [+]poststarthook/kube-apiserver-autoregistration ok Oct 14 13:26:13.860347 master-2 kubenswrapper[4762]: [+]autoregister-completion ok Oct 14 13:26:13.860347 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-openapi-controller ok Oct 14 13:26:13.860347 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-openapiv3-controller ok Oct 14 13:26:13.860347 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:26:13.860347 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:26:13.863790 master-2 kubenswrapper[4762]: I1014 13:26:13.860372 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-guard-master-2" podUID="a1d6199c-769e-4363-8439-75d433c50528" containerName="guard" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:26:18.857520 master-2 kubenswrapper[4762]: I1014 13:26:18.857453 4762 patch_prober.go:28] interesting pod/kube-apiserver-guard-master-2 container/guard 
namespace/openshift-kube-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:26:18.857520 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:26:18.857520 master-2 kubenswrapper[4762]: [+]api-openshift-apiserver-available ok Oct 14 13:26:18.857520 master-2 kubenswrapper[4762]: [+]api-openshift-oauth-apiserver-available ok Oct 14 13:26:18.857520 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:26:18.857520 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:26:18.857520 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-api-request-count-filter ok Oct 14 13:26:18.857520 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startkubeinformers ok Oct 14 13:26:18.857520 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Oct 14 13:26:18.857520 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Oct 14 13:26:18.857520 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:26:18.857520 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:26:18.857520 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-consumer ok Oct 14 13:26:18.857520 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-filter ok Oct 14 13:26:18.857520 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:26:18.857520 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiextensions-informers ok Oct 14 13:26:18.857520 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiextensions-controllers ok Oct 14 13:26:18.857520 master-2 kubenswrapper[4762]: [+]poststarthook/crd-informer-synced ok Oct 14 13:26:18.857520 master-2 kubenswrapper[4762]: [+]poststarthook/start-system-namespaces-controller ok Oct 14 13:26:18.857520 master-2 kubenswrapper[4762]: [+]poststarthook/start-cluster-authentication-info-controller ok Oct 14 13:26:18.857520 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Oct 14 13:26:18.857520 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Oct 14 13:26:18.857520 master-2 kubenswrapper[4762]: [+]poststarthook/start-legacy-token-tracking-controller ok Oct 14 13:26:18.857520 master-2 kubenswrapper[4762]: [+]poststarthook/start-service-ip-repair-controllers ok Oct 14 13:26:18.857520 master-2 kubenswrapper[4762]: [+]poststarthook/rbac/bootstrap-roles ok Oct 14 13:26:18.857520 master-2 kubenswrapper[4762]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Oct 14 13:26:18.857520 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-producer ok Oct 14 13:26:18.857520 master-2 kubenswrapper[4762]: [+]poststarthook/bootstrap-controller ok Oct 14 13:26:18.857520 master-2 kubenswrapper[4762]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Oct 14 13:26:18.857520 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-aggregator-informers ok Oct 14 13:26:18.857520 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-status-local-available-controller ok Oct 14 13:26:18.857520 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-status-remote-available-controller ok Oct 14 13:26:18.857520 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-registration-controller ok Oct 14 
13:26:18.857520 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-wait-for-first-sync ok Oct 14 13:26:18.857520 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-discovery-controller ok Oct 14 13:26:18.857520 master-2 kubenswrapper[4762]: [+]poststarthook/kube-apiserver-autoregistration ok Oct 14 13:26:18.857520 master-2 kubenswrapper[4762]: [+]autoregister-completion ok Oct 14 13:26:18.857520 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-openapi-controller ok Oct 14 13:26:18.857520 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-openapiv3-controller ok Oct 14 13:26:18.857520 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:26:18.857520 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:26:18.861418 master-2 kubenswrapper[4762]: I1014 13:26:18.857554 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-guard-master-2" podUID="a1d6199c-769e-4363-8439-75d433c50528" containerName="guard" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:26:23.857780 master-2 kubenswrapper[4762]: I1014 13:26:23.857702 4762 patch_prober.go:28] interesting pod/kube-apiserver-guard-master-2 container/guard namespace/openshift-kube-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:26:23.857780 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:26:23.857780 master-2 kubenswrapper[4762]: [+]api-openshift-apiserver-available ok Oct 14 13:26:23.857780 master-2 kubenswrapper[4762]: [+]api-openshift-oauth-apiserver-available ok Oct 14 13:26:23.857780 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:26:23.857780 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:26:23.857780 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-api-request-count-filter ok Oct 14 13:26:23.857780 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startkubeinformers ok Oct 14 13:26:23.857780 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Oct 14 13:26:23.857780 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Oct 14 13:26:23.857780 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:26:23.857780 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:26:23.857780 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-consumer ok Oct 14 13:26:23.857780 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-filter ok Oct 14 13:26:23.857780 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:26:23.857780 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiextensions-informers ok Oct 14 13:26:23.857780 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiextensions-controllers ok Oct 14 13:26:23.857780 master-2 kubenswrapper[4762]: [+]poststarthook/crd-informer-synced ok Oct 14 13:26:23.857780 master-2 kubenswrapper[4762]: [+]poststarthook/start-system-namespaces-controller ok Oct 14 13:26:23.857780 master-2 kubenswrapper[4762]: [+]poststarthook/start-cluster-authentication-info-controller ok Oct 14 13:26:23.857780 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Oct 14 13:26:23.857780 master-2 kubenswrapper[4762]: 
[+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Oct 14 13:26:23.857780 master-2 kubenswrapper[4762]: [+]poststarthook/start-legacy-token-tracking-controller ok Oct 14 13:26:23.857780 master-2 kubenswrapper[4762]: [+]poststarthook/start-service-ip-repair-controllers ok Oct 14 13:26:23.857780 master-2 kubenswrapper[4762]: [+]poststarthook/rbac/bootstrap-roles ok Oct 14 13:26:23.857780 master-2 kubenswrapper[4762]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Oct 14 13:26:23.857780 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-producer ok Oct 14 13:26:23.857780 master-2 kubenswrapper[4762]: [+]poststarthook/bootstrap-controller ok Oct 14 13:26:23.857780 master-2 kubenswrapper[4762]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Oct 14 13:26:23.857780 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-aggregator-informers ok Oct 14 13:26:23.857780 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-status-local-available-controller ok Oct 14 13:26:23.857780 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-status-remote-available-controller ok Oct 14 13:26:23.857780 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-registration-controller ok Oct 14 13:26:23.857780 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-wait-for-first-sync ok Oct 14 13:26:23.857780 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-discovery-controller ok Oct 14 13:26:23.857780 master-2 kubenswrapper[4762]: [+]poststarthook/kube-apiserver-autoregistration ok Oct 14 13:26:23.857780 master-2 kubenswrapper[4762]: [+]autoregister-completion ok Oct 14 13:26:23.857780 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-openapi-controller ok Oct 14 13:26:23.857780 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-openapiv3-controller ok Oct 14 13:26:23.857780 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:26:23.857780 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:26:23.859478 master-2 kubenswrapper[4762]: I1014 13:26:23.857804 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-guard-master-2" podUID="a1d6199c-769e-4363-8439-75d433c50528" containerName="guard" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:26:28.859375 master-2 kubenswrapper[4762]: I1014 13:26:28.859300 4762 patch_prober.go:28] interesting pod/kube-apiserver-guard-master-2 container/guard namespace/openshift-kube-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:26:28.859375 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:26:28.859375 master-2 kubenswrapper[4762]: [+]api-openshift-apiserver-available ok Oct 14 13:26:28.859375 master-2 kubenswrapper[4762]: [+]api-openshift-oauth-apiserver-available ok Oct 14 13:26:28.859375 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:26:28.859375 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:26:28.859375 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-api-request-count-filter ok Oct 14 13:26:28.859375 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startkubeinformers ok Oct 14 13:26:28.859375 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Oct 14 13:26:28.859375 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok 
Oct 14 13:26:28.859375 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:26:28.859375 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:26:28.859375 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-consumer ok Oct 14 13:26:28.859375 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-filter ok Oct 14 13:26:28.859375 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:26:28.859375 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiextensions-informers ok Oct 14 13:26:28.859375 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiextensions-controllers ok Oct 14 13:26:28.859375 master-2 kubenswrapper[4762]: [+]poststarthook/crd-informer-synced ok Oct 14 13:26:28.859375 master-2 kubenswrapper[4762]: [+]poststarthook/start-system-namespaces-controller ok Oct 14 13:26:28.859375 master-2 kubenswrapper[4762]: [+]poststarthook/start-cluster-authentication-info-controller ok Oct 14 13:26:28.859375 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Oct 14 13:26:28.859375 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Oct 14 13:26:28.859375 master-2 kubenswrapper[4762]: [+]poststarthook/start-legacy-token-tracking-controller ok Oct 14 13:26:28.859375 master-2 kubenswrapper[4762]: [+]poststarthook/start-service-ip-repair-controllers ok Oct 14 13:26:28.859375 master-2 kubenswrapper[4762]: [+]poststarthook/rbac/bootstrap-roles ok Oct 14 13:26:28.859375 master-2 kubenswrapper[4762]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Oct 14 13:26:28.859375 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-producer ok Oct 14 13:26:28.859375 master-2 kubenswrapper[4762]: [+]poststarthook/bootstrap-controller ok Oct 14 13:26:28.859375 master-2 kubenswrapper[4762]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Oct 14 13:26:28.859375 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-aggregator-informers ok Oct 14 13:26:28.859375 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-status-local-available-controller ok Oct 14 13:26:28.859375 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-status-remote-available-controller ok Oct 14 13:26:28.859375 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-registration-controller ok Oct 14 13:26:28.859375 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-wait-for-first-sync ok Oct 14 13:26:28.859375 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-discovery-controller ok Oct 14 13:26:28.859375 master-2 kubenswrapper[4762]: [+]poststarthook/kube-apiserver-autoregistration ok Oct 14 13:26:28.859375 master-2 kubenswrapper[4762]: [+]autoregister-completion ok Oct 14 13:26:28.859375 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-openapi-controller ok Oct 14 13:26:28.859375 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-openapiv3-controller ok Oct 14 13:26:28.859375 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:26:28.859375 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:26:28.862719 master-2 kubenswrapper[4762]: I1014 13:26:28.859396 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-guard-master-2" podUID="a1d6199c-769e-4363-8439-75d433c50528" containerName="guard" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:26:33.859805 master-2 kubenswrapper[4762]: I1014 13:26:33.859716 4762 patch_prober.go:28] interesting pod/kube-apiserver-guard-master-2 container/guard namespace/openshift-kube-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:26:33.859805 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:26:33.859805 master-2 kubenswrapper[4762]: [+]api-openshift-apiserver-available ok Oct 14 13:26:33.859805 master-2 kubenswrapper[4762]: [+]api-openshift-oauth-apiserver-available ok Oct 14 13:26:33.859805 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:26:33.859805 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:26:33.859805 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-api-request-count-filter ok Oct 14 13:26:33.859805 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startkubeinformers ok Oct 14 13:26:33.859805 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Oct 14 13:26:33.859805 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Oct 14 13:26:33.859805 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:26:33.859805 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:26:33.859805 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-consumer ok Oct 14 13:26:33.859805 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-filter ok Oct 14 13:26:33.859805 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:26:33.859805 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiextensions-informers ok Oct 14 13:26:33.859805 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiextensions-controllers ok Oct 14 13:26:33.859805 master-2 kubenswrapper[4762]: [+]poststarthook/crd-informer-synced ok Oct 14 13:26:33.859805 master-2 kubenswrapper[4762]: [+]poststarthook/start-system-namespaces-controller ok Oct 14 13:26:33.859805 master-2 kubenswrapper[4762]: [+]poststarthook/start-cluster-authentication-info-controller ok Oct 14 13:26:33.859805 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Oct 14 13:26:33.859805 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Oct 14 13:26:33.859805 master-2 kubenswrapper[4762]: [+]poststarthook/start-legacy-token-tracking-controller ok Oct 14 13:26:33.859805 master-2 kubenswrapper[4762]: [+]poststarthook/start-service-ip-repair-controllers ok Oct 14 13:26:33.859805 master-2 kubenswrapper[4762]: [+]poststarthook/rbac/bootstrap-roles ok Oct 14 13:26:33.859805 master-2 kubenswrapper[4762]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Oct 14 13:26:33.859805 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-producer ok Oct 14 13:26:33.859805 master-2 kubenswrapper[4762]: [+]poststarthook/bootstrap-controller ok Oct 14 13:26:33.859805 master-2 kubenswrapper[4762]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Oct 14 13:26:33.859805 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-aggregator-informers ok Oct 14 13:26:33.859805 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-status-local-available-controller ok Oct 14 
13:26:33.859805 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-status-remote-available-controller ok Oct 14 13:26:33.859805 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-registration-controller ok Oct 14 13:26:33.859805 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-wait-for-first-sync ok Oct 14 13:26:33.859805 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-discovery-controller ok Oct 14 13:26:33.859805 master-2 kubenswrapper[4762]: [+]poststarthook/kube-apiserver-autoregistration ok Oct 14 13:26:33.859805 master-2 kubenswrapper[4762]: [+]autoregister-completion ok Oct 14 13:26:33.859805 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-openapi-controller ok Oct 14 13:26:33.859805 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-openapiv3-controller ok Oct 14 13:26:33.859805 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:26:33.859805 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:26:33.863261 master-2 kubenswrapper[4762]: I1014 13:26:33.859854 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-guard-master-2" podUID="a1d6199c-769e-4363-8439-75d433c50528" containerName="guard" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:26:38.856968 master-2 kubenswrapper[4762]: I1014 13:26:38.856916 4762 patch_prober.go:28] interesting pod/kube-apiserver-guard-master-2 container/guard namespace/openshift-kube-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:26:38.856968 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:26:38.856968 master-2 kubenswrapper[4762]: [+]api-openshift-apiserver-available ok Oct 14 13:26:38.856968 master-2 kubenswrapper[4762]: [+]api-openshift-oauth-apiserver-available ok Oct 14 13:26:38.856968 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:26:38.856968 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:26:38.856968 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-api-request-count-filter ok Oct 14 13:26:38.856968 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startkubeinformers ok Oct 14 13:26:38.856968 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Oct 14 13:26:38.856968 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Oct 14 13:26:38.856968 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:26:38.856968 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:26:38.856968 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-consumer ok Oct 14 13:26:38.856968 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-filter ok Oct 14 13:26:38.856968 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:26:38.856968 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiextensions-informers ok Oct 14 13:26:38.856968 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiextensions-controllers ok Oct 14 13:26:38.856968 master-2 kubenswrapper[4762]: [+]poststarthook/crd-informer-synced ok Oct 14 13:26:38.856968 master-2 kubenswrapper[4762]: [+]poststarthook/start-system-namespaces-controller ok Oct 14 13:26:38.856968 master-2 kubenswrapper[4762]: 
[+]poststarthook/start-cluster-authentication-info-controller ok Oct 14 13:26:38.856968 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Oct 14 13:26:38.856968 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Oct 14 13:26:38.856968 master-2 kubenswrapper[4762]: [+]poststarthook/start-legacy-token-tracking-controller ok Oct 14 13:26:38.856968 master-2 kubenswrapper[4762]: [+]poststarthook/start-service-ip-repair-controllers ok Oct 14 13:26:38.856968 master-2 kubenswrapper[4762]: [+]poststarthook/rbac/bootstrap-roles ok Oct 14 13:26:38.856968 master-2 kubenswrapper[4762]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Oct 14 13:26:38.856968 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-producer ok Oct 14 13:26:38.856968 master-2 kubenswrapper[4762]: [+]poststarthook/bootstrap-controller ok Oct 14 13:26:38.856968 master-2 kubenswrapper[4762]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Oct 14 13:26:38.856968 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-aggregator-informers ok Oct 14 13:26:38.856968 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-status-local-available-controller ok Oct 14 13:26:38.856968 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-status-remote-available-controller ok Oct 14 13:26:38.856968 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-registration-controller ok Oct 14 13:26:38.856968 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-wait-for-first-sync ok Oct 14 13:26:38.856968 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-discovery-controller ok Oct 14 13:26:38.856968 master-2 kubenswrapper[4762]: [+]poststarthook/kube-apiserver-autoregistration ok Oct 14 13:26:38.856968 master-2 kubenswrapper[4762]: [+]autoregister-completion ok Oct 14 13:26:38.856968 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-openapi-controller ok Oct 14 13:26:38.856968 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-openapiv3-controller ok Oct 14 13:26:38.856968 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:26:38.856968 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:26:38.859131 master-2 kubenswrapper[4762]: I1014 13:26:38.856997 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-guard-master-2" podUID="a1d6199c-769e-4363-8439-75d433c50528" containerName="guard" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:26:43.608104 master-2 kubenswrapper[4762]: I1014 13:26:43.607400 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-oauth-apiserver/apiserver-84c8b8d745-p4css"] Oct 14 13:26:43.609262 master-2 kubenswrapper[4762]: I1014 13:26:43.608465 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-oauth-apiserver/apiserver-84c8b8d745-p4css" podUID="df155e80-7f1a-4919-b7a9-5df5cbb92c27" containerName="oauth-apiserver" containerID="cri-o://ba902ca859f0fcfe992aebe277dcc7b1ce0c63eee0caf6006314ea48d7bec6a3" gracePeriod=120 Oct 14 13:26:43.856679 master-2 kubenswrapper[4762]: I1014 13:26:43.856615 4762 patch_prober.go:28] interesting pod/kube-apiserver-guard-master-2 container/guard namespace/openshift-kube-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:26:43.856679 master-2 kubenswrapper[4762]: [+]log ok 
Oct 14 13:26:43.856679 master-2 kubenswrapper[4762]: [+]api-openshift-apiserver-available ok Oct 14 13:26:43.856679 master-2 kubenswrapper[4762]: [+]api-openshift-oauth-apiserver-available ok Oct 14 13:26:43.856679 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:26:43.856679 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:26:43.856679 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-api-request-count-filter ok Oct 14 13:26:43.856679 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startkubeinformers ok Oct 14 13:26:43.856679 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Oct 14 13:26:43.856679 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Oct 14 13:26:43.856679 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:26:43.856679 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:26:43.856679 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-consumer ok Oct 14 13:26:43.856679 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-filter ok Oct 14 13:26:43.856679 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:26:43.856679 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiextensions-informers ok Oct 14 13:26:43.856679 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiextensions-controllers ok Oct 14 13:26:43.856679 master-2 kubenswrapper[4762]: [+]poststarthook/crd-informer-synced ok Oct 14 13:26:43.856679 master-2 kubenswrapper[4762]: [+]poststarthook/start-system-namespaces-controller ok Oct 14 13:26:43.856679 master-2 kubenswrapper[4762]: [+]poststarthook/start-cluster-authentication-info-controller ok Oct 14 13:26:43.856679 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Oct 14 13:26:43.856679 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Oct 14 13:26:43.856679 master-2 kubenswrapper[4762]: [+]poststarthook/start-legacy-token-tracking-controller ok Oct 14 13:26:43.856679 master-2 kubenswrapper[4762]: [+]poststarthook/start-service-ip-repair-controllers ok Oct 14 13:26:43.856679 master-2 kubenswrapper[4762]: [+]poststarthook/rbac/bootstrap-roles ok Oct 14 13:26:43.856679 master-2 kubenswrapper[4762]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Oct 14 13:26:43.856679 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-producer ok Oct 14 13:26:43.856679 master-2 kubenswrapper[4762]: [+]poststarthook/bootstrap-controller ok Oct 14 13:26:43.856679 master-2 kubenswrapper[4762]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Oct 14 13:26:43.856679 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-aggregator-informers ok Oct 14 13:26:43.856679 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-status-local-available-controller ok Oct 14 13:26:43.856679 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-status-remote-available-controller ok Oct 14 13:26:43.856679 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-registration-controller ok Oct 14 13:26:43.856679 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-wait-for-first-sync ok Oct 14 13:26:43.856679 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-discovery-controller ok Oct 14 
13:26:43.856679 master-2 kubenswrapper[4762]: [+]poststarthook/kube-apiserver-autoregistration ok Oct 14 13:26:43.856679 master-2 kubenswrapper[4762]: [+]autoregister-completion ok Oct 14 13:26:43.856679 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-openapi-controller ok Oct 14 13:26:43.856679 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-openapiv3-controller ok Oct 14 13:26:43.856679 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:26:43.856679 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:26:43.856679 master-2 kubenswrapper[4762]: I1014 13:26:43.856671 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-guard-master-2" podUID="a1d6199c-769e-4363-8439-75d433c50528" containerName="guard" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:26:46.615113 master-2 kubenswrapper[4762]: I1014 13:26:46.614978 4762 patch_prober.go:28] interesting pod/apiserver-84c8b8d745-p4css container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:26:46.615113 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:26:46.615113 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:26:46.615113 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:26:46.615113 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:26:46.615113 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:26:46.615113 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:26:46.615113 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:26:46.615113 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:26:46.615113 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartUserInformer ok Oct 14 13:26:46.615113 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartOAuthInformer ok Oct 14 13:26:46.615113 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Oct 14 13:26:46.615113 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:26:46.615113 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:26:46.615113 master-2 kubenswrapper[4762]: I1014 13:26:46.615102 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-84c8b8d745-p4css" podUID="df155e80-7f1a-4919-b7a9-5df5cbb92c27" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:26:48.856960 master-2 kubenswrapper[4762]: I1014 13:26:48.856874 4762 patch_prober.go:28] interesting pod/kube-apiserver-guard-master-2 container/guard namespace/openshift-kube-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:26:48.856960 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:26:48.856960 master-2 kubenswrapper[4762]: [+]api-openshift-apiserver-available ok Oct 14 13:26:48.856960 master-2 kubenswrapper[4762]: [+]api-openshift-oauth-apiserver-available ok Oct 14 13:26:48.856960 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:26:48.856960 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:26:48.856960 master-2 kubenswrapper[4762]: 
[+]poststarthook/openshift.io-api-request-count-filter ok Oct 14 13:26:48.856960 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startkubeinformers ok Oct 14 13:26:48.856960 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Oct 14 13:26:48.856960 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Oct 14 13:26:48.856960 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:26:48.856960 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:26:48.856960 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-consumer ok Oct 14 13:26:48.856960 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-filter ok Oct 14 13:26:48.856960 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:26:48.856960 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiextensions-informers ok Oct 14 13:26:48.856960 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiextensions-controllers ok Oct 14 13:26:48.856960 master-2 kubenswrapper[4762]: [+]poststarthook/crd-informer-synced ok Oct 14 13:26:48.856960 master-2 kubenswrapper[4762]: [+]poststarthook/start-system-namespaces-controller ok Oct 14 13:26:48.856960 master-2 kubenswrapper[4762]: [+]poststarthook/start-cluster-authentication-info-controller ok Oct 14 13:26:48.856960 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Oct 14 13:26:48.856960 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Oct 14 13:26:48.856960 master-2 kubenswrapper[4762]: [+]poststarthook/start-legacy-token-tracking-controller ok Oct 14 13:26:48.856960 master-2 kubenswrapper[4762]: [+]poststarthook/start-service-ip-repair-controllers ok Oct 14 13:26:48.856960 master-2 kubenswrapper[4762]: [+]poststarthook/rbac/bootstrap-roles ok Oct 14 13:26:48.856960 master-2 kubenswrapper[4762]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Oct 14 13:26:48.856960 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-producer ok Oct 14 13:26:48.856960 master-2 kubenswrapper[4762]: [+]poststarthook/bootstrap-controller ok Oct 14 13:26:48.856960 master-2 kubenswrapper[4762]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Oct 14 13:26:48.856960 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-aggregator-informers ok Oct 14 13:26:48.856960 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-status-local-available-controller ok Oct 14 13:26:48.856960 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-status-remote-available-controller ok Oct 14 13:26:48.856960 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-registration-controller ok Oct 14 13:26:48.856960 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-wait-for-first-sync ok Oct 14 13:26:48.856960 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-discovery-controller ok Oct 14 13:26:48.856960 master-2 kubenswrapper[4762]: [+]poststarthook/kube-apiserver-autoregistration ok Oct 14 13:26:48.856960 master-2 kubenswrapper[4762]: [+]autoregister-completion ok Oct 14 13:26:48.856960 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-openapi-controller ok Oct 14 13:26:48.856960 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-openapiv3-controller ok Oct 14 13:26:48.856960 master-2 
kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:26:48.856960 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:26:48.856960 master-2 kubenswrapper[4762]: I1014 13:26:48.856959 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-guard-master-2" podUID="a1d6199c-769e-4363-8439-75d433c50528" containerName="guard" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:26:51.615459 master-2 kubenswrapper[4762]: I1014 13:26:51.615385 4762 patch_prober.go:28] interesting pod/apiserver-84c8b8d745-p4css container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:26:51.615459 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:26:51.615459 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:26:51.615459 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:26:51.615459 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:26:51.615459 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:26:51.615459 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:26:51.615459 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:26:51.615459 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:26:51.615459 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartUserInformer ok Oct 14 13:26:51.615459 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartOAuthInformer ok Oct 14 13:26:51.615459 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Oct 14 13:26:51.615459 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:26:51.615459 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:26:51.616591 master-2 kubenswrapper[4762]: I1014 13:26:51.615475 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-84c8b8d745-p4css" podUID="df155e80-7f1a-4919-b7a9-5df5cbb92c27" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:26:53.857107 master-2 kubenswrapper[4762]: I1014 13:26:53.857046 4762 patch_prober.go:28] interesting pod/kube-apiserver-guard-master-2 container/guard namespace/openshift-kube-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:26:53.857107 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:26:53.857107 master-2 kubenswrapper[4762]: [+]api-openshift-apiserver-available ok Oct 14 13:26:53.857107 master-2 kubenswrapper[4762]: [+]api-openshift-oauth-apiserver-available ok Oct 14 13:26:53.857107 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:26:53.857107 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:26:53.857107 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-api-request-count-filter ok Oct 14 13:26:53.857107 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startkubeinformers ok Oct 14 13:26:53.857107 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Oct 14 13:26:53.857107 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Oct 14 13:26:53.857107 master-2 kubenswrapper[4762]: 
[+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:26:53.857107 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:26:53.857107 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-consumer ok Oct 14 13:26:53.857107 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-filter ok Oct 14 13:26:53.857107 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:26:53.857107 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiextensions-informers ok Oct 14 13:26:53.857107 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiextensions-controllers ok Oct 14 13:26:53.857107 master-2 kubenswrapper[4762]: [+]poststarthook/crd-informer-synced ok Oct 14 13:26:53.857107 master-2 kubenswrapper[4762]: [+]poststarthook/start-system-namespaces-controller ok Oct 14 13:26:53.857107 master-2 kubenswrapper[4762]: [+]poststarthook/start-cluster-authentication-info-controller ok Oct 14 13:26:53.857107 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Oct 14 13:26:53.857107 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Oct 14 13:26:53.857107 master-2 kubenswrapper[4762]: [+]poststarthook/start-legacy-token-tracking-controller ok Oct 14 13:26:53.857107 master-2 kubenswrapper[4762]: [+]poststarthook/start-service-ip-repair-controllers ok Oct 14 13:26:53.857107 master-2 kubenswrapper[4762]: [+]poststarthook/rbac/bootstrap-roles ok Oct 14 13:26:53.857107 master-2 kubenswrapper[4762]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Oct 14 13:26:53.857107 master-2 kubenswrapper[4762]: [+]poststarthook/priority-and-fairness-config-producer ok Oct 14 13:26:53.857107 master-2 kubenswrapper[4762]: [+]poststarthook/bootstrap-controller ok Oct 14 13:26:53.857107 master-2 kubenswrapper[4762]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Oct 14 13:26:53.857107 master-2 kubenswrapper[4762]: [+]poststarthook/start-kube-aggregator-informers ok Oct 14 13:26:53.857107 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-status-local-available-controller ok Oct 14 13:26:53.857107 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-status-remote-available-controller ok Oct 14 13:26:53.857107 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-registration-controller ok Oct 14 13:26:53.857107 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-wait-for-first-sync ok Oct 14 13:26:53.857107 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-discovery-controller ok Oct 14 13:26:53.857107 master-2 kubenswrapper[4762]: [+]poststarthook/kube-apiserver-autoregistration ok Oct 14 13:26:53.857107 master-2 kubenswrapper[4762]: [+]autoregister-completion ok Oct 14 13:26:53.857107 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-openapi-controller ok Oct 14 13:26:53.857107 master-2 kubenswrapper[4762]: [+]poststarthook/apiservice-openapiv3-controller ok Oct 14 13:26:53.857107 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:26:53.857107 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:26:53.858593 master-2 kubenswrapper[4762]: I1014 13:26:53.857114 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-guard-master-2" podUID="a1d6199c-769e-4363-8439-75d433c50528" containerName="guard" probeResult="failure" output="HTTP probe failed with statuscode: 500" 
Oct 14 13:26:54.708038 master-2 kubenswrapper[4762]: I1014 13:26:54.707929 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-apiserver/apiserver-5f68d4c887-s2fvb"] Oct 14 13:26:54.708558 master-2 kubenswrapper[4762]: I1014 13:26:54.708400 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" podUID="66ea399c-a47a-41dd-91c2-cceaa9eca5bc" containerName="openshift-apiserver" containerID="cri-o://e7c105b3a9512339a854f05952e24e3ae0a7fe244eabe4dbb4a9cda51aea016d" gracePeriod=120 Oct 14 13:26:54.708558 master-2 kubenswrapper[4762]: I1014 13:26:54.708489 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" podUID="66ea399c-a47a-41dd-91c2-cceaa9eca5bc" containerName="openshift-apiserver-check-endpoints" containerID="cri-o://d89425db738bd6749394bbca910f6e699c0f8f09d446d167a2fb2d243190660a" gracePeriod=120 Oct 14 13:26:55.530217 master-2 kubenswrapper[4762]: I1014 13:26:55.530114 4762 generic.go:334] "Generic (PLEG): container finished" podID="66ea399c-a47a-41dd-91c2-cceaa9eca5bc" containerID="d89425db738bd6749394bbca910f6e699c0f8f09d446d167a2fb2d243190660a" exitCode=0 Oct 14 13:26:55.530217 master-2 kubenswrapper[4762]: I1014 13:26:55.530209 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" event={"ID":"66ea399c-a47a-41dd-91c2-cceaa9eca5bc","Type":"ContainerDied","Data":"d89425db738bd6749394bbca910f6e699c0f8f09d446d167a2fb2d243190660a"} Oct 14 13:26:56.616699 master-2 kubenswrapper[4762]: I1014 13:26:56.616600 4762 patch_prober.go:28] interesting pod/apiserver-84c8b8d745-p4css container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:26:56.616699 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:26:56.616699 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:26:56.616699 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:26:56.616699 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:26:56.616699 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:26:56.616699 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:26:56.616699 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:26:56.616699 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:26:56.616699 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartUserInformer ok Oct 14 13:26:56.616699 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartOAuthInformer ok Oct 14 13:26:56.616699 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Oct 14 13:26:56.616699 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:26:56.616699 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:26:56.618407 master-2 kubenswrapper[4762]: I1014 13:26:56.616708 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-84c8b8d745-p4css" podUID="df155e80-7f1a-4919-b7a9-5df5cbb92c27" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:26:56.618407 master-2 kubenswrapper[4762]: I1014 13:26:56.616836 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-oauth-apiserver/apiserver-84c8b8d745-p4css" Oct 14 13:26:57.372519 master-2 kubenswrapper[4762]: I1014 13:26:57.372430 4762 patch_prober.go:28] interesting pod/apiserver-5f68d4c887-s2fvb container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:26:57.372519 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:26:57.372519 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:26:57.372519 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:26:57.372519 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:26:57.372519 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:26:57.372519 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:26:57.372519 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:26:57.372519 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:26:57.372519 master-2 kubenswrapper[4762]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 14 13:26:57.372519 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok Oct 14 13:26:57.372519 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Oct 14 13:26:57.372519 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectcache ok Oct 14 13:26:57.372519 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 14 13:26:57.372519 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startinformers ok Oct 14 13:26:57.372519 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 14 13:26:57.372519 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:26:57.372519 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:26:57.372519 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:26:57.373775 master-2 kubenswrapper[4762]: I1014 13:26:57.372528 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" podUID="66ea399c-a47a-41dd-91c2-cceaa9eca5bc" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:26:58.524202 master-2 kubenswrapper[4762]: I1014 13:26:58.524006 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/installer-10-master-2"] Oct 14 13:26:58.525192 master-2 kubenswrapper[4762]: E1014 13:26:58.524439 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c8aa71b-6051-407f-9cfa-b25f931ed568" containerName="installer" Oct 14 13:26:58.525192 master-2 kubenswrapper[4762]: I1014 13:26:58.524476 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c8aa71b-6051-407f-9cfa-b25f931ed568" containerName="installer" Oct 14 13:26:58.525192 master-2 kubenswrapper[4762]: I1014 13:26:58.524599 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c8aa71b-6051-407f-9cfa-b25f931ed568" containerName="installer" Oct 14 13:26:58.525192 master-2 kubenswrapper[4762]: I1014 13:26:58.525188 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-10-master-2" Oct 14 13:26:58.533433 master-2 kubenswrapper[4762]: I1014 13:26:58.533397 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd"/"installer-sa-dockercfg-xbs2c" Oct 14 13:26:58.550814 master-2 kubenswrapper[4762]: I1014 13:26:58.550756 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-10-master-2"] Oct 14 13:26:58.567536 master-2 kubenswrapper[4762]: I1014 13:26:58.567472 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e8558a2f-5ea7-42a3-b00d-2ffbb553f642-var-lock\") pod \"installer-10-master-2\" (UID: \"e8558a2f-5ea7-42a3-b00d-2ffbb553f642\") " pod="openshift-etcd/installer-10-master-2" Oct 14 13:26:58.567865 master-2 kubenswrapper[4762]: I1014 13:26:58.567579 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e8558a2f-5ea7-42a3-b00d-2ffbb553f642-kube-api-access\") pod \"installer-10-master-2\" (UID: \"e8558a2f-5ea7-42a3-b00d-2ffbb553f642\") " pod="openshift-etcd/installer-10-master-2" Oct 14 13:26:58.567865 master-2 kubenswrapper[4762]: I1014 13:26:58.567677 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e8558a2f-5ea7-42a3-b00d-2ffbb553f642-kubelet-dir\") pod \"installer-10-master-2\" (UID: \"e8558a2f-5ea7-42a3-b00d-2ffbb553f642\") " pod="openshift-etcd/installer-10-master-2" Oct 14 13:26:58.669831 master-2 kubenswrapper[4762]: I1014 13:26:58.669337 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e8558a2f-5ea7-42a3-b00d-2ffbb553f642-var-lock\") pod \"installer-10-master-2\" (UID: \"e8558a2f-5ea7-42a3-b00d-2ffbb553f642\") " pod="openshift-etcd/installer-10-master-2" Oct 14 13:26:58.669831 master-2 kubenswrapper[4762]: I1014 13:26:58.669416 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e8558a2f-5ea7-42a3-b00d-2ffbb553f642-kube-api-access\") pod \"installer-10-master-2\" (UID: \"e8558a2f-5ea7-42a3-b00d-2ffbb553f642\") " pod="openshift-etcd/installer-10-master-2" Oct 14 13:26:58.669831 master-2 kubenswrapper[4762]: I1014 13:26:58.669477 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e8558a2f-5ea7-42a3-b00d-2ffbb553f642-kubelet-dir\") pod \"installer-10-master-2\" (UID: \"e8558a2f-5ea7-42a3-b00d-2ffbb553f642\") " pod="openshift-etcd/installer-10-master-2" Oct 14 13:26:58.669831 master-2 kubenswrapper[4762]: I1014 13:26:58.669537 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e8558a2f-5ea7-42a3-b00d-2ffbb553f642-var-lock\") pod \"installer-10-master-2\" (UID: \"e8558a2f-5ea7-42a3-b00d-2ffbb553f642\") " pod="openshift-etcd/installer-10-master-2" Oct 14 13:26:58.669831 master-2 kubenswrapper[4762]: I1014 13:26:58.669604 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e8558a2f-5ea7-42a3-b00d-2ffbb553f642-kubelet-dir\") pod \"installer-10-master-2\" (UID: \"e8558a2f-5ea7-42a3-b00d-2ffbb553f642\") " 
pod="openshift-etcd/installer-10-master-2" Oct 14 13:26:58.695179 master-2 kubenswrapper[4762]: I1014 13:26:58.695020 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e8558a2f-5ea7-42a3-b00d-2ffbb553f642-kube-api-access\") pod \"installer-10-master-2\" (UID: \"e8558a2f-5ea7-42a3-b00d-2ffbb553f642\") " pod="openshift-etcd/installer-10-master-2" Oct 14 13:26:58.853900 master-2 kubenswrapper[4762]: I1014 13:26:58.853571 4762 patch_prober.go:28] interesting pod/kube-apiserver-guard-master-2 container/guard namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.34.12:6443/readyz\": dial tcp 192.168.34.12:6443: connect: connection refused" start-of-body= Oct 14 13:26:58.853900 master-2 kubenswrapper[4762]: I1014 13:26:58.853771 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-guard-master-2" podUID="a1d6199c-769e-4363-8439-75d433c50528" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.12:6443/readyz\": dial tcp 192.168.34.12:6443: connect: connection refused" Oct 14 13:26:58.896316 master-2 kubenswrapper[4762]: I1014 13:26:58.895104 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-10-master-2" Oct 14 13:26:59.418749 master-2 kubenswrapper[4762]: I1014 13:26:59.418676 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-10-master-2"] Oct 14 13:26:59.427623 master-2 kubenswrapper[4762]: W1014 13:26:59.427282 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode8558a2f_5ea7_42a3_b00d_2ffbb553f642.slice/crio-8607dcf920a7dcf04e80e33639f54fd94e16bf136914cc3931acf7abd5f9a0a9 WatchSource:0}: Error finding container 8607dcf920a7dcf04e80e33639f54fd94e16bf136914cc3931acf7abd5f9a0a9: Status 404 returned error can't find the container with id 8607dcf920a7dcf04e80e33639f54fd94e16bf136914cc3931acf7abd5f9a0a9 Oct 14 13:26:59.568072 master-2 kubenswrapper[4762]: I1014 13:26:59.567968 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-10-master-2" event={"ID":"e8558a2f-5ea7-42a3-b00d-2ffbb553f642","Type":"ContainerStarted","Data":"8607dcf920a7dcf04e80e33639f54fd94e16bf136914cc3931acf7abd5f9a0a9"} Oct 14 13:27:00.279771 master-2 kubenswrapper[4762]: I1014 13:27:00.279706 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-2_9041570beb5002e8da158e70e12f0c16/kube-apiserver-cert-syncer/0.log" Oct 14 13:27:00.280608 master-2 kubenswrapper[4762]: I1014 13:27:00.280569 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-2" Oct 14 13:27:00.286981 master-2 kubenswrapper[4762]: I1014 13:27:00.286927 4762 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-2" oldPodUID="9041570beb5002e8da158e70e12f0c16" podUID="978811670a28b21932e323b181b31435" Oct 14 13:27:00.391063 master-2 kubenswrapper[4762]: I1014 13:27:00.390967 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/9041570beb5002e8da158e70e12f0c16-cert-dir\") pod \"9041570beb5002e8da158e70e12f0c16\" (UID: \"9041570beb5002e8da158e70e12f0c16\") " Oct 14 13:27:00.391063 master-2 kubenswrapper[4762]: I1014 13:27:00.391080 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9041570beb5002e8da158e70e12f0c16-audit-dir\") pod \"9041570beb5002e8da158e70e12f0c16\" (UID: \"9041570beb5002e8da158e70e12f0c16\") " Oct 14 13:27:00.391503 master-2 kubenswrapper[4762]: I1014 13:27:00.391125 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/9041570beb5002e8da158e70e12f0c16-resource-dir\") pod \"9041570beb5002e8da158e70e12f0c16\" (UID: \"9041570beb5002e8da158e70e12f0c16\") " Oct 14 13:27:00.391503 master-2 kubenswrapper[4762]: I1014 13:27:00.391138 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9041570beb5002e8da158e70e12f0c16-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "9041570beb5002e8da158e70e12f0c16" (UID: "9041570beb5002e8da158e70e12f0c16"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:27:00.391503 master-2 kubenswrapper[4762]: I1014 13:27:00.391290 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9041570beb5002e8da158e70e12f0c16-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "9041570beb5002e8da158e70e12f0c16" (UID: "9041570beb5002e8da158e70e12f0c16"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:27:00.391503 master-2 kubenswrapper[4762]: I1014 13:27:00.391315 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9041570beb5002e8da158e70e12f0c16-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "9041570beb5002e8da158e70e12f0c16" (UID: "9041570beb5002e8da158e70e12f0c16"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:27:00.392113 master-2 kubenswrapper[4762]: I1014 13:27:00.392028 4762 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9041570beb5002e8da158e70e12f0c16-audit-dir\") on node \"master-2\" DevicePath \"\"" Oct 14 13:27:00.392113 master-2 kubenswrapper[4762]: I1014 13:27:00.392084 4762 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/9041570beb5002e8da158e70e12f0c16-resource-dir\") on node \"master-2\" DevicePath \"\"" Oct 14 13:27:00.392113 master-2 kubenswrapper[4762]: I1014 13:27:00.392097 4762 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/9041570beb5002e8da158e70e12f0c16-cert-dir\") on node \"master-2\" DevicePath \"\"" Oct 14 13:27:00.580875 master-2 kubenswrapper[4762]: I1014 13:27:00.580760 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-10-master-2" event={"ID":"e8558a2f-5ea7-42a3-b00d-2ffbb553f642","Type":"ContainerStarted","Data":"090eaf196ae20f28142493f42c17816526e3b37829abd5890ab50883ad601d6b"} Oct 14 13:27:00.587885 master-2 kubenswrapper[4762]: I1014 13:27:00.587834 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-2_9041570beb5002e8da158e70e12f0c16/kube-apiserver-cert-syncer/0.log" Oct 14 13:27:00.589521 master-2 kubenswrapper[4762]: I1014 13:27:00.589453 4762 generic.go:334] "Generic (PLEG): container finished" podID="9041570beb5002e8da158e70e12f0c16" containerID="69e03e306b4fc0599756da1bcaf608658ab22550cdb495df2099fa070d1f1e5a" exitCode=0 Oct 14 13:27:00.589656 master-2 kubenswrapper[4762]: I1014 13:27:00.589542 4762 scope.go:117] "RemoveContainer" containerID="ed0652c68175fc794e2ffa2234dbeed42e01255f6cb9d2ddbb282664b4f7a5ee" Oct 14 13:27:00.589656 master-2 kubenswrapper[4762]: I1014 13:27:00.589620 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-2" Oct 14 13:27:00.615448 master-2 kubenswrapper[4762]: I1014 13:27:00.615230 4762 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-2" oldPodUID="9041570beb5002e8da158e70e12f0c16" podUID="978811670a28b21932e323b181b31435" Oct 14 13:27:00.615578 master-2 kubenswrapper[4762]: I1014 13:27:00.615515 4762 scope.go:117] "RemoveContainer" containerID="39786ac6e5f768ebcc690a6e3de90249c4787a641f1d1315d8f81f3458a6927d" Oct 14 13:27:00.639439 master-2 kubenswrapper[4762]: I1014 13:27:00.635556 4762 scope.go:117] "RemoveContainer" containerID="582963d0d550051a59e2bcba881a40e767b0e4ef68aed930b180274688a29410" Oct 14 13:27:00.655598 master-2 kubenswrapper[4762]: I1014 13:27:00.655537 4762 scope.go:117] "RemoveContainer" containerID="ce2586126d7884ea6910fe1b15e6aa80b4bc5e5c292cc36d1df8a8573ef2b88a" Oct 14 13:27:00.671049 master-2 kubenswrapper[4762]: I1014 13:27:00.671008 4762 scope.go:117] "RemoveContainer" containerID="69e03e306b4fc0599756da1bcaf608658ab22550cdb495df2099fa070d1f1e5a" Oct 14 13:27:00.686739 master-2 kubenswrapper[4762]: I1014 13:27:00.686703 4762 scope.go:117] "RemoveContainer" containerID="eed8af2f07f9d3b49a147d58436d080d78e0511d1542ac74b4feff6605710d6d" Oct 14 13:27:00.712260 master-2 kubenswrapper[4762]: I1014 13:27:00.712219 4762 scope.go:117] "RemoveContainer" containerID="ed0652c68175fc794e2ffa2234dbeed42e01255f6cb9d2ddbb282664b4f7a5ee" Oct 14 13:27:00.712843 master-2 kubenswrapper[4762]: E1014 13:27:00.712783 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed0652c68175fc794e2ffa2234dbeed42e01255f6cb9d2ddbb282664b4f7a5ee\": container with ID starting with ed0652c68175fc794e2ffa2234dbeed42e01255f6cb9d2ddbb282664b4f7a5ee not found: ID does not exist" containerID="ed0652c68175fc794e2ffa2234dbeed42e01255f6cb9d2ddbb282664b4f7a5ee" Oct 14 13:27:00.712986 master-2 kubenswrapper[4762]: I1014 13:27:00.712850 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed0652c68175fc794e2ffa2234dbeed42e01255f6cb9d2ddbb282664b4f7a5ee"} err="failed to get container status \"ed0652c68175fc794e2ffa2234dbeed42e01255f6cb9d2ddbb282664b4f7a5ee\": rpc error: code = NotFound desc = could not find container \"ed0652c68175fc794e2ffa2234dbeed42e01255f6cb9d2ddbb282664b4f7a5ee\": container with ID starting with ed0652c68175fc794e2ffa2234dbeed42e01255f6cb9d2ddbb282664b4f7a5ee not found: ID does not exist" Oct 14 13:27:00.712986 master-2 kubenswrapper[4762]: I1014 13:27:00.712892 4762 scope.go:117] "RemoveContainer" containerID="39786ac6e5f768ebcc690a6e3de90249c4787a641f1d1315d8f81f3458a6927d" Oct 14 13:27:00.713650 master-2 kubenswrapper[4762]: E1014 13:27:00.713502 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39786ac6e5f768ebcc690a6e3de90249c4787a641f1d1315d8f81f3458a6927d\": container with ID starting with 39786ac6e5f768ebcc690a6e3de90249c4787a641f1d1315d8f81f3458a6927d not found: ID does not exist" containerID="39786ac6e5f768ebcc690a6e3de90249c4787a641f1d1315d8f81f3458a6927d" Oct 14 13:27:00.713711 master-2 kubenswrapper[4762]: I1014 13:27:00.713666 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39786ac6e5f768ebcc690a6e3de90249c4787a641f1d1315d8f81f3458a6927d"} err="failed to get container 
status \"39786ac6e5f768ebcc690a6e3de90249c4787a641f1d1315d8f81f3458a6927d\": rpc error: code = NotFound desc = could not find container \"39786ac6e5f768ebcc690a6e3de90249c4787a641f1d1315d8f81f3458a6927d\": container with ID starting with 39786ac6e5f768ebcc690a6e3de90249c4787a641f1d1315d8f81f3458a6927d not found: ID does not exist" Oct 14 13:27:00.713750 master-2 kubenswrapper[4762]: I1014 13:27:00.713716 4762 scope.go:117] "RemoveContainer" containerID="582963d0d550051a59e2bcba881a40e767b0e4ef68aed930b180274688a29410" Oct 14 13:27:00.714198 master-2 kubenswrapper[4762]: E1014 13:27:00.714138 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"582963d0d550051a59e2bcba881a40e767b0e4ef68aed930b180274688a29410\": container with ID starting with 582963d0d550051a59e2bcba881a40e767b0e4ef68aed930b180274688a29410 not found: ID does not exist" containerID="582963d0d550051a59e2bcba881a40e767b0e4ef68aed930b180274688a29410" Oct 14 13:27:00.714269 master-2 kubenswrapper[4762]: I1014 13:27:00.714210 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"582963d0d550051a59e2bcba881a40e767b0e4ef68aed930b180274688a29410"} err="failed to get container status \"582963d0d550051a59e2bcba881a40e767b0e4ef68aed930b180274688a29410\": rpc error: code = NotFound desc = could not find container \"582963d0d550051a59e2bcba881a40e767b0e4ef68aed930b180274688a29410\": container with ID starting with 582963d0d550051a59e2bcba881a40e767b0e4ef68aed930b180274688a29410 not found: ID does not exist" Oct 14 13:27:00.714269 master-2 kubenswrapper[4762]: I1014 13:27:00.714243 4762 scope.go:117] "RemoveContainer" containerID="ce2586126d7884ea6910fe1b15e6aa80b4bc5e5c292cc36d1df8a8573ef2b88a" Oct 14 13:27:00.714624 master-2 kubenswrapper[4762]: E1014 13:27:00.714597 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce2586126d7884ea6910fe1b15e6aa80b4bc5e5c292cc36d1df8a8573ef2b88a\": container with ID starting with ce2586126d7884ea6910fe1b15e6aa80b4bc5e5c292cc36d1df8a8573ef2b88a not found: ID does not exist" containerID="ce2586126d7884ea6910fe1b15e6aa80b4bc5e5c292cc36d1df8a8573ef2b88a" Oct 14 13:27:00.714686 master-2 kubenswrapper[4762]: I1014 13:27:00.714625 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce2586126d7884ea6910fe1b15e6aa80b4bc5e5c292cc36d1df8a8573ef2b88a"} err="failed to get container status \"ce2586126d7884ea6910fe1b15e6aa80b4bc5e5c292cc36d1df8a8573ef2b88a\": rpc error: code = NotFound desc = could not find container \"ce2586126d7884ea6910fe1b15e6aa80b4bc5e5c292cc36d1df8a8573ef2b88a\": container with ID starting with ce2586126d7884ea6910fe1b15e6aa80b4bc5e5c292cc36d1df8a8573ef2b88a not found: ID does not exist" Oct 14 13:27:00.714686 master-2 kubenswrapper[4762]: I1014 13:27:00.714644 4762 scope.go:117] "RemoveContainer" containerID="69e03e306b4fc0599756da1bcaf608658ab22550cdb495df2099fa070d1f1e5a" Oct 14 13:27:00.714907 master-2 kubenswrapper[4762]: E1014 13:27:00.714881 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69e03e306b4fc0599756da1bcaf608658ab22550cdb495df2099fa070d1f1e5a\": container with ID starting with 69e03e306b4fc0599756da1bcaf608658ab22550cdb495df2099fa070d1f1e5a not found: ID does not exist" containerID="69e03e306b4fc0599756da1bcaf608658ab22550cdb495df2099fa070d1f1e5a" Oct 14 
13:27:00.714958 master-2 kubenswrapper[4762]: I1014 13:27:00.714908 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69e03e306b4fc0599756da1bcaf608658ab22550cdb495df2099fa070d1f1e5a"} err="failed to get container status \"69e03e306b4fc0599756da1bcaf608658ab22550cdb495df2099fa070d1f1e5a\": rpc error: code = NotFound desc = could not find container \"69e03e306b4fc0599756da1bcaf608658ab22550cdb495df2099fa070d1f1e5a\": container with ID starting with 69e03e306b4fc0599756da1bcaf608658ab22550cdb495df2099fa070d1f1e5a not found: ID does not exist" Oct 14 13:27:00.714958 master-2 kubenswrapper[4762]: I1014 13:27:00.714930 4762 scope.go:117] "RemoveContainer" containerID="eed8af2f07f9d3b49a147d58436d080d78e0511d1542ac74b4feff6605710d6d" Oct 14 13:27:00.715696 master-2 kubenswrapper[4762]: E1014 13:27:00.715278 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eed8af2f07f9d3b49a147d58436d080d78e0511d1542ac74b4feff6605710d6d\": container with ID starting with eed8af2f07f9d3b49a147d58436d080d78e0511d1542ac74b4feff6605710d6d not found: ID does not exist" containerID="eed8af2f07f9d3b49a147d58436d080d78e0511d1542ac74b4feff6605710d6d" Oct 14 13:27:00.715696 master-2 kubenswrapper[4762]: I1014 13:27:00.715329 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eed8af2f07f9d3b49a147d58436d080d78e0511d1542ac74b4feff6605710d6d"} err="failed to get container status \"eed8af2f07f9d3b49a147d58436d080d78e0511d1542ac74b4feff6605710d6d\": rpc error: code = NotFound desc = could not find container \"eed8af2f07f9d3b49a147d58436d080d78e0511d1542ac74b4feff6605710d6d\": container with ID starting with eed8af2f07f9d3b49a147d58436d080d78e0511d1542ac74b4feff6605710d6d not found: ID does not exist" Oct 14 13:27:01.561796 master-2 kubenswrapper[4762]: I1014 13:27:01.561712 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9041570beb5002e8da158e70e12f0c16" path="/var/lib/kubelet/pods/9041570beb5002e8da158e70e12f0c16/volumes" Oct 14 13:27:01.614889 master-2 kubenswrapper[4762]: I1014 13:27:01.614821 4762 patch_prober.go:28] interesting pod/apiserver-84c8b8d745-p4css container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:27:01.614889 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:27:01.614889 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:27:01.614889 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:27:01.614889 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:27:01.614889 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:27:01.614889 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:27:01.614889 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:27:01.614889 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:27:01.614889 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartUserInformer ok Oct 14 13:27:01.614889 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartOAuthInformer ok Oct 14 13:27:01.614889 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Oct 14 13:27:01.614889 master-2 kubenswrapper[4762]: [-]shutdown failed: reason 
withheld Oct 14 13:27:01.614889 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:27:01.616303 master-2 kubenswrapper[4762]: I1014 13:27:01.614908 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-84c8b8d745-p4css" podUID="df155e80-7f1a-4919-b7a9-5df5cbb92c27" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:27:02.374911 master-2 kubenswrapper[4762]: I1014 13:27:02.374826 4762 patch_prober.go:28] interesting pod/apiserver-5f68d4c887-s2fvb container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:27:02.374911 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:27:02.374911 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:27:02.374911 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:27:02.374911 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:27:02.374911 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:27:02.374911 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:27:02.374911 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:27:02.374911 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:27:02.374911 master-2 kubenswrapper[4762]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 14 13:27:02.374911 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok Oct 14 13:27:02.374911 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Oct 14 13:27:02.374911 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectcache ok Oct 14 13:27:02.374911 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 14 13:27:02.374911 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startinformers ok Oct 14 13:27:02.374911 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 14 13:27:02.374911 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:27:02.374911 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:27:02.374911 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:27:02.374911 master-2 kubenswrapper[4762]: I1014 13:27:02.374887 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" podUID="66ea399c-a47a-41dd-91c2-cceaa9eca5bc" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:27:03.853007 master-2 kubenswrapper[4762]: I1014 13:27:03.852930 4762 patch_prober.go:28] interesting pod/kube-apiserver-guard-master-2 container/guard namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.34.12:6443/readyz\": dial tcp 192.168.34.12:6443: connect: connection refused" start-of-body= Oct 14 13:27:03.853693 master-2 kubenswrapper[4762]: I1014 13:27:03.853015 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-guard-master-2" podUID="a1d6199c-769e-4363-8439-75d433c50528" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.12:6443/readyz\": dial tcp 192.168.34.12:6443: 
connect: connection refused" Oct 14 13:27:06.616609 master-2 kubenswrapper[4762]: I1014 13:27:06.616528 4762 patch_prober.go:28] interesting pod/apiserver-84c8b8d745-p4css container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:27:06.616609 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:27:06.616609 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:27:06.616609 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:27:06.616609 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:27:06.616609 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:27:06.616609 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:27:06.616609 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:27:06.616609 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:27:06.616609 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartUserInformer ok Oct 14 13:27:06.616609 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartOAuthInformer ok Oct 14 13:27:06.616609 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Oct 14 13:27:06.616609 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:27:06.616609 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:27:06.617898 master-2 kubenswrapper[4762]: I1014 13:27:06.616619 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-84c8b8d745-p4css" podUID="df155e80-7f1a-4919-b7a9-5df5cbb92c27" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:27:07.373232 master-2 kubenswrapper[4762]: I1014 13:27:07.373115 4762 patch_prober.go:28] interesting pod/apiserver-5f68d4c887-s2fvb container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:27:07.373232 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:27:07.373232 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:27:07.373232 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:27:07.373232 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:27:07.373232 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:27:07.373232 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:27:07.373232 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:27:07.373232 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:27:07.373232 master-2 kubenswrapper[4762]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 14 13:27:07.373232 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok Oct 14 13:27:07.373232 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Oct 14 13:27:07.373232 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectcache ok Oct 14 13:27:07.373232 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 14 13:27:07.373232 master-2 kubenswrapper[4762]: 
[+]poststarthook/openshift.io-startinformers ok Oct 14 13:27:07.373232 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 14 13:27:07.373232 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:27:07.373232 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:27:07.373232 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:27:07.373232 master-2 kubenswrapper[4762]: I1014 13:27:07.373242 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" podUID="66ea399c-a47a-41dd-91c2-cceaa9eca5bc" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:27:07.374175 master-2 kubenswrapper[4762]: I1014 13:27:07.373363 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" Oct 14 13:27:07.436298 master-2 kubenswrapper[4762]: I1014 13:27:07.436134 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/installer-10-master-2" podStartSLOduration=9.436104809 podStartE2EDuration="9.436104809s" podCreationTimestamp="2025-10-14 13:26:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:27:00.609778369 +0000 UTC m=+1249.853937558" watchObservedRunningTime="2025-10-14 13:27:07.436104809 +0000 UTC m=+1256.680264008" Oct 14 13:27:07.548638 master-2 kubenswrapper[4762]: I1014 13:27:07.548530 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-2" Oct 14 13:27:07.568867 master-2 kubenswrapper[4762]: I1014 13:27:07.568799 4762 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-2" podUID="3a3084b8-6842-48a7-abe2-93f2c57aad66" Oct 14 13:27:07.568867 master-2 kubenswrapper[4762]: I1014 13:27:07.568854 4762 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-2" podUID="3a3084b8-6842-48a7-abe2-93f2c57aad66" Oct 14 13:27:07.590045 master-2 kubenswrapper[4762]: I1014 13:27:07.589990 4762 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-master-2" Oct 14 13:27:07.592142 master-2 kubenswrapper[4762]: I1014 13:27:07.592047 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-2"] Oct 14 13:27:07.608858 master-2 kubenswrapper[4762]: I1014 13:27:07.608773 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-2"] Oct 14 13:27:07.617387 master-2 kubenswrapper[4762]: I1014 13:27:07.616525 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-2" Oct 14 13:27:07.622100 master-2 kubenswrapper[4762]: I1014 13:27:07.622026 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-2"] Oct 14 13:27:07.646400 master-2 kubenswrapper[4762]: W1014 13:27:07.646339 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod978811670a28b21932e323b181b31435.slice/crio-9c9acc92a05c1d7d464dabaf4ab58b046ea717859cae00c66ed9176bc1edc008 WatchSource:0}: Error finding container 9c9acc92a05c1d7d464dabaf4ab58b046ea717859cae00c66ed9176bc1edc008: Status 404 returned error can't find the container with id 9c9acc92a05c1d7d464dabaf4ab58b046ea717859cae00c66ed9176bc1edc008 Oct 14 13:27:08.653962 master-2 kubenswrapper[4762]: I1014 13:27:08.653863 4762 generic.go:334] "Generic (PLEG): container finished" podID="978811670a28b21932e323b181b31435" containerID="af552c3799065a96cd2c1575bf89afb0b4604dc32761514583258ace07264294" exitCode=0 Oct 14 13:27:08.653962 master-2 kubenswrapper[4762]: I1014 13:27:08.653929 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-2" event={"ID":"978811670a28b21932e323b181b31435","Type":"ContainerDied","Data":"af552c3799065a96cd2c1575bf89afb0b4604dc32761514583258ace07264294"} Oct 14 13:27:08.653962 master-2 kubenswrapper[4762]: I1014 13:27:08.653961 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-2" event={"ID":"978811670a28b21932e323b181b31435","Type":"ContainerStarted","Data":"9c9acc92a05c1d7d464dabaf4ab58b046ea717859cae00c66ed9176bc1edc008"} Oct 14 13:27:08.853038 master-2 kubenswrapper[4762]: I1014 13:27:08.852983 4762 patch_prober.go:28] interesting pod/kube-apiserver-guard-master-2 container/guard namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.34.12:6443/readyz\": dial tcp 192.168.34.12:6443: connect: connection refused" start-of-body= Oct 14 13:27:08.853187 master-2 kubenswrapper[4762]: I1014 13:27:08.853050 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-guard-master-2" podUID="a1d6199c-769e-4363-8439-75d433c50528" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.12:6443/readyz\": dial tcp 192.168.34.12:6443: connect: connection refused" Oct 14 13:27:09.666177 master-2 kubenswrapper[4762]: I1014 13:27:09.666118 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-2" event={"ID":"978811670a28b21932e323b181b31435","Type":"ContainerStarted","Data":"dda5926ad7ecd4e5359fff6cfd1cfab8cfe274c521331109bb3094830ee09d08"} Oct 14 13:27:09.666177 master-2 kubenswrapper[4762]: I1014 13:27:09.666175 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-2" event={"ID":"978811670a28b21932e323b181b31435","Type":"ContainerStarted","Data":"ca6e078154d7a82848e330a08fa3fa715fc4e19d5b0357df2a4e323bcb80b9bc"} Oct 14 13:27:09.666177 master-2 kubenswrapper[4762]: I1014 13:27:09.666184 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-2" event={"ID":"978811670a28b21932e323b181b31435","Type":"ContainerStarted","Data":"533f662a0b533a61e8b1df790df4d54508e612b55c953bcf0141c3fc1277b52b"} Oct 14 13:27:10.674300 master-2 kubenswrapper[4762]: I1014 
13:27:10.674221 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-2" event={"ID":"978811670a28b21932e323b181b31435","Type":"ContainerStarted","Data":"7459e091ad25bcca080ffddfd900cb8c364c715d03c57f7dd26fc4c2bf12e24a"} Oct 14 13:27:10.674300 master-2 kubenswrapper[4762]: I1014 13:27:10.674273 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-2" event={"ID":"978811670a28b21932e323b181b31435","Type":"ContainerStarted","Data":"b1425701dbd4d05c046d1df8b81c0ea354e6bc4cefeccac5b99cfc5864f10e36"} Oct 14 13:27:10.675337 master-2 kubenswrapper[4762]: I1014 13:27:10.675287 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-2" Oct 14 13:27:10.712169 master-2 kubenswrapper[4762]: I1014 13:27:10.712079 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-master-2" podStartSLOduration=3.712059042 podStartE2EDuration="3.712059042s" podCreationTimestamp="2025-10-14 13:27:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:27:10.71106808 +0000 UTC m=+1259.955227249" watchObservedRunningTime="2025-10-14 13:27:10.712059042 +0000 UTC m=+1259.956218211" Oct 14 13:27:11.613504 master-2 kubenswrapper[4762]: I1014 13:27:11.613425 4762 patch_prober.go:28] interesting pod/apiserver-84c8b8d745-p4css container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:27:11.613504 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:27:11.613504 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:27:11.613504 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:27:11.613504 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:27:11.613504 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:27:11.613504 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:27:11.613504 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:27:11.613504 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:27:11.613504 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartUserInformer ok Oct 14 13:27:11.613504 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartOAuthInformer ok Oct 14 13:27:11.613504 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Oct 14 13:27:11.613504 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:27:11.613504 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:27:11.614043 master-2 kubenswrapper[4762]: I1014 13:27:11.613503 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-84c8b8d745-p4css" podUID="df155e80-7f1a-4919-b7a9-5df5cbb92c27" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:27:12.373726 master-2 kubenswrapper[4762]: I1014 13:27:12.373661 4762 patch_prober.go:28] interesting pod/apiserver-5f68d4c887-s2fvb container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 
14 13:27:12.373726 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:27:12.373726 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:27:12.373726 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:27:12.373726 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:27:12.373726 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:27:12.373726 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:27:12.373726 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:27:12.373726 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:27:12.373726 master-2 kubenswrapper[4762]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 14 13:27:12.373726 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok Oct 14 13:27:12.373726 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Oct 14 13:27:12.373726 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectcache ok Oct 14 13:27:12.373726 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 14 13:27:12.373726 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startinformers ok Oct 14 13:27:12.373726 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 14 13:27:12.373726 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:27:12.373726 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:27:12.373726 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:27:12.375803 master-2 kubenswrapper[4762]: I1014 13:27:12.373738 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" podUID="66ea399c-a47a-41dd-91c2-cceaa9eca5bc" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:27:12.617690 master-2 kubenswrapper[4762]: I1014 13:27:12.617581 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-master-2" Oct 14 13:27:12.617690 master-2 kubenswrapper[4762]: I1014 13:27:12.617657 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-2" Oct 14 13:27:12.627441 master-2 kubenswrapper[4762]: I1014 13:27:12.627328 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-master-2" Oct 14 13:27:12.697744 master-2 kubenswrapper[4762]: I1014 13:27:12.697641 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-2" Oct 14 13:27:13.860797 master-2 kubenswrapper[4762]: I1014 13:27:13.860728 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-guard-master-2" Oct 14 13:27:16.616905 master-2 kubenswrapper[4762]: I1014 13:27:16.616835 4762 patch_prober.go:28] interesting pod/apiserver-84c8b8d745-p4css container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:27:16.616905 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:27:16.616905 master-2 kubenswrapper[4762]: 
[+]etcd excluded: ok Oct 14 13:27:16.616905 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:27:16.616905 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:27:16.616905 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:27:16.616905 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:27:16.616905 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:27:16.616905 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:27:16.616905 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartUserInformer ok Oct 14 13:27:16.616905 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartOAuthInformer ok Oct 14 13:27:16.616905 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Oct 14 13:27:16.616905 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:27:16.616905 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:27:16.618426 master-2 kubenswrapper[4762]: I1014 13:27:16.616920 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-84c8b8d745-p4css" podUID="df155e80-7f1a-4919-b7a9-5df5cbb92c27" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:27:17.372543 master-2 kubenswrapper[4762]: I1014 13:27:17.372416 4762 patch_prober.go:28] interesting pod/apiserver-5f68d4c887-s2fvb container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:27:17.372543 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:27:17.372543 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:27:17.372543 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:27:17.372543 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:27:17.372543 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:27:17.372543 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:27:17.372543 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:27:17.372543 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:27:17.372543 master-2 kubenswrapper[4762]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 14 13:27:17.372543 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok Oct 14 13:27:17.372543 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Oct 14 13:27:17.372543 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectcache ok Oct 14 13:27:17.372543 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 14 13:27:17.372543 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startinformers ok Oct 14 13:27:17.372543 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 14 13:27:17.372543 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:27:17.372543 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:27:17.372543 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:27:17.373463 master-2 kubenswrapper[4762]: 
I1014 13:27:17.372568 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" podUID="66ea399c-a47a-41dd-91c2-cceaa9eca5bc" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:27:21.614917 master-2 kubenswrapper[4762]: I1014 13:27:21.614830 4762 patch_prober.go:28] interesting pod/apiserver-84c8b8d745-p4css container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:27:21.614917 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:27:21.614917 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:27:21.614917 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:27:21.614917 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:27:21.614917 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:27:21.614917 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:27:21.614917 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:27:21.614917 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:27:21.614917 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartUserInformer ok Oct 14 13:27:21.614917 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartOAuthInformer ok Oct 14 13:27:21.614917 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Oct 14 13:27:21.614917 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:27:21.614917 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:27:21.616678 master-2 kubenswrapper[4762]: I1014 13:27:21.614927 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-84c8b8d745-p4css" podUID="df155e80-7f1a-4919-b7a9-5df5cbb92c27" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:27:22.377407 master-2 kubenswrapper[4762]: I1014 13:27:22.377296 4762 patch_prober.go:28] interesting pod/apiserver-5f68d4c887-s2fvb container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:27:22.377407 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:27:22.377407 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:27:22.377407 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:27:22.377407 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:27:22.377407 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:27:22.377407 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:27:22.377407 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:27:22.377407 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:27:22.377407 master-2 kubenswrapper[4762]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 14 13:27:22.377407 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok Oct 14 13:27:22.377407 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Oct 14 13:27:22.377407 master-2 
kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectcache ok Oct 14 13:27:22.377407 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 14 13:27:22.377407 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startinformers ok Oct 14 13:27:22.377407 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 14 13:27:22.377407 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:27:22.377407 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:27:22.377407 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:27:22.377407 master-2 kubenswrapper[4762]: I1014 13:27:22.377388 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" podUID="66ea399c-a47a-41dd-91c2-cceaa9eca5bc" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:27:26.616041 master-2 kubenswrapper[4762]: I1014 13:27:26.615958 4762 patch_prober.go:28] interesting pod/apiserver-84c8b8d745-p4css container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:27:26.616041 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:27:26.616041 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:27:26.616041 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:27:26.616041 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:27:26.616041 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:27:26.616041 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:27:26.616041 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:27:26.616041 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:27:26.616041 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartUserInformer ok Oct 14 13:27:26.616041 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartOAuthInformer ok Oct 14 13:27:26.616041 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Oct 14 13:27:26.616041 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:27:26.616041 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:27:26.617609 master-2 kubenswrapper[4762]: I1014 13:27:26.616055 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-84c8b8d745-p4css" podUID="df155e80-7f1a-4919-b7a9-5df5cbb92c27" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:27:27.372090 master-2 kubenswrapper[4762]: I1014 13:27:27.372038 4762 patch_prober.go:28] interesting pod/apiserver-5f68d4c887-s2fvb container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:27:27.372090 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:27:27.372090 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:27:27.372090 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:27:27.372090 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:27:27.372090 master-2 kubenswrapper[4762]: 
[+]informer-sync ok Oct 14 13:27:27.372090 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:27:27.372090 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:27:27.372090 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:27:27.372090 master-2 kubenswrapper[4762]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 14 13:27:27.372090 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok Oct 14 13:27:27.372090 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Oct 14 13:27:27.372090 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectcache ok Oct 14 13:27:27.372090 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 14 13:27:27.372090 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startinformers ok Oct 14 13:27:27.372090 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 14 13:27:27.372090 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:27:27.372090 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:27:27.372090 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:27:27.372876 master-2 kubenswrapper[4762]: I1014 13:27:27.372095 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" podUID="66ea399c-a47a-41dd-91c2-cceaa9eca5bc" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:27:27.624639 master-2 kubenswrapper[4762]: I1014 13:27:27.624490 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-2" Oct 14 13:27:31.080689 master-2 kubenswrapper[4762]: I1014 13:27:31.080578 4762 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-etcd/etcd-master-2"] Oct 14 13:27:31.081697 master-2 kubenswrapper[4762]: I1014 13:27:31.081082 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-2" podUID="2c4a583adfee975da84510940117e71a" containerName="etcdctl" containerID="cri-o://3f91268e8f5c7ecd370dc313d1dc019c99ab69ffa99e1690f5d45a66f3f3be9d" gracePeriod=30 Oct 14 13:27:31.081697 master-2 kubenswrapper[4762]: I1014 13:27:31.081208 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-2" podUID="2c4a583adfee975da84510940117e71a" containerName="etcd-rev" containerID="cri-o://dea4154badc86da86e08dc8adfcdf16b04b6fb05ec19bc716137bdc9579ffcee" gracePeriod=30 Oct 14 13:27:31.081697 master-2 kubenswrapper[4762]: I1014 13:27:31.081256 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-2" podUID="2c4a583adfee975da84510940117e71a" containerName="etcd" containerID="cri-o://1b73e44d6a874336664c315d070c89fff83d384d584c26da8d5a7492d7b1ae59" gracePeriod=30 Oct 14 13:27:31.081697 master-2 kubenswrapper[4762]: I1014 13:27:31.081287 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-2" podUID="2c4a583adfee975da84510940117e71a" containerName="etcd-metrics" containerID="cri-o://d26868986c4341ffa9a0a23ef35041cefee7b967d7d9ee87bf0aeed31b68e690" gracePeriod=30 Oct 14 13:27:31.081697 master-2 
kubenswrapper[4762]: I1014 13:27:31.081349 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-2" podUID="2c4a583adfee975da84510940117e71a" containerName="etcd-readyz" containerID="cri-o://40d36d63e9ceeaccf501e24cb94ae4d10c179c30f3ce44780e8040f4e3011e5f" gracePeriod=30 Oct 14 13:27:31.088202 master-2 kubenswrapper[4762]: I1014 13:27:31.088089 4762 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-master-2"] Oct 14 13:27:31.088534 master-2 kubenswrapper[4762]: E1014 13:27:31.088447 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c4a583adfee975da84510940117e71a" containerName="setup" Oct 14 13:27:31.088534 master-2 kubenswrapper[4762]: I1014 13:27:31.088477 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c4a583adfee975da84510940117e71a" containerName="setup" Oct 14 13:27:31.088534 master-2 kubenswrapper[4762]: E1014 13:27:31.088500 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c4a583adfee975da84510940117e71a" containerName="etcd-resources-copy" Oct 14 13:27:31.088534 master-2 kubenswrapper[4762]: I1014 13:27:31.088510 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c4a583adfee975da84510940117e71a" containerName="etcd-resources-copy" Oct 14 13:27:31.088534 master-2 kubenswrapper[4762]: E1014 13:27:31.088529 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c4a583adfee975da84510940117e71a" containerName="etcd-readyz" Oct 14 13:27:31.088534 master-2 kubenswrapper[4762]: I1014 13:27:31.088540 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c4a583adfee975da84510940117e71a" containerName="etcd-readyz" Oct 14 13:27:31.089073 master-2 kubenswrapper[4762]: E1014 13:27:31.088559 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c4a583adfee975da84510940117e71a" containerName="etcd-metrics" Oct 14 13:27:31.089073 master-2 kubenswrapper[4762]: I1014 13:27:31.088570 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c4a583adfee975da84510940117e71a" containerName="etcd-metrics" Oct 14 13:27:31.089073 master-2 kubenswrapper[4762]: E1014 13:27:31.088749 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c4a583adfee975da84510940117e71a" containerName="etcd-ensure-env-vars" Oct 14 13:27:31.089073 master-2 kubenswrapper[4762]: I1014 13:27:31.088764 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c4a583adfee975da84510940117e71a" containerName="etcd-ensure-env-vars" Oct 14 13:27:31.089073 master-2 kubenswrapper[4762]: E1014 13:27:31.088804 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c4a583adfee975da84510940117e71a" containerName="etcd-rev" Oct 14 13:27:31.089073 master-2 kubenswrapper[4762]: I1014 13:27:31.088814 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c4a583adfee975da84510940117e71a" containerName="etcd-rev" Oct 14 13:27:31.089073 master-2 kubenswrapper[4762]: E1014 13:27:31.088826 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c4a583adfee975da84510940117e71a" containerName="etcdctl" Oct 14 13:27:31.089073 master-2 kubenswrapper[4762]: I1014 13:27:31.088837 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c4a583adfee975da84510940117e71a" containerName="etcdctl" Oct 14 13:27:31.089073 master-2 kubenswrapper[4762]: E1014 13:27:31.088853 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c4a583adfee975da84510940117e71a" containerName="etcd" Oct 14 
13:27:31.089073 master-2 kubenswrapper[4762]: I1014 13:27:31.088863 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c4a583adfee975da84510940117e71a" containerName="etcd" Oct 14 13:27:31.089073 master-2 kubenswrapper[4762]: I1014 13:27:31.089052 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c4a583adfee975da84510940117e71a" containerName="etcd" Oct 14 13:27:31.089073 master-2 kubenswrapper[4762]: I1014 13:27:31.089068 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c4a583adfee975da84510940117e71a" containerName="etcd-rev" Oct 14 13:27:31.089073 master-2 kubenswrapper[4762]: I1014 13:27:31.089085 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c4a583adfee975da84510940117e71a" containerName="etcd-metrics" Oct 14 13:27:31.089073 master-2 kubenswrapper[4762]: I1014 13:27:31.089097 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c4a583adfee975da84510940117e71a" containerName="etcd-readyz" Oct 14 13:27:31.090214 master-2 kubenswrapper[4762]: I1014 13:27:31.089116 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c4a583adfee975da84510940117e71a" containerName="etcdctl" Oct 14 13:27:31.252379 master-2 kubenswrapper[4762]: I1014 13:27:31.252337 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/cd7826f9db5842f000a071fd58a1ae79-resource-dir\") pod \"etcd-master-2\" (UID: \"cd7826f9db5842f000a071fd58a1ae79\") " pod="openshift-etcd/etcd-master-2" Oct 14 13:27:31.252472 master-2 kubenswrapper[4762]: I1014 13:27:31.252402 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/cd7826f9db5842f000a071fd58a1ae79-cert-dir\") pod \"etcd-master-2\" (UID: \"cd7826f9db5842f000a071fd58a1ae79\") " pod="openshift-etcd/etcd-master-2" Oct 14 13:27:31.252472 master-2 kubenswrapper[4762]: I1014 13:27:31.252421 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/cd7826f9db5842f000a071fd58a1ae79-static-pod-dir\") pod \"etcd-master-2\" (UID: \"cd7826f9db5842f000a071fd58a1ae79\") " pod="openshift-etcd/etcd-master-2" Oct 14 13:27:31.252472 master-2 kubenswrapper[4762]: I1014 13:27:31.252452 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/cd7826f9db5842f000a071fd58a1ae79-log-dir\") pod \"etcd-master-2\" (UID: \"cd7826f9db5842f000a071fd58a1ae79\") " pod="openshift-etcd/etcd-master-2" Oct 14 13:27:31.252472 master-2 kubenswrapper[4762]: I1014 13:27:31.252467 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/cd7826f9db5842f000a071fd58a1ae79-usr-local-bin\") pod \"etcd-master-2\" (UID: \"cd7826f9db5842f000a071fd58a1ae79\") " pod="openshift-etcd/etcd-master-2" Oct 14 13:27:31.252932 master-2 kubenswrapper[4762]: I1014 13:27:31.252896 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/cd7826f9db5842f000a071fd58a1ae79-data-dir\") pod \"etcd-master-2\" (UID: \"cd7826f9db5842f000a071fd58a1ae79\") " pod="openshift-etcd/etcd-master-2" Oct 14 13:27:31.354598 master-2 kubenswrapper[4762]: I1014 
13:27:31.354435 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/cd7826f9db5842f000a071fd58a1ae79-log-dir\") pod \"etcd-master-2\" (UID: \"cd7826f9db5842f000a071fd58a1ae79\") " pod="openshift-etcd/etcd-master-2" Oct 14 13:27:31.354598 master-2 kubenswrapper[4762]: I1014 13:27:31.354493 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/cd7826f9db5842f000a071fd58a1ae79-usr-local-bin\") pod \"etcd-master-2\" (UID: \"cd7826f9db5842f000a071fd58a1ae79\") " pod="openshift-etcd/etcd-master-2" Oct 14 13:27:31.354598 master-2 kubenswrapper[4762]: I1014 13:27:31.354530 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/cd7826f9db5842f000a071fd58a1ae79-data-dir\") pod \"etcd-master-2\" (UID: \"cd7826f9db5842f000a071fd58a1ae79\") " pod="openshift-etcd/etcd-master-2" Oct 14 13:27:31.354598 master-2 kubenswrapper[4762]: I1014 13:27:31.354551 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/cd7826f9db5842f000a071fd58a1ae79-log-dir\") pod \"etcd-master-2\" (UID: \"cd7826f9db5842f000a071fd58a1ae79\") " pod="openshift-etcd/etcd-master-2" Oct 14 13:27:31.354598 master-2 kubenswrapper[4762]: I1014 13:27:31.354572 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/cd7826f9db5842f000a071fd58a1ae79-resource-dir\") pod \"etcd-master-2\" (UID: \"cd7826f9db5842f000a071fd58a1ae79\") " pod="openshift-etcd/etcd-master-2" Oct 14 13:27:31.354598 master-2 kubenswrapper[4762]: I1014 13:27:31.354603 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/cd7826f9db5842f000a071fd58a1ae79-usr-local-bin\") pod \"etcd-master-2\" (UID: \"cd7826f9db5842f000a071fd58a1ae79\") " pod="openshift-etcd/etcd-master-2" Oct 14 13:27:31.355096 master-2 kubenswrapper[4762]: I1014 13:27:31.354627 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/cd7826f9db5842f000a071fd58a1ae79-resource-dir\") pod \"etcd-master-2\" (UID: \"cd7826f9db5842f000a071fd58a1ae79\") " pod="openshift-etcd/etcd-master-2" Oct 14 13:27:31.355096 master-2 kubenswrapper[4762]: I1014 13:27:31.354642 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/cd7826f9db5842f000a071fd58a1ae79-cert-dir\") pod \"etcd-master-2\" (UID: \"cd7826f9db5842f000a071fd58a1ae79\") " pod="openshift-etcd/etcd-master-2" Oct 14 13:27:31.355096 master-2 kubenswrapper[4762]: I1014 13:27:31.354677 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/cd7826f9db5842f000a071fd58a1ae79-static-pod-dir\") pod \"etcd-master-2\" (UID: \"cd7826f9db5842f000a071fd58a1ae79\") " pod="openshift-etcd/etcd-master-2" Oct 14 13:27:31.355096 master-2 kubenswrapper[4762]: I1014 13:27:31.354679 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/cd7826f9db5842f000a071fd58a1ae79-cert-dir\") pod \"etcd-master-2\" (UID: \"cd7826f9db5842f000a071fd58a1ae79\") " pod="openshift-etcd/etcd-master-2" Oct 14 13:27:31.355096 master-2 
kubenswrapper[4762]: I1014 13:27:31.354704 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/cd7826f9db5842f000a071fd58a1ae79-data-dir\") pod \"etcd-master-2\" (UID: \"cd7826f9db5842f000a071fd58a1ae79\") " pod="openshift-etcd/etcd-master-2" Oct 14 13:27:31.355096 master-2 kubenswrapper[4762]: I1014 13:27:31.354790 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/cd7826f9db5842f000a071fd58a1ae79-static-pod-dir\") pod \"etcd-master-2\" (UID: \"cd7826f9db5842f000a071fd58a1ae79\") " pod="openshift-etcd/etcd-master-2" Oct 14 13:27:31.595556 master-2 kubenswrapper[4762]: I1014 13:27:31.595440 4762 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-etcd/etcd-master-2" oldPodUID="2c4a583adfee975da84510940117e71a" podUID="cd7826f9db5842f000a071fd58a1ae79" Oct 14 13:27:31.617065 master-2 kubenswrapper[4762]: I1014 13:27:31.615676 4762 patch_prober.go:28] interesting pod/apiserver-84c8b8d745-p4css container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:27:31.617065 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:27:31.617065 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:27:31.617065 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:27:31.617065 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:27:31.617065 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:27:31.617065 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:27:31.617065 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:27:31.617065 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:27:31.617065 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartUserInformer ok Oct 14 13:27:31.617065 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartOAuthInformer ok Oct 14 13:27:31.617065 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Oct 14 13:27:31.617065 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:27:31.617065 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:27:31.617065 master-2 kubenswrapper[4762]: I1014 13:27:31.615734 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-84c8b8d745-p4css" podUID="df155e80-7f1a-4919-b7a9-5df5cbb92c27" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:27:31.829565 master-2 kubenswrapper[4762]: I1014 13:27:31.829463 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-2_2c4a583adfee975da84510940117e71a/etcd-rev/0.log" Oct 14 13:27:31.831399 master-2 kubenswrapper[4762]: I1014 13:27:31.831344 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-2_2c4a583adfee975da84510940117e71a/etcd-metrics/0.log" Oct 14 13:27:31.834445 master-2 kubenswrapper[4762]: I1014 13:27:31.834374 4762 generic.go:334] "Generic (PLEG): container finished" podID="2c4a583adfee975da84510940117e71a" containerID="dea4154badc86da86e08dc8adfcdf16b04b6fb05ec19bc716137bdc9579ffcee" exitCode=2 Oct 14 13:27:31.834445 master-2 
kubenswrapper[4762]: I1014 13:27:31.834425 4762 generic.go:334] "Generic (PLEG): container finished" podID="2c4a583adfee975da84510940117e71a" containerID="40d36d63e9ceeaccf501e24cb94ae4d10c179c30f3ce44780e8040f4e3011e5f" exitCode=0 Oct 14 13:27:31.834445 master-2 kubenswrapper[4762]: I1014 13:27:31.834441 4762 generic.go:334] "Generic (PLEG): container finished" podID="2c4a583adfee975da84510940117e71a" containerID="d26868986c4341ffa9a0a23ef35041cefee7b967d7d9ee87bf0aeed31b68e690" exitCode=2 Oct 14 13:27:31.837004 master-2 kubenswrapper[4762]: I1014 13:27:31.836940 4762 generic.go:334] "Generic (PLEG): container finished" podID="e8558a2f-5ea7-42a3-b00d-2ffbb553f642" containerID="090eaf196ae20f28142493f42c17816526e3b37829abd5890ab50883ad601d6b" exitCode=0 Oct 14 13:27:31.837215 master-2 kubenswrapper[4762]: I1014 13:27:31.836995 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-10-master-2" event={"ID":"e8558a2f-5ea7-42a3-b00d-2ffbb553f642","Type":"ContainerDied","Data":"090eaf196ae20f28142493f42c17816526e3b37829abd5890ab50883ad601d6b"} Oct 14 13:27:32.372668 master-2 kubenswrapper[4762]: I1014 13:27:32.372611 4762 patch_prober.go:28] interesting pod/apiserver-5f68d4c887-s2fvb container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:27:32.372668 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:27:32.372668 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:27:32.372668 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:27:32.372668 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:27:32.372668 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:27:32.372668 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:27:32.372668 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:27:32.372668 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:27:32.372668 master-2 kubenswrapper[4762]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 14 13:27:32.372668 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok Oct 14 13:27:32.372668 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Oct 14 13:27:32.372668 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectcache ok Oct 14 13:27:32.372668 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 14 13:27:32.372668 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startinformers ok Oct 14 13:27:32.372668 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 14 13:27:32.372668 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:27:32.372668 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:27:32.372668 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:27:32.374423 master-2 kubenswrapper[4762]: I1014 13:27:32.373319 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" podUID="66ea399c-a47a-41dd-91c2-cceaa9eca5bc" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:27:33.161295 master-2 
kubenswrapper[4762]: I1014 13:27:33.161243 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-10-master-2" Oct 14 13:27:33.181761 master-2 kubenswrapper[4762]: I1014 13:27:33.181703 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e8558a2f-5ea7-42a3-b00d-2ffbb553f642-var-lock\") pod \"e8558a2f-5ea7-42a3-b00d-2ffbb553f642\" (UID: \"e8558a2f-5ea7-42a3-b00d-2ffbb553f642\") " Oct 14 13:27:33.181894 master-2 kubenswrapper[4762]: I1014 13:27:33.181789 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e8558a2f-5ea7-42a3-b00d-2ffbb553f642-kube-api-access\") pod \"e8558a2f-5ea7-42a3-b00d-2ffbb553f642\" (UID: \"e8558a2f-5ea7-42a3-b00d-2ffbb553f642\") " Oct 14 13:27:33.181894 master-2 kubenswrapper[4762]: I1014 13:27:33.181832 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e8558a2f-5ea7-42a3-b00d-2ffbb553f642-kubelet-dir\") pod \"e8558a2f-5ea7-42a3-b00d-2ffbb553f642\" (UID: \"e8558a2f-5ea7-42a3-b00d-2ffbb553f642\") " Oct 14 13:27:33.182062 master-2 kubenswrapper[4762]: I1014 13:27:33.181956 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e8558a2f-5ea7-42a3-b00d-2ffbb553f642-var-lock" (OuterVolumeSpecName: "var-lock") pod "e8558a2f-5ea7-42a3-b00d-2ffbb553f642" (UID: "e8558a2f-5ea7-42a3-b00d-2ffbb553f642"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:27:33.182109 master-2 kubenswrapper[4762]: I1014 13:27:33.182077 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e8558a2f-5ea7-42a3-b00d-2ffbb553f642-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e8558a2f-5ea7-42a3-b00d-2ffbb553f642" (UID: "e8558a2f-5ea7-42a3-b00d-2ffbb553f642"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:27:33.187403 master-2 kubenswrapper[4762]: I1014 13:27:33.187342 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8558a2f-5ea7-42a3-b00d-2ffbb553f642-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e8558a2f-5ea7-42a3-b00d-2ffbb553f642" (UID: "e8558a2f-5ea7-42a3-b00d-2ffbb553f642"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:27:33.282995 master-2 kubenswrapper[4762]: I1014 13:27:33.282895 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e8558a2f-5ea7-42a3-b00d-2ffbb553f642-kube-api-access\") on node \"master-2\" DevicePath \"\"" Oct 14 13:27:33.282995 master-2 kubenswrapper[4762]: I1014 13:27:33.282943 4762 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e8558a2f-5ea7-42a3-b00d-2ffbb553f642-kubelet-dir\") on node \"master-2\" DevicePath \"\"" Oct 14 13:27:33.282995 master-2 kubenswrapper[4762]: I1014 13:27:33.282956 4762 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e8558a2f-5ea7-42a3-b00d-2ffbb553f642-var-lock\") on node \"master-2\" DevicePath \"\"" Oct 14 13:27:33.849901 master-2 kubenswrapper[4762]: I1014 13:27:33.849748 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-10-master-2" event={"ID":"e8558a2f-5ea7-42a3-b00d-2ffbb553f642","Type":"ContainerDied","Data":"8607dcf920a7dcf04e80e33639f54fd94e16bf136914cc3931acf7abd5f9a0a9"} Oct 14 13:27:33.849901 master-2 kubenswrapper[4762]: I1014 13:27:33.849794 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8607dcf920a7dcf04e80e33639f54fd94e16bf136914cc3931acf7abd5f9a0a9" Oct 14 13:27:33.849901 master-2 kubenswrapper[4762]: I1014 13:27:33.849859 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-10-master-2" Oct 14 13:27:35.335533 master-2 kubenswrapper[4762]: I1014 13:27:35.335459 4762 patch_prober.go:28] interesting pod/etcd-guard-master-2 container/guard namespace/openshift-etcd: Readiness probe status=failure output="Get \"https://192.168.34.12:9980/readyz\": dial tcp 192.168.34.12:9980: connect: connection refused" start-of-body= Oct 14 13:27:35.336585 master-2 kubenswrapper[4762]: I1014 13:27:35.335732 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-etcd/etcd-guard-master-2" podUID="1b6a1dbe-f753-4c92-8b36-47517010f2f3" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.12:9980/readyz\": dial tcp 192.168.34.12:9980: connect: connection refused" Oct 14 13:27:35.865361 master-2 kubenswrapper[4762]: I1014 13:27:35.865323 4762 generic.go:334] "Generic (PLEG): container finished" podID="df155e80-7f1a-4919-b7a9-5df5cbb92c27" containerID="ba902ca859f0fcfe992aebe277dcc7b1ce0c63eee0caf6006314ea48d7bec6a3" exitCode=0 Oct 14 13:27:35.865565 master-2 kubenswrapper[4762]: I1014 13:27:35.865422 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-84c8b8d745-p4css" event={"ID":"df155e80-7f1a-4919-b7a9-5df5cbb92c27","Type":"ContainerDied","Data":"ba902ca859f0fcfe992aebe277dcc7b1ce0c63eee0caf6006314ea48d7bec6a3"} Oct 14 13:27:36.130681 master-2 kubenswrapper[4762]: I1014 13:27:36.130644 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-84c8b8d745-p4css" Oct 14 13:27:36.225111 master-2 kubenswrapper[4762]: I1014 13:27:36.225014 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/df155e80-7f1a-4919-b7a9-5df5cbb92c27-etcd-serving-ca\") pod \"df155e80-7f1a-4919-b7a9-5df5cbb92c27\" (UID: \"df155e80-7f1a-4919-b7a9-5df5cbb92c27\") " Oct 14 13:27:36.225398 master-2 kubenswrapper[4762]: I1014 13:27:36.225209 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df155e80-7f1a-4919-b7a9-5df5cbb92c27-trusted-ca-bundle\") pod \"df155e80-7f1a-4919-b7a9-5df5cbb92c27\" (UID: \"df155e80-7f1a-4919-b7a9-5df5cbb92c27\") " Oct 14 13:27:36.225398 master-2 kubenswrapper[4762]: I1014 13:27:36.225292 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df155e80-7f1a-4919-b7a9-5df5cbb92c27-serving-cert\") pod \"df155e80-7f1a-4919-b7a9-5df5cbb92c27\" (UID: \"df155e80-7f1a-4919-b7a9-5df5cbb92c27\") " Oct 14 13:27:36.225398 master-2 kubenswrapper[4762]: I1014 13:27:36.225367 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7bzj\" (UniqueName: \"kubernetes.io/projected/df155e80-7f1a-4919-b7a9-5df5cbb92c27-kube-api-access-s7bzj\") pod \"df155e80-7f1a-4919-b7a9-5df5cbb92c27\" (UID: \"df155e80-7f1a-4919-b7a9-5df5cbb92c27\") " Oct 14 13:27:36.225598 master-2 kubenswrapper[4762]: I1014 13:27:36.225468 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/df155e80-7f1a-4919-b7a9-5df5cbb92c27-audit-policies\") pod \"df155e80-7f1a-4919-b7a9-5df5cbb92c27\" (UID: \"df155e80-7f1a-4919-b7a9-5df5cbb92c27\") " Oct 14 13:27:36.226267 master-2 kubenswrapper[4762]: I1014 13:27:36.225659 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/df155e80-7f1a-4919-b7a9-5df5cbb92c27-encryption-config\") pod \"df155e80-7f1a-4919-b7a9-5df5cbb92c27\" (UID: \"df155e80-7f1a-4919-b7a9-5df5cbb92c27\") " Oct 14 13:27:36.226267 master-2 kubenswrapper[4762]: I1014 13:27:36.225738 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/df155e80-7f1a-4919-b7a9-5df5cbb92c27-audit-dir\") pod \"df155e80-7f1a-4919-b7a9-5df5cbb92c27\" (UID: \"df155e80-7f1a-4919-b7a9-5df5cbb92c27\") " Oct 14 13:27:36.226267 master-2 kubenswrapper[4762]: I1014 13:27:36.225829 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/df155e80-7f1a-4919-b7a9-5df5cbb92c27-etcd-client\") pod \"df155e80-7f1a-4919-b7a9-5df5cbb92c27\" (UID: \"df155e80-7f1a-4919-b7a9-5df5cbb92c27\") " Oct 14 13:27:36.226267 master-2 kubenswrapper[4762]: I1014 13:27:36.225945 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/df155e80-7f1a-4919-b7a9-5df5cbb92c27-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "df155e80-7f1a-4919-b7a9-5df5cbb92c27" (UID: "df155e80-7f1a-4919-b7a9-5df5cbb92c27"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:27:36.226267 master-2 kubenswrapper[4762]: I1014 13:27:36.226229 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df155e80-7f1a-4919-b7a9-5df5cbb92c27-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "df155e80-7f1a-4919-b7a9-5df5cbb92c27" (UID: "df155e80-7f1a-4919-b7a9-5df5cbb92c27"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:27:36.226573 master-2 kubenswrapper[4762]: I1014 13:27:36.226390 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df155e80-7f1a-4919-b7a9-5df5cbb92c27-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "df155e80-7f1a-4919-b7a9-5df5cbb92c27" (UID: "df155e80-7f1a-4919-b7a9-5df5cbb92c27"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:27:36.227201 master-2 kubenswrapper[4762]: I1014 13:27:36.227150 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df155e80-7f1a-4919-b7a9-5df5cbb92c27-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "df155e80-7f1a-4919-b7a9-5df5cbb92c27" (UID: "df155e80-7f1a-4919-b7a9-5df5cbb92c27"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:27:36.227965 master-2 kubenswrapper[4762]: I1014 13:27:36.227516 4762 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/df155e80-7f1a-4919-b7a9-5df5cbb92c27-etcd-serving-ca\") on node \"master-2\" DevicePath \"\"" Oct 14 13:27:36.227965 master-2 kubenswrapper[4762]: I1014 13:27:36.227573 4762 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df155e80-7f1a-4919-b7a9-5df5cbb92c27-trusted-ca-bundle\") on node \"master-2\" DevicePath \"\"" Oct 14 13:27:36.227965 master-2 kubenswrapper[4762]: I1014 13:27:36.227666 4762 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/df155e80-7f1a-4919-b7a9-5df5cbb92c27-audit-policies\") on node \"master-2\" DevicePath \"\"" Oct 14 13:27:36.227965 master-2 kubenswrapper[4762]: I1014 13:27:36.227691 4762 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/df155e80-7f1a-4919-b7a9-5df5cbb92c27-audit-dir\") on node \"master-2\" DevicePath \"\"" Oct 14 13:27:36.229662 master-2 kubenswrapper[4762]: I1014 13:27:36.229585 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df155e80-7f1a-4919-b7a9-5df5cbb92c27-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "df155e80-7f1a-4919-b7a9-5df5cbb92c27" (UID: "df155e80-7f1a-4919-b7a9-5df5cbb92c27"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:27:36.231127 master-2 kubenswrapper[4762]: I1014 13:27:36.231075 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df155e80-7f1a-4919-b7a9-5df5cbb92c27-kube-api-access-s7bzj" (OuterVolumeSpecName: "kube-api-access-s7bzj") pod "df155e80-7f1a-4919-b7a9-5df5cbb92c27" (UID: "df155e80-7f1a-4919-b7a9-5df5cbb92c27"). InnerVolumeSpecName "kube-api-access-s7bzj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:27:36.231281 master-2 kubenswrapper[4762]: I1014 13:27:36.231116 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df155e80-7f1a-4919-b7a9-5df5cbb92c27-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "df155e80-7f1a-4919-b7a9-5df5cbb92c27" (UID: "df155e80-7f1a-4919-b7a9-5df5cbb92c27"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:27:36.231557 master-2 kubenswrapper[4762]: I1014 13:27:36.231490 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df155e80-7f1a-4919-b7a9-5df5cbb92c27-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "df155e80-7f1a-4919-b7a9-5df5cbb92c27" (UID: "df155e80-7f1a-4919-b7a9-5df5cbb92c27"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:27:36.329287 master-2 kubenswrapper[4762]: I1014 13:27:36.329142 4762 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/df155e80-7f1a-4919-b7a9-5df5cbb92c27-encryption-config\") on node \"master-2\" DevicePath \"\"" Oct 14 13:27:36.329287 master-2 kubenswrapper[4762]: I1014 13:27:36.329202 4762 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/df155e80-7f1a-4919-b7a9-5df5cbb92c27-etcd-client\") on node \"master-2\" DevicePath \"\"" Oct 14 13:27:36.329287 master-2 kubenswrapper[4762]: I1014 13:27:36.329216 4762 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df155e80-7f1a-4919-b7a9-5df5cbb92c27-serving-cert\") on node \"master-2\" DevicePath \"\"" Oct 14 13:27:36.329287 master-2 kubenswrapper[4762]: I1014 13:27:36.329229 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7bzj\" (UniqueName: \"kubernetes.io/projected/df155e80-7f1a-4919-b7a9-5df5cbb92c27-kube-api-access-s7bzj\") on node \"master-2\" DevicePath \"\"" Oct 14 13:27:36.875098 master-2 kubenswrapper[4762]: I1014 13:27:36.875048 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-84c8b8d745-p4css" event={"ID":"df155e80-7f1a-4919-b7a9-5df5cbb92c27","Type":"ContainerDied","Data":"0d8df32e5d37443da680373c58a291c75e438acf1328cfa8131c4168ef686ed5"} Oct 14 13:27:36.875880 master-2 kubenswrapper[4762]: I1014 13:27:36.875223 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-84c8b8d745-p4css" Oct 14 13:27:36.875990 master-2 kubenswrapper[4762]: I1014 13:27:36.875750 4762 scope.go:117] "RemoveContainer" containerID="ba902ca859f0fcfe992aebe277dcc7b1ce0c63eee0caf6006314ea48d7bec6a3" Oct 14 13:27:36.908300 master-2 kubenswrapper[4762]: I1014 13:27:36.908082 4762 scope.go:117] "RemoveContainer" containerID="eb606b021f2b7421df740b33c8a58df29792e7ef7cffd19b438bd855e5061ed9" Oct 14 13:27:36.939572 master-2 kubenswrapper[4762]: I1014 13:27:36.939511 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-oauth-apiserver/apiserver-84c8b8d745-p4css"] Oct 14 13:27:36.943871 master-2 kubenswrapper[4762]: I1014 13:27:36.943803 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-oauth-apiserver/apiserver-84c8b8d745-p4css"] Oct 14 13:27:37.374911 master-2 kubenswrapper[4762]: I1014 13:27:37.374811 4762 patch_prober.go:28] interesting pod/apiserver-5f68d4c887-s2fvb container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:27:37.374911 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:27:37.374911 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:27:37.374911 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:27:37.374911 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:27:37.374911 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:27:37.374911 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:27:37.374911 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:27:37.374911 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:27:37.374911 master-2 kubenswrapper[4762]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 14 13:27:37.374911 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok Oct 14 13:27:37.374911 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Oct 14 13:27:37.374911 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectcache ok Oct 14 13:27:37.374911 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 14 13:27:37.374911 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startinformers ok Oct 14 13:27:37.374911 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 14 13:27:37.374911 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:27:37.374911 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:27:37.374911 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:27:37.375834 master-2 kubenswrapper[4762]: I1014 13:27:37.374929 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" podUID="66ea399c-a47a-41dd-91c2-cceaa9eca5bc" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:27:37.560948 master-2 kubenswrapper[4762]: I1014 13:27:37.560854 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df155e80-7f1a-4919-b7a9-5df5cbb92c27" path="/var/lib/kubelet/pods/df155e80-7f1a-4919-b7a9-5df5cbb92c27/volumes" Oct 14 13:27:40.335848 
master-2 kubenswrapper[4762]: I1014 13:27:40.335772 4762 patch_prober.go:28] interesting pod/etcd-guard-master-2 container/guard namespace/openshift-etcd: Readiness probe status=failure output="Get \"https://192.168.34.12:9980/readyz\": dial tcp 192.168.34.12:9980: connect: connection refused" start-of-body= Oct 14 13:27:40.336990 master-2 kubenswrapper[4762]: I1014 13:27:40.335853 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-etcd/etcd-guard-master-2" podUID="1b6a1dbe-f753-4c92-8b36-47517010f2f3" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.12:9980/readyz\": dial tcp 192.168.34.12:9980: connect: connection refused" Oct 14 13:27:41.539993 master-2 kubenswrapper[4762]: I1014 13:27:41.539875 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7b6784d654-8vpmp"] Oct 14 13:27:41.540822 master-2 kubenswrapper[4762]: E1014 13:27:41.540261 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df155e80-7f1a-4919-b7a9-5df5cbb92c27" containerName="oauth-apiserver" Oct 14 13:27:41.540822 master-2 kubenswrapper[4762]: I1014 13:27:41.540281 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="df155e80-7f1a-4919-b7a9-5df5cbb92c27" containerName="oauth-apiserver" Oct 14 13:27:41.540822 master-2 kubenswrapper[4762]: E1014 13:27:41.540299 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8558a2f-5ea7-42a3-b00d-2ffbb553f642" containerName="installer" Oct 14 13:27:41.540822 master-2 kubenswrapper[4762]: I1014 13:27:41.540306 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8558a2f-5ea7-42a3-b00d-2ffbb553f642" containerName="installer" Oct 14 13:27:41.540822 master-2 kubenswrapper[4762]: E1014 13:27:41.540321 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df155e80-7f1a-4919-b7a9-5df5cbb92c27" containerName="fix-audit-permissions" Oct 14 13:27:41.540822 master-2 kubenswrapper[4762]: I1014 13:27:41.540328 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="df155e80-7f1a-4919-b7a9-5df5cbb92c27" containerName="fix-audit-permissions" Oct 14 13:27:41.540822 master-2 kubenswrapper[4762]: I1014 13:27:41.540438 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8558a2f-5ea7-42a3-b00d-2ffbb553f642" containerName="installer" Oct 14 13:27:41.540822 master-2 kubenswrapper[4762]: I1014 13:27:41.540451 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="df155e80-7f1a-4919-b7a9-5df5cbb92c27" containerName="oauth-apiserver" Oct 14 13:27:41.541481 master-2 kubenswrapper[4762]: I1014 13:27:41.541314 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7b6784d654-8vpmp" Oct 14 13:27:41.545552 master-2 kubenswrapper[4762]: I1014 13:27:41.545483 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-8gpjk" Oct 14 13:27:41.545745 master-2 kubenswrapper[4762]: I1014 13:27:41.545680 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 14 13:27:41.545856 master-2 kubenswrapper[4762]: I1014 13:27:41.545792 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 14 13:27:41.546177 master-2 kubenswrapper[4762]: I1014 13:27:41.545904 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 14 13:27:41.546512 master-2 kubenswrapper[4762]: I1014 13:27:41.546464 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 14 13:27:41.546661 master-2 kubenswrapper[4762]: I1014 13:27:41.546571 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 14 13:27:41.547018 master-2 kubenswrapper[4762]: I1014 13:27:41.546988 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 14 13:27:41.547277 master-2 kubenswrapper[4762]: I1014 13:27:41.547236 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 14 13:27:41.548546 master-2 kubenswrapper[4762]: I1014 13:27:41.548497 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 14 13:27:41.564356 master-2 kubenswrapper[4762]: I1014 13:27:41.564285 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7b6784d654-8vpmp"] Oct 14 13:27:41.703483 master-2 kubenswrapper[4762]: I1014 13:27:41.703379 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f841a64-d2fd-44ed-b3e6-acdc127cacfc-serving-cert\") pod \"apiserver-7b6784d654-8vpmp\" (UID: \"7f841a64-d2fd-44ed-b3e6-acdc127cacfc\") " pod="openshift-oauth-apiserver/apiserver-7b6784d654-8vpmp" Oct 14 13:27:41.703483 master-2 kubenswrapper[4762]: I1014 13:27:41.703479 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7f841a64-d2fd-44ed-b3e6-acdc127cacfc-encryption-config\") pod \"apiserver-7b6784d654-8vpmp\" (UID: \"7f841a64-d2fd-44ed-b3e6-acdc127cacfc\") " pod="openshift-oauth-apiserver/apiserver-7b6784d654-8vpmp" Oct 14 13:27:41.703890 master-2 kubenswrapper[4762]: I1014 13:27:41.703514 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7f841a64-d2fd-44ed-b3e6-acdc127cacfc-audit-policies\") pod \"apiserver-7b6784d654-8vpmp\" (UID: \"7f841a64-d2fd-44ed-b3e6-acdc127cacfc\") " pod="openshift-oauth-apiserver/apiserver-7b6784d654-8vpmp" Oct 14 13:27:41.703890 master-2 kubenswrapper[4762]: I1014 13:27:41.703553 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9l96\" (UniqueName: 
\"kubernetes.io/projected/7f841a64-d2fd-44ed-b3e6-acdc127cacfc-kube-api-access-w9l96\") pod \"apiserver-7b6784d654-8vpmp\" (UID: \"7f841a64-d2fd-44ed-b3e6-acdc127cacfc\") " pod="openshift-oauth-apiserver/apiserver-7b6784d654-8vpmp" Oct 14 13:27:41.703890 master-2 kubenswrapper[4762]: I1014 13:27:41.703577 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7f841a64-d2fd-44ed-b3e6-acdc127cacfc-audit-dir\") pod \"apiserver-7b6784d654-8vpmp\" (UID: \"7f841a64-d2fd-44ed-b3e6-acdc127cacfc\") " pod="openshift-oauth-apiserver/apiserver-7b6784d654-8vpmp" Oct 14 13:27:41.703890 master-2 kubenswrapper[4762]: I1014 13:27:41.703653 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7f841a64-d2fd-44ed-b3e6-acdc127cacfc-etcd-serving-ca\") pod \"apiserver-7b6784d654-8vpmp\" (UID: \"7f841a64-d2fd-44ed-b3e6-acdc127cacfc\") " pod="openshift-oauth-apiserver/apiserver-7b6784d654-8vpmp" Oct 14 13:27:41.703890 master-2 kubenswrapper[4762]: I1014 13:27:41.703678 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7f841a64-d2fd-44ed-b3e6-acdc127cacfc-etcd-client\") pod \"apiserver-7b6784d654-8vpmp\" (UID: \"7f841a64-d2fd-44ed-b3e6-acdc127cacfc\") " pod="openshift-oauth-apiserver/apiserver-7b6784d654-8vpmp" Oct 14 13:27:41.704364 master-2 kubenswrapper[4762]: I1014 13:27:41.703903 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7f841a64-d2fd-44ed-b3e6-acdc127cacfc-trusted-ca-bundle\") pod \"apiserver-7b6784d654-8vpmp\" (UID: \"7f841a64-d2fd-44ed-b3e6-acdc127cacfc\") " pod="openshift-oauth-apiserver/apiserver-7b6784d654-8vpmp" Oct 14 13:27:41.804899 master-2 kubenswrapper[4762]: I1014 13:27:41.804734 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7f841a64-d2fd-44ed-b3e6-acdc127cacfc-etcd-client\") pod \"apiserver-7b6784d654-8vpmp\" (UID: \"7f841a64-d2fd-44ed-b3e6-acdc127cacfc\") " pod="openshift-oauth-apiserver/apiserver-7b6784d654-8vpmp" Oct 14 13:27:41.805261 master-2 kubenswrapper[4762]: I1014 13:27:41.805245 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7f841a64-d2fd-44ed-b3e6-acdc127cacfc-trusted-ca-bundle\") pod \"apiserver-7b6784d654-8vpmp\" (UID: \"7f841a64-d2fd-44ed-b3e6-acdc127cacfc\") " pod="openshift-oauth-apiserver/apiserver-7b6784d654-8vpmp" Oct 14 13:27:41.805375 master-2 kubenswrapper[4762]: I1014 13:27:41.805361 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f841a64-d2fd-44ed-b3e6-acdc127cacfc-serving-cert\") pod \"apiserver-7b6784d654-8vpmp\" (UID: \"7f841a64-d2fd-44ed-b3e6-acdc127cacfc\") " pod="openshift-oauth-apiserver/apiserver-7b6784d654-8vpmp" Oct 14 13:27:41.805485 master-2 kubenswrapper[4762]: I1014 13:27:41.805471 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7f841a64-d2fd-44ed-b3e6-acdc127cacfc-encryption-config\") pod \"apiserver-7b6784d654-8vpmp\" (UID: \"7f841a64-d2fd-44ed-b3e6-acdc127cacfc\") " 
pod="openshift-oauth-apiserver/apiserver-7b6784d654-8vpmp" Oct 14 13:27:41.805578 master-2 kubenswrapper[4762]: I1014 13:27:41.805563 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7f841a64-d2fd-44ed-b3e6-acdc127cacfc-audit-policies\") pod \"apiserver-7b6784d654-8vpmp\" (UID: \"7f841a64-d2fd-44ed-b3e6-acdc127cacfc\") " pod="openshift-oauth-apiserver/apiserver-7b6784d654-8vpmp" Oct 14 13:27:41.805677 master-2 kubenswrapper[4762]: I1014 13:27:41.805663 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9l96\" (UniqueName: \"kubernetes.io/projected/7f841a64-d2fd-44ed-b3e6-acdc127cacfc-kube-api-access-w9l96\") pod \"apiserver-7b6784d654-8vpmp\" (UID: \"7f841a64-d2fd-44ed-b3e6-acdc127cacfc\") " pod="openshift-oauth-apiserver/apiserver-7b6784d654-8vpmp" Oct 14 13:27:41.805753 master-2 kubenswrapper[4762]: I1014 13:27:41.805740 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7f841a64-d2fd-44ed-b3e6-acdc127cacfc-audit-dir\") pod \"apiserver-7b6784d654-8vpmp\" (UID: \"7f841a64-d2fd-44ed-b3e6-acdc127cacfc\") " pod="openshift-oauth-apiserver/apiserver-7b6784d654-8vpmp" Oct 14 13:27:41.805846 master-2 kubenswrapper[4762]: I1014 13:27:41.805834 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7f841a64-d2fd-44ed-b3e6-acdc127cacfc-etcd-serving-ca\") pod \"apiserver-7b6784d654-8vpmp\" (UID: \"7f841a64-d2fd-44ed-b3e6-acdc127cacfc\") " pod="openshift-oauth-apiserver/apiserver-7b6784d654-8vpmp" Oct 14 13:27:41.806043 master-2 kubenswrapper[4762]: I1014 13:27:41.805951 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7f841a64-d2fd-44ed-b3e6-acdc127cacfc-audit-dir\") pod \"apiserver-7b6784d654-8vpmp\" (UID: \"7f841a64-d2fd-44ed-b3e6-acdc127cacfc\") " pod="openshift-oauth-apiserver/apiserver-7b6784d654-8vpmp" Oct 14 13:27:41.806377 master-2 kubenswrapper[4762]: I1014 13:27:41.806327 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7f841a64-d2fd-44ed-b3e6-acdc127cacfc-audit-policies\") pod \"apiserver-7b6784d654-8vpmp\" (UID: \"7f841a64-d2fd-44ed-b3e6-acdc127cacfc\") " pod="openshift-oauth-apiserver/apiserver-7b6784d654-8vpmp" Oct 14 13:27:41.806623 master-2 kubenswrapper[4762]: I1014 13:27:41.806602 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7f841a64-d2fd-44ed-b3e6-acdc127cacfc-trusted-ca-bundle\") pod \"apiserver-7b6784d654-8vpmp\" (UID: \"7f841a64-d2fd-44ed-b3e6-acdc127cacfc\") " pod="openshift-oauth-apiserver/apiserver-7b6784d654-8vpmp" Oct 14 13:27:41.806728 master-2 kubenswrapper[4762]: I1014 13:27:41.806617 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7f841a64-d2fd-44ed-b3e6-acdc127cacfc-etcd-serving-ca\") pod \"apiserver-7b6784d654-8vpmp\" (UID: \"7f841a64-d2fd-44ed-b3e6-acdc127cacfc\") " pod="openshift-oauth-apiserver/apiserver-7b6784d654-8vpmp" Oct 14 13:27:41.808706 master-2 kubenswrapper[4762]: I1014 13:27:41.808660 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/7f841a64-d2fd-44ed-b3e6-acdc127cacfc-encryption-config\") pod \"apiserver-7b6784d654-8vpmp\" (UID: \"7f841a64-d2fd-44ed-b3e6-acdc127cacfc\") " pod="openshift-oauth-apiserver/apiserver-7b6784d654-8vpmp" Oct 14 13:27:41.809078 master-2 kubenswrapper[4762]: I1014 13:27:41.809017 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7f841a64-d2fd-44ed-b3e6-acdc127cacfc-etcd-client\") pod \"apiserver-7b6784d654-8vpmp\" (UID: \"7f841a64-d2fd-44ed-b3e6-acdc127cacfc\") " pod="openshift-oauth-apiserver/apiserver-7b6784d654-8vpmp" Oct 14 13:27:41.814246 master-2 kubenswrapper[4762]: I1014 13:27:41.814020 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f841a64-d2fd-44ed-b3e6-acdc127cacfc-serving-cert\") pod \"apiserver-7b6784d654-8vpmp\" (UID: \"7f841a64-d2fd-44ed-b3e6-acdc127cacfc\") " pod="openshift-oauth-apiserver/apiserver-7b6784d654-8vpmp" Oct 14 13:27:41.847122 master-2 kubenswrapper[4762]: I1014 13:27:41.847053 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9l96\" (UniqueName: \"kubernetes.io/projected/7f841a64-d2fd-44ed-b3e6-acdc127cacfc-kube-api-access-w9l96\") pod \"apiserver-7b6784d654-8vpmp\" (UID: \"7f841a64-d2fd-44ed-b3e6-acdc127cacfc\") " pod="openshift-oauth-apiserver/apiserver-7b6784d654-8vpmp" Oct 14 13:27:41.868932 master-2 kubenswrapper[4762]: I1014 13:27:41.868877 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7b6784d654-8vpmp" Oct 14 13:27:42.321413 master-2 kubenswrapper[4762]: I1014 13:27:42.321317 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7b6784d654-8vpmp"] Oct 14 13:27:42.331022 master-2 kubenswrapper[4762]: W1014 13:27:42.330931 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f841a64_d2fd_44ed_b3e6_acdc127cacfc.slice/crio-8c2f5a9cf4c0c74921f5c70f1c5cc87d7661d4dba0cf07c97d9949ccce83c4ef WatchSource:0}: Error finding container 8c2f5a9cf4c0c74921f5c70f1c5cc87d7661d4dba0cf07c97d9949ccce83c4ef: Status 404 returned error can't find the container with id 8c2f5a9cf4c0c74921f5c70f1c5cc87d7661d4dba0cf07c97d9949ccce83c4ef Oct 14 13:27:42.371752 master-2 kubenswrapper[4762]: I1014 13:27:42.371673 4762 patch_prober.go:28] interesting pod/apiserver-5f68d4c887-s2fvb container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 14 13:27:42.371752 master-2 kubenswrapper[4762]: [+]log ok Oct 14 13:27:42.371752 master-2 kubenswrapper[4762]: [+]etcd excluded: ok Oct 14 13:27:42.371752 master-2 kubenswrapper[4762]: [+]etcd-readiness excluded: ok Oct 14 13:27:42.371752 master-2 kubenswrapper[4762]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 14 13:27:42.371752 master-2 kubenswrapper[4762]: [+]informer-sync ok Oct 14 13:27:42.371752 master-2 kubenswrapper[4762]: [+]poststarthook/generic-apiserver-start-informers ok Oct 14 13:27:42.371752 master-2 kubenswrapper[4762]: [+]poststarthook/max-in-flight-filter ok Oct 14 13:27:42.371752 master-2 kubenswrapper[4762]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 14 13:27:42.371752 master-2 kubenswrapper[4762]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 14 13:27:42.371752 master-2 
kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok Oct 14 13:27:42.371752 master-2 kubenswrapper[4762]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Oct 14 13:27:42.371752 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectcache ok Oct 14 13:27:42.371752 master-2 kubenswrapper[4762]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 14 13:27:42.371752 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-startinformers ok Oct 14 13:27:42.371752 master-2 kubenswrapper[4762]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 14 13:27:42.371752 master-2 kubenswrapper[4762]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 14 13:27:42.371752 master-2 kubenswrapper[4762]: [-]shutdown failed: reason withheld Oct 14 13:27:42.371752 master-2 kubenswrapper[4762]: readyz check failed Oct 14 13:27:42.372903 master-2 kubenswrapper[4762]: I1014 13:27:42.371765 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" podUID="66ea399c-a47a-41dd-91c2-cceaa9eca5bc" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 14 13:27:42.927179 master-2 kubenswrapper[4762]: I1014 13:27:42.926488 4762 generic.go:334] "Generic (PLEG): container finished" podID="7f841a64-d2fd-44ed-b3e6-acdc127cacfc" containerID="1ed3c3ea6150e3c9511cbf541f3a528266d1719073b1947041df5f2121893b4c" exitCode=0 Oct 14 13:27:42.927179 master-2 kubenswrapper[4762]: I1014 13:27:42.926553 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7b6784d654-8vpmp" event={"ID":"7f841a64-d2fd-44ed-b3e6-acdc127cacfc","Type":"ContainerDied","Data":"1ed3c3ea6150e3c9511cbf541f3a528266d1719073b1947041df5f2121893b4c"} Oct 14 13:27:42.927179 master-2 kubenswrapper[4762]: I1014 13:27:42.926590 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7b6784d654-8vpmp" event={"ID":"7f841a64-d2fd-44ed-b3e6-acdc127cacfc","Type":"ContainerStarted","Data":"8c2f5a9cf4c0c74921f5c70f1c5cc87d7661d4dba0cf07c97d9949ccce83c4ef"} Oct 14 13:27:43.936990 master-2 kubenswrapper[4762]: I1014 13:27:43.936888 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7b6784d654-8vpmp" event={"ID":"7f841a64-d2fd-44ed-b3e6-acdc127cacfc","Type":"ContainerStarted","Data":"2bb8794ceab89121f46fe9de283036d021f45d731352c4495778b1ff844dc59d"} Oct 14 13:27:44.017972 master-2 kubenswrapper[4762]: I1014 13:27:44.017882 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7b6784d654-8vpmp" podStartSLOduration=61.017863081 podStartE2EDuration="1m1.017863081s" podCreationTimestamp="2025-10-14 13:26:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:27:44.015618999 +0000 UTC m=+1293.259778188" watchObservedRunningTime="2025-10-14 13:27:44.017863081 +0000 UTC m=+1293.262022240" Oct 14 13:27:45.335764 master-2 kubenswrapper[4762]: I1014 13:27:45.335654 4762 patch_prober.go:28] interesting pod/etcd-guard-master-2 container/guard namespace/openshift-etcd: Readiness probe status=failure output="Get \"https://192.168.34.12:9980/readyz\": dial tcp 192.168.34.12:9980: connect: connection refused" start-of-body= Oct 14 13:27:45.335764 master-2 kubenswrapper[4762]: I1014 13:27:45.335753 
4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-etcd/etcd-guard-master-2" podUID="1b6a1dbe-f753-4c92-8b36-47517010f2f3" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.12:9980/readyz\": dial tcp 192.168.34.12:9980: connect: connection refused" Oct 14 13:27:45.336719 master-2 kubenswrapper[4762]: I1014 13:27:45.335890 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-guard-master-2" Oct 14 13:27:45.336906 master-2 kubenswrapper[4762]: I1014 13:27:45.336832 4762 patch_prober.go:28] interesting pod/etcd-guard-master-2 container/guard namespace/openshift-etcd: Readiness probe status=failure output="Get \"https://192.168.34.12:9980/readyz\": dial tcp 192.168.34.12:9980: connect: connection refused" start-of-body= Oct 14 13:27:45.336996 master-2 kubenswrapper[4762]: I1014 13:27:45.336926 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-etcd/etcd-guard-master-2" podUID="1b6a1dbe-f753-4c92-8b36-47517010f2f3" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.12:9980/readyz\": dial tcp 192.168.34.12:9980: connect: connection refused" Oct 14 13:27:46.869487 master-2 kubenswrapper[4762]: I1014 13:27:46.869409 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7b6784d654-8vpmp" Oct 14 13:27:46.870545 master-2 kubenswrapper[4762]: I1014 13:27:46.869564 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7b6784d654-8vpmp" Oct 14 13:27:46.881423 master-2 kubenswrapper[4762]: I1014 13:27:46.881296 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7b6784d654-8vpmp" Oct 14 13:27:46.962384 master-2 kubenswrapper[4762]: I1014 13:27:46.962309 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7b6784d654-8vpmp" Oct 14 13:27:47.369093 master-2 kubenswrapper[4762]: I1014 13:27:47.368997 4762 patch_prober.go:28] interesting pod/apiserver-5f68d4c887-s2fvb container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.129.0.68:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.129.0.68:8443: connect: connection refused" start-of-body= Oct 14 13:27:47.369093 master-2 kubenswrapper[4762]: I1014 13:27:47.369074 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" podUID="66ea399c-a47a-41dd-91c2-cceaa9eca5bc" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.129.0.68:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.129.0.68:8443: connect: connection refused" Oct 14 13:27:50.334944 master-2 kubenswrapper[4762]: I1014 13:27:50.334848 4762 patch_prober.go:28] interesting pod/etcd-guard-master-2 container/guard namespace/openshift-etcd: Readiness probe status=failure output="Get \"https://192.168.34.12:9980/readyz\": dial tcp 192.168.34.12:9980: connect: connection refused" start-of-body= Oct 14 13:27:50.334944 master-2 kubenswrapper[4762]: I1014 13:27:50.334935 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-etcd/etcd-guard-master-2" podUID="1b6a1dbe-f753-4c92-8b36-47517010f2f3" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.12:9980/readyz\": dial tcp 192.168.34.12:9980: connect: connection refused" Oct 14 
13:27:52.369087 master-2 kubenswrapper[4762]: I1014 13:27:52.369013 4762 patch_prober.go:28] interesting pod/apiserver-5f68d4c887-s2fvb container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.129.0.68:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.129.0.68:8443: connect: connection refused" start-of-body= Oct 14 13:27:52.369838 master-2 kubenswrapper[4762]: I1014 13:27:52.369096 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" podUID="66ea399c-a47a-41dd-91c2-cceaa9eca5bc" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.129.0.68:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.129.0.68:8443: connect: connection refused" Oct 14 13:27:55.335113 master-2 kubenswrapper[4762]: I1014 13:27:55.335008 4762 patch_prober.go:28] interesting pod/etcd-guard-master-2 container/guard namespace/openshift-etcd: Readiness probe status=failure output="Get \"https://192.168.34.12:9980/readyz\": dial tcp 192.168.34.12:9980: connect: connection refused" start-of-body= Oct 14 13:27:55.336147 master-2 kubenswrapper[4762]: I1014 13:27:55.335133 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-etcd/etcd-guard-master-2" podUID="1b6a1dbe-f753-4c92-8b36-47517010f2f3" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.12:9980/readyz\": dial tcp 192.168.34.12:9980: connect: connection refused" Oct 14 13:27:57.368045 master-2 kubenswrapper[4762]: I1014 13:27:57.367965 4762 patch_prober.go:28] interesting pod/apiserver-5f68d4c887-s2fvb container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.129.0.68:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.129.0.68:8443: connect: connection refused" start-of-body= Oct 14 13:27:57.368045 master-2 kubenswrapper[4762]: I1014 13:27:57.368050 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" podUID="66ea399c-a47a-41dd-91c2-cceaa9eca5bc" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.129.0.68:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.129.0.68:8443: connect: connection refused" Oct 14 13:28:00.335784 master-2 kubenswrapper[4762]: I1014 13:28:00.335728 4762 patch_prober.go:28] interesting pod/etcd-guard-master-2 container/guard namespace/openshift-etcd: Readiness probe status=failure output="Get \"https://192.168.34.12:9980/readyz\": dial tcp 192.168.34.12:9980: connect: connection refused" start-of-body= Oct 14 13:28:00.336776 master-2 kubenswrapper[4762]: I1014 13:28:00.336398 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-etcd/etcd-guard-master-2" podUID="1b6a1dbe-f753-4c92-8b36-47517010f2f3" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.12:9980/readyz\": dial tcp 192.168.34.12:9980: connect: connection refused" Oct 14 13:28:01.660340 master-2 kubenswrapper[4762]: I1014 13:28:01.660296 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-2_2c4a583adfee975da84510940117e71a/etcd-rev/0.log" Oct 14 13:28:01.661566 master-2 kubenswrapper[4762]: I1014 13:28:01.661519 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-2_2c4a583adfee975da84510940117e71a/etcd-metrics/0.log" Oct 14 13:28:01.662381 master-2 kubenswrapper[4762]: I1014 
13:28:01.662336 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-2_2c4a583adfee975da84510940117e71a/etcd/0.log" Oct 14 13:28:01.662879 master-2 kubenswrapper[4762]: I1014 13:28:01.662836 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-2_2c4a583adfee975da84510940117e71a/etcdctl/0.log" Oct 14 13:28:01.665262 master-2 kubenswrapper[4762]: I1014 13:28:01.664435 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-2" Oct 14 13:28:01.811923 master-2 kubenswrapper[4762]: I1014 13:28:01.803289 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2c4a583adfee975da84510940117e71a-resource-dir\") pod \"2c4a583adfee975da84510940117e71a\" (UID: \"2c4a583adfee975da84510940117e71a\") " Oct 14 13:28:01.811923 master-2 kubenswrapper[4762]: I1014 13:28:01.803409 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2c4a583adfee975da84510940117e71a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "2c4a583adfee975da84510940117e71a" (UID: "2c4a583adfee975da84510940117e71a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:28:01.811923 master-2 kubenswrapper[4762]: I1014 13:28:01.803476 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2c4a583adfee975da84510940117e71a-cert-dir\") pod \"2c4a583adfee975da84510940117e71a\" (UID: \"2c4a583adfee975da84510940117e71a\") " Oct 14 13:28:01.811923 master-2 kubenswrapper[4762]: I1014 13:28:01.803520 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2c4a583adfee975da84510940117e71a-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "2c4a583adfee975da84510940117e71a" (UID: "2c4a583adfee975da84510940117e71a"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:28:01.811923 master-2 kubenswrapper[4762]: I1014 13:28:01.803641 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2c4a583adfee975da84510940117e71a-log-dir\") pod \"2c4a583adfee975da84510940117e71a\" (UID: \"2c4a583adfee975da84510940117e71a\") " Oct 14 13:28:01.811923 master-2 kubenswrapper[4762]: I1014 13:28:01.803721 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2c4a583adfee975da84510940117e71a-data-dir\") pod \"2c4a583adfee975da84510940117e71a\" (UID: \"2c4a583adfee975da84510940117e71a\") " Oct 14 13:28:01.811923 master-2 kubenswrapper[4762]: I1014 13:28:01.803778 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2c4a583adfee975da84510940117e71a-usr-local-bin\") pod \"2c4a583adfee975da84510940117e71a\" (UID: \"2c4a583adfee975da84510940117e71a\") " Oct 14 13:28:01.811923 master-2 kubenswrapper[4762]: I1014 13:28:01.803821 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2c4a583adfee975da84510940117e71a-static-pod-dir\") pod \"2c4a583adfee975da84510940117e71a\" (UID: \"2c4a583adfee975da84510940117e71a\") " Oct 14 13:28:01.811923 master-2 kubenswrapper[4762]: I1014 13:28:01.804696 4762 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2c4a583adfee975da84510940117e71a-cert-dir\") on node \"master-2\" DevicePath \"\"" Oct 14 13:28:01.811923 master-2 kubenswrapper[4762]: I1014 13:28:01.804733 4762 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2c4a583adfee975da84510940117e71a-resource-dir\") on node \"master-2\" DevicePath \"\"" Oct 14 13:28:01.811923 master-2 kubenswrapper[4762]: I1014 13:28:01.804825 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2c4a583adfee975da84510940117e71a-static-pod-dir" (OuterVolumeSpecName: "static-pod-dir") pod "2c4a583adfee975da84510940117e71a" (UID: "2c4a583adfee975da84510940117e71a"). InnerVolumeSpecName "static-pod-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:28:01.811923 master-2 kubenswrapper[4762]: I1014 13:28:01.804882 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2c4a583adfee975da84510940117e71a-log-dir" (OuterVolumeSpecName: "log-dir") pod "2c4a583adfee975da84510940117e71a" (UID: "2c4a583adfee975da84510940117e71a"). InnerVolumeSpecName "log-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:28:01.811923 master-2 kubenswrapper[4762]: I1014 13:28:01.804932 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2c4a583adfee975da84510940117e71a-data-dir" (OuterVolumeSpecName: "data-dir") pod "2c4a583adfee975da84510940117e71a" (UID: "2c4a583adfee975da84510940117e71a"). InnerVolumeSpecName "data-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:28:01.811923 master-2 kubenswrapper[4762]: I1014 13:28:01.804968 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2c4a583adfee975da84510940117e71a-usr-local-bin" (OuterVolumeSpecName: "usr-local-bin") pod "2c4a583adfee975da84510940117e71a" (UID: "2c4a583adfee975da84510940117e71a"). InnerVolumeSpecName "usr-local-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:28:01.905845 master-2 kubenswrapper[4762]: I1014 13:28:01.905656 4762 reconciler_common.go:293] "Volume detached for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2c4a583adfee975da84510940117e71a-log-dir\") on node \"master-2\" DevicePath \"\"" Oct 14 13:28:01.905845 master-2 kubenswrapper[4762]: I1014 13:28:01.905714 4762 reconciler_common.go:293] "Volume detached for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2c4a583adfee975da84510940117e71a-data-dir\") on node \"master-2\" DevicePath \"\"" Oct 14 13:28:01.905845 master-2 kubenswrapper[4762]: I1014 13:28:01.905788 4762 reconciler_common.go:293] "Volume detached for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2c4a583adfee975da84510940117e71a-usr-local-bin\") on node \"master-2\" DevicePath \"\"" Oct 14 13:28:01.905845 master-2 kubenswrapper[4762]: I1014 13:28:01.905801 4762 reconciler_common.go:293] "Volume detached for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2c4a583adfee975da84510940117e71a-static-pod-dir\") on node \"master-2\" DevicePath \"\"" Oct 14 13:28:02.058464 master-2 kubenswrapper[4762]: I1014 13:28:02.058389 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-2_2c4a583adfee975da84510940117e71a/etcd-rev/0.log" Oct 14 13:28:02.059977 master-2 kubenswrapper[4762]: I1014 13:28:02.059931 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-2_2c4a583adfee975da84510940117e71a/etcd-metrics/0.log" Oct 14 13:28:02.061007 master-2 kubenswrapper[4762]: I1014 13:28:02.060949 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-2_2c4a583adfee975da84510940117e71a/etcd/0.log" Oct 14 13:28:02.061660 master-2 kubenswrapper[4762]: I1014 13:28:02.061617 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-2_2c4a583adfee975da84510940117e71a/etcdctl/0.log" Oct 14 13:28:02.063206 master-2 kubenswrapper[4762]: I1014 13:28:02.063112 4762 generic.go:334] "Generic (PLEG): container finished" podID="2c4a583adfee975da84510940117e71a" containerID="1b73e44d6a874336664c315d070c89fff83d384d584c26da8d5a7492d7b1ae59" exitCode=137 Oct 14 13:28:02.063206 master-2 kubenswrapper[4762]: I1014 13:28:02.063158 4762 generic.go:334] "Generic (PLEG): container finished" podID="2c4a583adfee975da84510940117e71a" containerID="3f91268e8f5c7ecd370dc313d1dc019c99ab69ffa99e1690f5d45a66f3f3be9d" exitCode=137 Oct 14 13:28:02.063424 master-2 kubenswrapper[4762]: I1014 13:28:02.063249 4762 scope.go:117] "RemoveContainer" containerID="dea4154badc86da86e08dc8adfcdf16b04b6fb05ec19bc716137bdc9579ffcee" Oct 14 13:28:02.063424 master-2 kubenswrapper[4762]: I1014 13:28:02.063363 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-2" Oct 14 13:28:02.087652 master-2 kubenswrapper[4762]: I1014 13:28:02.087602 4762 scope.go:117] "RemoveContainer" containerID="40d36d63e9ceeaccf501e24cb94ae4d10c179c30f3ce44780e8040f4e3011e5f" Oct 14 13:28:02.110801 master-2 kubenswrapper[4762]: I1014 13:28:02.110737 4762 scope.go:117] "RemoveContainer" containerID="d26868986c4341ffa9a0a23ef35041cefee7b967d7d9ee87bf0aeed31b68e690" Oct 14 13:28:02.131560 master-2 kubenswrapper[4762]: I1014 13:28:02.131499 4762 scope.go:117] "RemoveContainer" containerID="1b73e44d6a874336664c315d070c89fff83d384d584c26da8d5a7492d7b1ae59" Oct 14 13:28:02.150807 master-2 kubenswrapper[4762]: I1014 13:28:02.150762 4762 scope.go:117] "RemoveContainer" containerID="3f91268e8f5c7ecd370dc313d1dc019c99ab69ffa99e1690f5d45a66f3f3be9d" Oct 14 13:28:02.164780 master-2 kubenswrapper[4762]: I1014 13:28:02.164742 4762 scope.go:117] "RemoveContainer" containerID="ebea60d23e3f11da81fb82ddf488e702d9df05ce3b6ae87609e07e9419bc7b0d" Oct 14 13:28:02.190192 master-2 kubenswrapper[4762]: I1014 13:28:02.190110 4762 scope.go:117] "RemoveContainer" containerID="0c5867abb4fd355dbcd792fe60324abb72b9b84b35d2143b43d94cff5e9f798a" Oct 14 13:28:02.219101 master-2 kubenswrapper[4762]: I1014 13:28:02.219051 4762 scope.go:117] "RemoveContainer" containerID="26c9c1090f4933547e0a56852a2e81f38612c658f6488ebbef87141243935ad6" Oct 14 13:28:02.252868 master-2 kubenswrapper[4762]: I1014 13:28:02.252816 4762 scope.go:117] "RemoveContainer" containerID="dea4154badc86da86e08dc8adfcdf16b04b6fb05ec19bc716137bdc9579ffcee" Oct 14 13:28:02.253421 master-2 kubenswrapper[4762]: E1014 13:28:02.253376 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dea4154badc86da86e08dc8adfcdf16b04b6fb05ec19bc716137bdc9579ffcee\": container with ID starting with dea4154badc86da86e08dc8adfcdf16b04b6fb05ec19bc716137bdc9579ffcee not found: ID does not exist" containerID="dea4154badc86da86e08dc8adfcdf16b04b6fb05ec19bc716137bdc9579ffcee" Oct 14 13:28:02.253515 master-2 kubenswrapper[4762]: I1014 13:28:02.253420 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dea4154badc86da86e08dc8adfcdf16b04b6fb05ec19bc716137bdc9579ffcee"} err="failed to get container status \"dea4154badc86da86e08dc8adfcdf16b04b6fb05ec19bc716137bdc9579ffcee\": rpc error: code = NotFound desc = could not find container \"dea4154badc86da86e08dc8adfcdf16b04b6fb05ec19bc716137bdc9579ffcee\": container with ID starting with dea4154badc86da86e08dc8adfcdf16b04b6fb05ec19bc716137bdc9579ffcee not found: ID does not exist" Oct 14 13:28:02.253515 master-2 kubenswrapper[4762]: I1014 13:28:02.253456 4762 scope.go:117] "RemoveContainer" containerID="40d36d63e9ceeaccf501e24cb94ae4d10c179c30f3ce44780e8040f4e3011e5f" Oct 14 13:28:02.253896 master-2 kubenswrapper[4762]: E1014 13:28:02.253836 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40d36d63e9ceeaccf501e24cb94ae4d10c179c30f3ce44780e8040f4e3011e5f\": container with ID starting with 40d36d63e9ceeaccf501e24cb94ae4d10c179c30f3ce44780e8040f4e3011e5f not found: ID does not exist" containerID="40d36d63e9ceeaccf501e24cb94ae4d10c179c30f3ce44780e8040f4e3011e5f" Oct 14 13:28:02.254021 master-2 kubenswrapper[4762]: I1014 13:28:02.253886 4762 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"40d36d63e9ceeaccf501e24cb94ae4d10c179c30f3ce44780e8040f4e3011e5f"} err="failed to get container status \"40d36d63e9ceeaccf501e24cb94ae4d10c179c30f3ce44780e8040f4e3011e5f\": rpc error: code = NotFound desc = could not find container \"40d36d63e9ceeaccf501e24cb94ae4d10c179c30f3ce44780e8040f4e3011e5f\": container with ID starting with 40d36d63e9ceeaccf501e24cb94ae4d10c179c30f3ce44780e8040f4e3011e5f not found: ID does not exist" Oct 14 13:28:02.254021 master-2 kubenswrapper[4762]: I1014 13:28:02.253923 4762 scope.go:117] "RemoveContainer" containerID="d26868986c4341ffa9a0a23ef35041cefee7b967d7d9ee87bf0aeed31b68e690" Oct 14 13:28:02.254478 master-2 kubenswrapper[4762]: E1014 13:28:02.254441 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d26868986c4341ffa9a0a23ef35041cefee7b967d7d9ee87bf0aeed31b68e690\": container with ID starting with d26868986c4341ffa9a0a23ef35041cefee7b967d7d9ee87bf0aeed31b68e690 not found: ID does not exist" containerID="d26868986c4341ffa9a0a23ef35041cefee7b967d7d9ee87bf0aeed31b68e690" Oct 14 13:28:02.254610 master-2 kubenswrapper[4762]: I1014 13:28:02.254476 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d26868986c4341ffa9a0a23ef35041cefee7b967d7d9ee87bf0aeed31b68e690"} err="failed to get container status \"d26868986c4341ffa9a0a23ef35041cefee7b967d7d9ee87bf0aeed31b68e690\": rpc error: code = NotFound desc = could not find container \"d26868986c4341ffa9a0a23ef35041cefee7b967d7d9ee87bf0aeed31b68e690\": container with ID starting with d26868986c4341ffa9a0a23ef35041cefee7b967d7d9ee87bf0aeed31b68e690 not found: ID does not exist" Oct 14 13:28:02.254610 master-2 kubenswrapper[4762]: I1014 13:28:02.254497 4762 scope.go:117] "RemoveContainer" containerID="1b73e44d6a874336664c315d070c89fff83d384d584c26da8d5a7492d7b1ae59" Oct 14 13:28:02.254849 master-2 kubenswrapper[4762]: E1014 13:28:02.254745 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b73e44d6a874336664c315d070c89fff83d384d584c26da8d5a7492d7b1ae59\": container with ID starting with 1b73e44d6a874336664c315d070c89fff83d384d584c26da8d5a7492d7b1ae59 not found: ID does not exist" containerID="1b73e44d6a874336664c315d070c89fff83d384d584c26da8d5a7492d7b1ae59" Oct 14 13:28:02.254849 master-2 kubenswrapper[4762]: I1014 13:28:02.254770 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b73e44d6a874336664c315d070c89fff83d384d584c26da8d5a7492d7b1ae59"} err="failed to get container status \"1b73e44d6a874336664c315d070c89fff83d384d584c26da8d5a7492d7b1ae59\": rpc error: code = NotFound desc = could not find container \"1b73e44d6a874336664c315d070c89fff83d384d584c26da8d5a7492d7b1ae59\": container with ID starting with 1b73e44d6a874336664c315d070c89fff83d384d584c26da8d5a7492d7b1ae59 not found: ID does not exist" Oct 14 13:28:02.254849 master-2 kubenswrapper[4762]: I1014 13:28:02.254785 4762 scope.go:117] "RemoveContainer" containerID="3f91268e8f5c7ecd370dc313d1dc019c99ab69ffa99e1690f5d45a66f3f3be9d" Oct 14 13:28:02.255361 master-2 kubenswrapper[4762]: E1014 13:28:02.255026 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f91268e8f5c7ecd370dc313d1dc019c99ab69ffa99e1690f5d45a66f3f3be9d\": container with ID starting with 
3f91268e8f5c7ecd370dc313d1dc019c99ab69ffa99e1690f5d45a66f3f3be9d not found: ID does not exist" containerID="3f91268e8f5c7ecd370dc313d1dc019c99ab69ffa99e1690f5d45a66f3f3be9d" Oct 14 13:28:02.255361 master-2 kubenswrapper[4762]: I1014 13:28:02.255057 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f91268e8f5c7ecd370dc313d1dc019c99ab69ffa99e1690f5d45a66f3f3be9d"} err="failed to get container status \"3f91268e8f5c7ecd370dc313d1dc019c99ab69ffa99e1690f5d45a66f3f3be9d\": rpc error: code = NotFound desc = could not find container \"3f91268e8f5c7ecd370dc313d1dc019c99ab69ffa99e1690f5d45a66f3f3be9d\": container with ID starting with 3f91268e8f5c7ecd370dc313d1dc019c99ab69ffa99e1690f5d45a66f3f3be9d not found: ID does not exist" Oct 14 13:28:02.255361 master-2 kubenswrapper[4762]: I1014 13:28:02.255081 4762 scope.go:117] "RemoveContainer" containerID="ebea60d23e3f11da81fb82ddf488e702d9df05ce3b6ae87609e07e9419bc7b0d" Oct 14 13:28:02.255641 master-2 kubenswrapper[4762]: E1014 13:28:02.255370 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebea60d23e3f11da81fb82ddf488e702d9df05ce3b6ae87609e07e9419bc7b0d\": container with ID starting with ebea60d23e3f11da81fb82ddf488e702d9df05ce3b6ae87609e07e9419bc7b0d not found: ID does not exist" containerID="ebea60d23e3f11da81fb82ddf488e702d9df05ce3b6ae87609e07e9419bc7b0d" Oct 14 13:28:02.255641 master-2 kubenswrapper[4762]: I1014 13:28:02.255401 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebea60d23e3f11da81fb82ddf488e702d9df05ce3b6ae87609e07e9419bc7b0d"} err="failed to get container status \"ebea60d23e3f11da81fb82ddf488e702d9df05ce3b6ae87609e07e9419bc7b0d\": rpc error: code = NotFound desc = could not find container \"ebea60d23e3f11da81fb82ddf488e702d9df05ce3b6ae87609e07e9419bc7b0d\": container with ID starting with ebea60d23e3f11da81fb82ddf488e702d9df05ce3b6ae87609e07e9419bc7b0d not found: ID does not exist" Oct 14 13:28:02.255641 master-2 kubenswrapper[4762]: I1014 13:28:02.255426 4762 scope.go:117] "RemoveContainer" containerID="0c5867abb4fd355dbcd792fe60324abb72b9b84b35d2143b43d94cff5e9f798a" Oct 14 13:28:02.255898 master-2 kubenswrapper[4762]: E1014 13:28:02.255645 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c5867abb4fd355dbcd792fe60324abb72b9b84b35d2143b43d94cff5e9f798a\": container with ID starting with 0c5867abb4fd355dbcd792fe60324abb72b9b84b35d2143b43d94cff5e9f798a not found: ID does not exist" containerID="0c5867abb4fd355dbcd792fe60324abb72b9b84b35d2143b43d94cff5e9f798a" Oct 14 13:28:02.255898 master-2 kubenswrapper[4762]: I1014 13:28:02.255682 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c5867abb4fd355dbcd792fe60324abb72b9b84b35d2143b43d94cff5e9f798a"} err="failed to get container status \"0c5867abb4fd355dbcd792fe60324abb72b9b84b35d2143b43d94cff5e9f798a\": rpc error: code = NotFound desc = could not find container \"0c5867abb4fd355dbcd792fe60324abb72b9b84b35d2143b43d94cff5e9f798a\": container with ID starting with 0c5867abb4fd355dbcd792fe60324abb72b9b84b35d2143b43d94cff5e9f798a not found: ID does not exist" Oct 14 13:28:02.255898 master-2 kubenswrapper[4762]: I1014 13:28:02.255699 4762 scope.go:117] "RemoveContainer" containerID="26c9c1090f4933547e0a56852a2e81f38612c658f6488ebbef87141243935ad6" Oct 14 13:28:02.256225 master-2 
kubenswrapper[4762]: E1014 13:28:02.255903 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26c9c1090f4933547e0a56852a2e81f38612c658f6488ebbef87141243935ad6\": container with ID starting with 26c9c1090f4933547e0a56852a2e81f38612c658f6488ebbef87141243935ad6 not found: ID does not exist" containerID="26c9c1090f4933547e0a56852a2e81f38612c658f6488ebbef87141243935ad6" Oct 14 13:28:02.256225 master-2 kubenswrapper[4762]: I1014 13:28:02.255936 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26c9c1090f4933547e0a56852a2e81f38612c658f6488ebbef87141243935ad6"} err="failed to get container status \"26c9c1090f4933547e0a56852a2e81f38612c658f6488ebbef87141243935ad6\": rpc error: code = NotFound desc = could not find container \"26c9c1090f4933547e0a56852a2e81f38612c658f6488ebbef87141243935ad6\": container with ID starting with 26c9c1090f4933547e0a56852a2e81f38612c658f6488ebbef87141243935ad6 not found: ID does not exist" Oct 14 13:28:02.256225 master-2 kubenswrapper[4762]: I1014 13:28:02.255953 4762 scope.go:117] "RemoveContainer" containerID="dea4154badc86da86e08dc8adfcdf16b04b6fb05ec19bc716137bdc9579ffcee" Oct 14 13:28:02.256225 master-2 kubenswrapper[4762]: I1014 13:28:02.256212 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dea4154badc86da86e08dc8adfcdf16b04b6fb05ec19bc716137bdc9579ffcee"} err="failed to get container status \"dea4154badc86da86e08dc8adfcdf16b04b6fb05ec19bc716137bdc9579ffcee\": rpc error: code = NotFound desc = could not find container \"dea4154badc86da86e08dc8adfcdf16b04b6fb05ec19bc716137bdc9579ffcee\": container with ID starting with dea4154badc86da86e08dc8adfcdf16b04b6fb05ec19bc716137bdc9579ffcee not found: ID does not exist" Oct 14 13:28:02.256225 master-2 kubenswrapper[4762]: I1014 13:28:02.256232 4762 scope.go:117] "RemoveContainer" containerID="40d36d63e9ceeaccf501e24cb94ae4d10c179c30f3ce44780e8040f4e3011e5f" Oct 14 13:28:02.256641 master-2 kubenswrapper[4762]: I1014 13:28:02.256485 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40d36d63e9ceeaccf501e24cb94ae4d10c179c30f3ce44780e8040f4e3011e5f"} err="failed to get container status \"40d36d63e9ceeaccf501e24cb94ae4d10c179c30f3ce44780e8040f4e3011e5f\": rpc error: code = NotFound desc = could not find container \"40d36d63e9ceeaccf501e24cb94ae4d10c179c30f3ce44780e8040f4e3011e5f\": container with ID starting with 40d36d63e9ceeaccf501e24cb94ae4d10c179c30f3ce44780e8040f4e3011e5f not found: ID does not exist" Oct 14 13:28:02.256641 master-2 kubenswrapper[4762]: I1014 13:28:02.256512 4762 scope.go:117] "RemoveContainer" containerID="d26868986c4341ffa9a0a23ef35041cefee7b967d7d9ee87bf0aeed31b68e690" Oct 14 13:28:02.256827 master-2 kubenswrapper[4762]: I1014 13:28:02.256730 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d26868986c4341ffa9a0a23ef35041cefee7b967d7d9ee87bf0aeed31b68e690"} err="failed to get container status \"d26868986c4341ffa9a0a23ef35041cefee7b967d7d9ee87bf0aeed31b68e690\": rpc error: code = NotFound desc = could not find container \"d26868986c4341ffa9a0a23ef35041cefee7b967d7d9ee87bf0aeed31b68e690\": container with ID starting with d26868986c4341ffa9a0a23ef35041cefee7b967d7d9ee87bf0aeed31b68e690 not found: ID does not exist" Oct 14 13:28:02.256827 master-2 kubenswrapper[4762]: I1014 13:28:02.256765 4762 scope.go:117] "RemoveContainer" 
containerID="1b73e44d6a874336664c315d070c89fff83d384d584c26da8d5a7492d7b1ae59" Oct 14 13:28:02.257002 master-2 kubenswrapper[4762]: I1014 13:28:02.256983 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b73e44d6a874336664c315d070c89fff83d384d584c26da8d5a7492d7b1ae59"} err="failed to get container status \"1b73e44d6a874336664c315d070c89fff83d384d584c26da8d5a7492d7b1ae59\": rpc error: code = NotFound desc = could not find container \"1b73e44d6a874336664c315d070c89fff83d384d584c26da8d5a7492d7b1ae59\": container with ID starting with 1b73e44d6a874336664c315d070c89fff83d384d584c26da8d5a7492d7b1ae59 not found: ID does not exist" Oct 14 13:28:02.257100 master-2 kubenswrapper[4762]: I1014 13:28:02.257004 4762 scope.go:117] "RemoveContainer" containerID="3f91268e8f5c7ecd370dc313d1dc019c99ab69ffa99e1690f5d45a66f3f3be9d" Oct 14 13:28:02.257294 master-2 kubenswrapper[4762]: I1014 13:28:02.257240 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f91268e8f5c7ecd370dc313d1dc019c99ab69ffa99e1690f5d45a66f3f3be9d"} err="failed to get container status \"3f91268e8f5c7ecd370dc313d1dc019c99ab69ffa99e1690f5d45a66f3f3be9d\": rpc error: code = NotFound desc = could not find container \"3f91268e8f5c7ecd370dc313d1dc019c99ab69ffa99e1690f5d45a66f3f3be9d\": container with ID starting with 3f91268e8f5c7ecd370dc313d1dc019c99ab69ffa99e1690f5d45a66f3f3be9d not found: ID does not exist" Oct 14 13:28:02.257294 master-2 kubenswrapper[4762]: I1014 13:28:02.257268 4762 scope.go:117] "RemoveContainer" containerID="ebea60d23e3f11da81fb82ddf488e702d9df05ce3b6ae87609e07e9419bc7b0d" Oct 14 13:28:02.257491 master-2 kubenswrapper[4762]: I1014 13:28:02.257464 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebea60d23e3f11da81fb82ddf488e702d9df05ce3b6ae87609e07e9419bc7b0d"} err="failed to get container status \"ebea60d23e3f11da81fb82ddf488e702d9df05ce3b6ae87609e07e9419bc7b0d\": rpc error: code = NotFound desc = could not find container \"ebea60d23e3f11da81fb82ddf488e702d9df05ce3b6ae87609e07e9419bc7b0d\": container with ID starting with ebea60d23e3f11da81fb82ddf488e702d9df05ce3b6ae87609e07e9419bc7b0d not found: ID does not exist" Oct 14 13:28:02.257491 master-2 kubenswrapper[4762]: I1014 13:28:02.257484 4762 scope.go:117] "RemoveContainer" containerID="0c5867abb4fd355dbcd792fe60324abb72b9b84b35d2143b43d94cff5e9f798a" Oct 14 13:28:02.257712 master-2 kubenswrapper[4762]: I1014 13:28:02.257674 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c5867abb4fd355dbcd792fe60324abb72b9b84b35d2143b43d94cff5e9f798a"} err="failed to get container status \"0c5867abb4fd355dbcd792fe60324abb72b9b84b35d2143b43d94cff5e9f798a\": rpc error: code = NotFound desc = could not find container \"0c5867abb4fd355dbcd792fe60324abb72b9b84b35d2143b43d94cff5e9f798a\": container with ID starting with 0c5867abb4fd355dbcd792fe60324abb72b9b84b35d2143b43d94cff5e9f798a not found: ID does not exist" Oct 14 13:28:02.257712 master-2 kubenswrapper[4762]: I1014 13:28:02.257701 4762 scope.go:117] "RemoveContainer" containerID="26c9c1090f4933547e0a56852a2e81f38612c658f6488ebbef87141243935ad6" Oct 14 13:28:02.257908 master-2 kubenswrapper[4762]: I1014 13:28:02.257887 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26c9c1090f4933547e0a56852a2e81f38612c658f6488ebbef87141243935ad6"} err="failed to get container status 
\"26c9c1090f4933547e0a56852a2e81f38612c658f6488ebbef87141243935ad6\": rpc error: code = NotFound desc = could not find container \"26c9c1090f4933547e0a56852a2e81f38612c658f6488ebbef87141243935ad6\": container with ID starting with 26c9c1090f4933547e0a56852a2e81f38612c658f6488ebbef87141243935ad6 not found: ID does not exist" Oct 14 13:28:02.368712 master-2 kubenswrapper[4762]: I1014 13:28:02.368605 4762 patch_prober.go:28] interesting pod/apiserver-5f68d4c887-s2fvb container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.129.0.68:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.129.0.68:8443: connect: connection refused" start-of-body= Oct 14 13:28:02.368923 master-2 kubenswrapper[4762]: I1014 13:28:02.368721 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" podUID="66ea399c-a47a-41dd-91c2-cceaa9eca5bc" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.129.0.68:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.129.0.68:8443: connect: connection refused" Oct 14 13:28:03.559223 master-2 kubenswrapper[4762]: I1014 13:28:03.559127 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c4a583adfee975da84510940117e71a" path="/var/lib/kubelet/pods/2c4a583adfee975da84510940117e71a/volumes" Oct 14 13:28:03.629251 master-2 kubenswrapper[4762]: I1014 13:28:03.629143 4762 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-etcd/etcd-master-2" oldPodUID="2c4a583adfee975da84510940117e71a" podUID="cd7826f9db5842f000a071fd58a1ae79" Oct 14 13:28:05.334862 master-2 kubenswrapper[4762]: I1014 13:28:05.334765 4762 patch_prober.go:28] interesting pod/etcd-guard-master-2 container/guard namespace/openshift-etcd: Readiness probe status=failure output="Get \"https://192.168.34.12:9980/readyz\": dial tcp 192.168.34.12:9980: connect: connection refused" start-of-body= Oct 14 13:28:05.334862 master-2 kubenswrapper[4762]: I1014 13:28:05.334843 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-etcd/etcd-guard-master-2" podUID="1b6a1dbe-f753-4c92-8b36-47517010f2f3" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.12:9980/readyz\": dial tcp 192.168.34.12:9980: connect: connection refused" Oct 14 13:28:07.368478 master-2 kubenswrapper[4762]: I1014 13:28:07.368357 4762 patch_prober.go:28] interesting pod/apiserver-5f68d4c887-s2fvb container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.129.0.68:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.129.0.68:8443: connect: connection refused" start-of-body= Oct 14 13:28:07.369563 master-2 kubenswrapper[4762]: I1014 13:28:07.368497 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" podUID="66ea399c-a47a-41dd-91c2-cceaa9eca5bc" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.129.0.68:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.129.0.68:8443: connect: connection refused" Oct 14 13:28:10.335656 master-2 kubenswrapper[4762]: I1014 13:28:10.335590 4762 patch_prober.go:28] interesting pod/etcd-guard-master-2 container/guard namespace/openshift-etcd: Readiness probe status=failure output="Get \"https://192.168.34.12:9980/readyz\": dial tcp 192.168.34.12:9980: connect: connection refused" start-of-body= Oct 14 
13:28:10.336185 master-2 kubenswrapper[4762]: I1014 13:28:10.335671 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-etcd/etcd-guard-master-2" podUID="1b6a1dbe-f753-4c92-8b36-47517010f2f3" containerName="guard" probeResult="failure" output="Get \"https://192.168.34.12:9980/readyz\": dial tcp 192.168.34.12:9980: connect: connection refused" Oct 14 13:28:10.548502 master-2 kubenswrapper[4762]: I1014 13:28:10.548380 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-2" Oct 14 13:28:10.572757 master-2 kubenswrapper[4762]: I1014 13:28:10.572675 4762 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-2" podUID="c1390037-f3e1-4ffb-8ca7-47bc5d79c0a9" Oct 14 13:28:10.572757 master-2 kubenswrapper[4762]: I1014 13:28:10.572731 4762 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-2" podUID="c1390037-f3e1-4ffb-8ca7-47bc5d79c0a9" Oct 14 13:28:10.610814 master-2 kubenswrapper[4762]: I1014 13:28:10.610633 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-etcd/etcd-master-2"] Oct 14 13:28:10.675885 master-2 kubenswrapper[4762]: I1014 13:28:10.675827 4762 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-etcd/etcd-master-2" Oct 14 13:28:10.682972 master-2 kubenswrapper[4762]: I1014 13:28:10.682900 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-etcd/etcd-master-2"] Oct 14 13:28:10.736784 master-2 kubenswrapper[4762]: I1014 13:28:10.736712 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-2" Oct 14 13:28:10.747933 master-2 kubenswrapper[4762]: I1014 13:28:10.747851 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-master-2"] Oct 14 13:28:10.770879 master-2 kubenswrapper[4762]: W1014 13:28:10.770773 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd7826f9db5842f000a071fd58a1ae79.slice/crio-8465cfdf8b680b1aa2c8dddd02894113a5843ba89f925f4e5faebd9b63b56eca WatchSource:0}: Error finding container 8465cfdf8b680b1aa2c8dddd02894113a5843ba89f925f4e5faebd9b63b56eca: Status 404 returned error can't find the container with id 8465cfdf8b680b1aa2c8dddd02894113a5843ba89f925f4e5faebd9b63b56eca Oct 14 13:28:11.125298 master-2 kubenswrapper[4762]: I1014 13:28:11.125179 4762 generic.go:334] "Generic (PLEG): container finished" podID="cd7826f9db5842f000a071fd58a1ae79" containerID="e8d751d4c0a4fbc6a5f497bf0cdc20bd382f6dc825a80fb256fde95873c977fe" exitCode=0 Oct 14 13:28:11.125298 master-2 kubenswrapper[4762]: I1014 13:28:11.125266 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-2" event={"ID":"cd7826f9db5842f000a071fd58a1ae79","Type":"ContainerDied","Data":"e8d751d4c0a4fbc6a5f497bf0cdc20bd382f6dc825a80fb256fde95873c977fe"} Oct 14 13:28:11.125298 master-2 kubenswrapper[4762]: I1014 13:28:11.125298 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-2" event={"ID":"cd7826f9db5842f000a071fd58a1ae79","Type":"ContainerStarted","Data":"8465cfdf8b680b1aa2c8dddd02894113a5843ba89f925f4e5faebd9b63b56eca"} Oct 14 13:28:11.443724 master-2 kubenswrapper[4762]: E1014 13:28:11.443658 4762 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd7826f9db5842f000a071fd58a1ae79.slice/crio-d7bd62486d1f56c8294660241b4d73d6e8bbfab2cac4c567a4e8884e5dd5086d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd7826f9db5842f000a071fd58a1ae79.slice/crio-conmon-d7bd62486d1f56c8294660241b4d73d6e8bbfab2cac4c567a4e8884e5dd5086d.scope\": RecentStats: unable to find data in memory cache]" Oct 14 13:28:12.132658 master-2 kubenswrapper[4762]: I1014 13:28:12.132591 4762 generic.go:334] "Generic (PLEG): container finished" podID="cd7826f9db5842f000a071fd58a1ae79" containerID="d7bd62486d1f56c8294660241b4d73d6e8bbfab2cac4c567a4e8884e5dd5086d" exitCode=0 Oct 14 13:28:12.132658 master-2 kubenswrapper[4762]: I1014 13:28:12.132630 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-2" event={"ID":"cd7826f9db5842f000a071fd58a1ae79","Type":"ContainerDied","Data":"d7bd62486d1f56c8294660241b4d73d6e8bbfab2cac4c567a4e8884e5dd5086d"} Oct 14 13:28:12.368418 master-2 kubenswrapper[4762]: I1014 13:28:12.368339 4762 patch_prober.go:28] interesting pod/apiserver-5f68d4c887-s2fvb container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.129.0.68:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.129.0.68:8443: connect: connection refused" start-of-body= Oct 14 13:28:12.368617 master-2 kubenswrapper[4762]: I1014 13:28:12.368417 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" podUID="66ea399c-a47a-41dd-91c2-cceaa9eca5bc" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.129.0.68:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.129.0.68:8443: connect: connection refused" Oct 14 13:28:13.143373 master-2 kubenswrapper[4762]: I1014 13:28:13.143207 4762 generic.go:334] "Generic (PLEG): container finished" podID="cd7826f9db5842f000a071fd58a1ae79" containerID="2a9fbbf6ae68570979dbbfaf8fcc3749f050cd93c011dcc300ba4ab8c289647b" exitCode=0 Oct 14 13:28:13.143373 master-2 kubenswrapper[4762]: I1014 13:28:13.143274 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-2" event={"ID":"cd7826f9db5842f000a071fd58a1ae79","Type":"ContainerDied","Data":"2a9fbbf6ae68570979dbbfaf8fcc3749f050cd93c011dcc300ba4ab8c289647b"} Oct 14 13:28:14.155737 master-2 kubenswrapper[4762]: I1014 13:28:14.155679 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-2" event={"ID":"cd7826f9db5842f000a071fd58a1ae79","Type":"ContainerStarted","Data":"22698de173097b3c5507419feb9c65a355496d2e7e366a96f84f9206f6636cc5"} Oct 14 13:28:14.155737 master-2 kubenswrapper[4762]: I1014 13:28:14.155733 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-2" event={"ID":"cd7826f9db5842f000a071fd58a1ae79","Type":"ContainerStarted","Data":"c2d6123e9ac95ad7ca0b6be6b59ef76d36625e33f55148a92508d274beab443c"} Oct 14 13:28:14.156332 master-2 kubenswrapper[4762]: I1014 13:28:14.155747 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-2" event={"ID":"cd7826f9db5842f000a071fd58a1ae79","Type":"ContainerStarted","Data":"85535e85cc54ced38ff4a1cc23de3e1c7638e4d88021ff9ec938dbc9e5cdd38c"} Oct 14 13:28:15.167372 master-2 kubenswrapper[4762]: I1014 13:28:15.167306 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-2" 
event={"ID":"cd7826f9db5842f000a071fd58a1ae79","Type":"ContainerStarted","Data":"b9f6860bbf9bb1fc2b6e342cb9eca0f7eea8f26046658e91bf4eb69f72c4dcb4"} Oct 14 13:28:15.167372 master-2 kubenswrapper[4762]: I1014 13:28:15.167369 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-2" event={"ID":"cd7826f9db5842f000a071fd58a1ae79","Type":"ContainerStarted","Data":"817fbc5bbd2fa6aad2133cb2e2cd4eec5a1f0cf95e5b92ce73c34926bc5509b4"} Oct 14 13:28:15.263741 master-2 kubenswrapper[4762]: I1014 13:28:15.263638 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-master-2" podStartSLOduration=5.263606056 podStartE2EDuration="5.263606056s" podCreationTimestamp="2025-10-14 13:28:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:28:15.261020454 +0000 UTC m=+1324.505179673" watchObservedRunningTime="2025-10-14 13:28:15.263606056 +0000 UTC m=+1324.507765255" Oct 14 13:28:15.367059 master-2 kubenswrapper[4762]: I1014 13:28:15.367000 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-guard-master-2" Oct 14 13:28:15.669061 master-2 kubenswrapper[4762]: I1014 13:28:15.669002 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4771432b41461e44875d05f444712a00e992d5a0d93af947c146bd94b725qlb"] Oct 14 13:28:15.670543 master-2 kubenswrapper[4762]: I1014 13:28:15.670504 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4771432b41461e44875d05f444712a00e992d5a0d93af947c146bd94b725qlb" Oct 14 13:28:15.673866 master-2 kubenswrapper[4762]: I1014 13:28:15.673772 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-jvnh4" Oct 14 13:28:15.699803 master-2 kubenswrapper[4762]: I1014 13:28:15.699744 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4771432b41461e44875d05f444712a00e992d5a0d93af947c146bd94b725qlb"] Oct 14 13:28:15.738233 master-2 kubenswrapper[4762]: I1014 13:28:15.737703 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-master-2" Oct 14 13:28:15.805533 master-2 kubenswrapper[4762]: I1014 13:28:15.805453 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dda5b0f1-8fbe-457b-8d4e-103be985f44d-bundle\") pod \"4771432b41461e44875d05f444712a00e992d5a0d93af947c146bd94b725qlb\" (UID: \"dda5b0f1-8fbe-457b-8d4e-103be985f44d\") " pod="openshift-marketplace/4771432b41461e44875d05f444712a00e992d5a0d93af947c146bd94b725qlb" Oct 14 13:28:15.805754 master-2 kubenswrapper[4762]: I1014 13:28:15.805543 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dda5b0f1-8fbe-457b-8d4e-103be985f44d-util\") pod \"4771432b41461e44875d05f444712a00e992d5a0d93af947c146bd94b725qlb\" (UID: \"dda5b0f1-8fbe-457b-8d4e-103be985f44d\") " pod="openshift-marketplace/4771432b41461e44875d05f444712a00e992d5a0d93af947c146bd94b725qlb" Oct 14 13:28:15.805754 master-2 kubenswrapper[4762]: I1014 13:28:15.805622 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wr2pn\" (UniqueName: \"kubernetes.io/projected/dda5b0f1-8fbe-457b-8d4e-103be985f44d-kube-api-access-wr2pn\") pod 
\"4771432b41461e44875d05f444712a00e992d5a0d93af947c146bd94b725qlb\" (UID: \"dda5b0f1-8fbe-457b-8d4e-103be985f44d\") " pod="openshift-marketplace/4771432b41461e44875d05f444712a00e992d5a0d93af947c146bd94b725qlb" Oct 14 13:28:15.906954 master-2 kubenswrapper[4762]: I1014 13:28:15.906894 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dda5b0f1-8fbe-457b-8d4e-103be985f44d-bundle\") pod \"4771432b41461e44875d05f444712a00e992d5a0d93af947c146bd94b725qlb\" (UID: \"dda5b0f1-8fbe-457b-8d4e-103be985f44d\") " pod="openshift-marketplace/4771432b41461e44875d05f444712a00e992d5a0d93af947c146bd94b725qlb" Oct 14 13:28:15.907264 master-2 kubenswrapper[4762]: I1014 13:28:15.906986 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dda5b0f1-8fbe-457b-8d4e-103be985f44d-util\") pod \"4771432b41461e44875d05f444712a00e992d5a0d93af947c146bd94b725qlb\" (UID: \"dda5b0f1-8fbe-457b-8d4e-103be985f44d\") " pod="openshift-marketplace/4771432b41461e44875d05f444712a00e992d5a0d93af947c146bd94b725qlb" Oct 14 13:28:15.907264 master-2 kubenswrapper[4762]: I1014 13:28:15.907063 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wr2pn\" (UniqueName: \"kubernetes.io/projected/dda5b0f1-8fbe-457b-8d4e-103be985f44d-kube-api-access-wr2pn\") pod \"4771432b41461e44875d05f444712a00e992d5a0d93af947c146bd94b725qlb\" (UID: \"dda5b0f1-8fbe-457b-8d4e-103be985f44d\") " pod="openshift-marketplace/4771432b41461e44875d05f444712a00e992d5a0d93af947c146bd94b725qlb" Oct 14 13:28:15.907614 master-2 kubenswrapper[4762]: I1014 13:28:15.907564 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dda5b0f1-8fbe-457b-8d4e-103be985f44d-bundle\") pod \"4771432b41461e44875d05f444712a00e992d5a0d93af947c146bd94b725qlb\" (UID: \"dda5b0f1-8fbe-457b-8d4e-103be985f44d\") " pod="openshift-marketplace/4771432b41461e44875d05f444712a00e992d5a0d93af947c146bd94b725qlb" Oct 14 13:28:15.907854 master-2 kubenswrapper[4762]: I1014 13:28:15.907626 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dda5b0f1-8fbe-457b-8d4e-103be985f44d-util\") pod \"4771432b41461e44875d05f444712a00e992d5a0d93af947c146bd94b725qlb\" (UID: \"dda5b0f1-8fbe-457b-8d4e-103be985f44d\") " pod="openshift-marketplace/4771432b41461e44875d05f444712a00e992d5a0d93af947c146bd94b725qlb" Oct 14 13:28:15.930689 master-2 kubenswrapper[4762]: I1014 13:28:15.930571 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wr2pn\" (UniqueName: \"kubernetes.io/projected/dda5b0f1-8fbe-457b-8d4e-103be985f44d-kube-api-access-wr2pn\") pod \"4771432b41461e44875d05f444712a00e992d5a0d93af947c146bd94b725qlb\" (UID: \"dda5b0f1-8fbe-457b-8d4e-103be985f44d\") " pod="openshift-marketplace/4771432b41461e44875d05f444712a00e992d5a0d93af947c146bd94b725qlb" Oct 14 13:28:15.988330 master-2 kubenswrapper[4762]: I1014 13:28:15.988221 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4771432b41461e44875d05f444712a00e992d5a0d93af947c146bd94b725qlb" Oct 14 13:28:16.458126 master-2 kubenswrapper[4762]: I1014 13:28:16.458067 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4771432b41461e44875d05f444712a00e992d5a0d93af947c146bd94b725qlb"] Oct 14 13:28:16.463174 master-2 kubenswrapper[4762]: W1014 13:28:16.463093 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddda5b0f1_8fbe_457b_8d4e_103be985f44d.slice/crio-a04437c31abc00e4a2d55a517c1094b5a52ce60dcfba34fd6871bee6b87eadae WatchSource:0}: Error finding container a04437c31abc00e4a2d55a517c1094b5a52ce60dcfba34fd6871bee6b87eadae: Status 404 returned error can't find the container with id a04437c31abc00e4a2d55a517c1094b5a52ce60dcfba34fd6871bee6b87eadae Oct 14 13:28:17.185404 master-2 kubenswrapper[4762]: I1014 13:28:17.185288 4762 generic.go:334] "Generic (PLEG): container finished" podID="dda5b0f1-8fbe-457b-8d4e-103be985f44d" containerID="e187dd01c1f742fd1b0a70927d4f82f8ad464a7dc6053bd4f73eb4935f516605" exitCode=0 Oct 14 13:28:17.185730 master-2 kubenswrapper[4762]: I1014 13:28:17.185424 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4771432b41461e44875d05f444712a00e992d5a0d93af947c146bd94b725qlb" event={"ID":"dda5b0f1-8fbe-457b-8d4e-103be985f44d","Type":"ContainerDied","Data":"e187dd01c1f742fd1b0a70927d4f82f8ad464a7dc6053bd4f73eb4935f516605"} Oct 14 13:28:17.185730 master-2 kubenswrapper[4762]: I1014 13:28:17.185481 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4771432b41461e44875d05f444712a00e992d5a0d93af947c146bd94b725qlb" event={"ID":"dda5b0f1-8fbe-457b-8d4e-103be985f44d","Type":"ContainerStarted","Data":"a04437c31abc00e4a2d55a517c1094b5a52ce60dcfba34fd6871bee6b87eadae"} Oct 14 13:28:17.187997 master-2 kubenswrapper[4762]: I1014 13:28:17.187623 4762 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 14 13:28:17.368133 master-2 kubenswrapper[4762]: I1014 13:28:17.368056 4762 patch_prober.go:28] interesting pod/apiserver-5f68d4c887-s2fvb container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.129.0.68:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.129.0.68:8443: connect: connection refused" start-of-body= Oct 14 13:28:17.368133 master-2 kubenswrapper[4762]: I1014 13:28:17.368133 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" podUID="66ea399c-a47a-41dd-91c2-cceaa9eca5bc" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.129.0.68:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.129.0.68:8443: connect: connection refused" Oct 14 13:28:19.203666 master-2 kubenswrapper[4762]: I1014 13:28:19.203604 4762 generic.go:334] "Generic (PLEG): container finished" podID="dda5b0f1-8fbe-457b-8d4e-103be985f44d" containerID="a3c193cc9b6efa1dcb98f2d1a8606c618d49b5eb636f79af7fa6bd5dcdedadc8" exitCode=0 Oct 14 13:28:19.204344 master-2 kubenswrapper[4762]: I1014 13:28:19.203664 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4771432b41461e44875d05f444712a00e992d5a0d93af947c146bd94b725qlb" event={"ID":"dda5b0f1-8fbe-457b-8d4e-103be985f44d","Type":"ContainerDied","Data":"a3c193cc9b6efa1dcb98f2d1a8606c618d49b5eb636f79af7fa6bd5dcdedadc8"} Oct 14 
13:28:20.214125 master-2 kubenswrapper[4762]: I1014 13:28:20.214052 4762 generic.go:334] "Generic (PLEG): container finished" podID="dda5b0f1-8fbe-457b-8d4e-103be985f44d" containerID="e3c7988846ac86571c7fc59dc4776c1e4a9fec6b75a2185753e693a305f9bb96" exitCode=0 Oct 14 13:28:20.215064 master-2 kubenswrapper[4762]: I1014 13:28:20.214230 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4771432b41461e44875d05f444712a00e992d5a0d93af947c146bd94b725qlb" event={"ID":"dda5b0f1-8fbe-457b-8d4e-103be985f44d","Type":"ContainerDied","Data":"e3c7988846ac86571c7fc59dc4776c1e4a9fec6b75a2185753e693a305f9bb96"} Oct 14 13:28:20.737424 master-2 kubenswrapper[4762]: I1014 13:28:20.737372 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-master-2" Oct 14 13:28:21.612081 master-2 kubenswrapper[4762]: I1014 13:28:21.612045 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4771432b41461e44875d05f444712a00e992d5a0d93af947c146bd94b725qlb" Oct 14 13:28:21.708591 master-2 kubenswrapper[4762]: I1014 13:28:21.708522 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dda5b0f1-8fbe-457b-8d4e-103be985f44d-util\") pod \"dda5b0f1-8fbe-457b-8d4e-103be985f44d\" (UID: \"dda5b0f1-8fbe-457b-8d4e-103be985f44d\") " Oct 14 13:28:21.708797 master-2 kubenswrapper[4762]: I1014 13:28:21.708704 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dda5b0f1-8fbe-457b-8d4e-103be985f44d-bundle\") pod \"dda5b0f1-8fbe-457b-8d4e-103be985f44d\" (UID: \"dda5b0f1-8fbe-457b-8d4e-103be985f44d\") " Oct 14 13:28:21.708797 master-2 kubenswrapper[4762]: I1014 13:28:21.708755 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wr2pn\" (UniqueName: \"kubernetes.io/projected/dda5b0f1-8fbe-457b-8d4e-103be985f44d-kube-api-access-wr2pn\") pod \"dda5b0f1-8fbe-457b-8d4e-103be985f44d\" (UID: \"dda5b0f1-8fbe-457b-8d4e-103be985f44d\") " Oct 14 13:28:21.709839 master-2 kubenswrapper[4762]: I1014 13:28:21.709778 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dda5b0f1-8fbe-457b-8d4e-103be985f44d-bundle" (OuterVolumeSpecName: "bundle") pod "dda5b0f1-8fbe-457b-8d4e-103be985f44d" (UID: "dda5b0f1-8fbe-457b-8d4e-103be985f44d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:28:21.711995 master-2 kubenswrapper[4762]: I1014 13:28:21.711891 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dda5b0f1-8fbe-457b-8d4e-103be985f44d-kube-api-access-wr2pn" (OuterVolumeSpecName: "kube-api-access-wr2pn") pod "dda5b0f1-8fbe-457b-8d4e-103be985f44d" (UID: "dda5b0f1-8fbe-457b-8d4e-103be985f44d"). InnerVolumeSpecName "kube-api-access-wr2pn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:28:21.740686 master-2 kubenswrapper[4762]: I1014 13:28:21.740578 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dda5b0f1-8fbe-457b-8d4e-103be985f44d-util" (OuterVolumeSpecName: "util") pod "dda5b0f1-8fbe-457b-8d4e-103be985f44d" (UID: "dda5b0f1-8fbe-457b-8d4e-103be985f44d"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:28:21.810661 master-2 kubenswrapper[4762]: I1014 13:28:21.810530 4762 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dda5b0f1-8fbe-457b-8d4e-103be985f44d-util\") on node \"master-2\" DevicePath \"\"" Oct 14 13:28:21.810661 master-2 kubenswrapper[4762]: I1014 13:28:21.810577 4762 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dda5b0f1-8fbe-457b-8d4e-103be985f44d-bundle\") on node \"master-2\" DevicePath \"\"" Oct 14 13:28:21.810661 master-2 kubenswrapper[4762]: I1014 13:28:21.810595 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wr2pn\" (UniqueName: \"kubernetes.io/projected/dda5b0f1-8fbe-457b-8d4e-103be985f44d-kube-api-access-wr2pn\") on node \"master-2\" DevicePath \"\"" Oct 14 13:28:22.234663 master-2 kubenswrapper[4762]: I1014 13:28:22.234613 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4771432b41461e44875d05f444712a00e992d5a0d93af947c146bd94b725qlb" event={"ID":"dda5b0f1-8fbe-457b-8d4e-103be985f44d","Type":"ContainerDied","Data":"a04437c31abc00e4a2d55a517c1094b5a52ce60dcfba34fd6871bee6b87eadae"} Oct 14 13:28:22.234663 master-2 kubenswrapper[4762]: I1014 13:28:22.234661 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a04437c31abc00e4a2d55a517c1094b5a52ce60dcfba34fd6871bee6b87eadae" Oct 14 13:28:22.234970 master-2 kubenswrapper[4762]: I1014 13:28:22.234956 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4771432b41461e44875d05f444712a00e992d5a0d93af947c146bd94b725qlb" Oct 14 13:28:22.367845 master-2 kubenswrapper[4762]: I1014 13:28:22.367750 4762 patch_prober.go:28] interesting pod/apiserver-5f68d4c887-s2fvb container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.129.0.68:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.129.0.68:8443: connect: connection refused" start-of-body= Oct 14 13:28:22.368082 master-2 kubenswrapper[4762]: I1014 13:28:22.367848 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" podUID="66ea399c-a47a-41dd-91c2-cceaa9eca5bc" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.129.0.68:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.129.0.68:8443: connect: connection refused" Oct 14 13:28:27.368607 master-2 kubenswrapper[4762]: I1014 13:28:27.368502 4762 patch_prober.go:28] interesting pod/apiserver-5f68d4c887-s2fvb container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.129.0.68:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.129.0.68:8443: connect: connection refused" start-of-body= Oct 14 13:28:27.368607 master-2 kubenswrapper[4762]: I1014 13:28:27.368590 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" podUID="66ea399c-a47a-41dd-91c2-cceaa9eca5bc" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.129.0.68:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.129.0.68:8443: connect: connection refused" Oct 14 13:28:30.756311 master-2 kubenswrapper[4762]: I1014 13:28:30.756235 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-etcd/etcd-master-2" Oct 14 13:28:30.771432 master-2 kubenswrapper[4762]: I1014 13:28:30.771373 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-master-2" Oct 14 13:28:32.369664 master-2 kubenswrapper[4762]: I1014 13:28:32.369557 4762 patch_prober.go:28] interesting pod/apiserver-5f68d4c887-s2fvb container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.129.0.68:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.129.0.68:8443: connect: connection refused" start-of-body= Oct 14 13:28:32.369664 master-2 kubenswrapper[4762]: I1014 13:28:32.369636 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" podUID="66ea399c-a47a-41dd-91c2-cceaa9eca5bc" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.129.0.68:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.129.0.68:8443: connect: connection refused" Oct 14 13:28:37.367839 master-2 kubenswrapper[4762]: I1014 13:28:37.367790 4762 patch_prober.go:28] interesting pod/apiserver-5f68d4c887-s2fvb container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.129.0.68:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.129.0.68:8443: connect: connection refused" start-of-body= Oct 14 13:28:37.368539 master-2 kubenswrapper[4762]: I1014 13:28:37.367869 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" podUID="66ea399c-a47a-41dd-91c2-cceaa9eca5bc" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.129.0.68:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.129.0.68:8443: connect: connection refused" Oct 14 13:28:42.368125 master-2 kubenswrapper[4762]: I1014 13:28:42.367996 4762 patch_prober.go:28] interesting pod/apiserver-5f68d4c887-s2fvb container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.129.0.68:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.129.0.68:8443: connect: connection refused" start-of-body= Oct 14 13:28:42.368125 master-2 kubenswrapper[4762]: I1014 13:28:42.368074 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" podUID="66ea399c-a47a-41dd-91c2-cceaa9eca5bc" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.129.0.68:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.129.0.68:8443: connect: connection refused" Oct 14 13:28:42.391357 master-2 kubenswrapper[4762]: I1014 13:28:42.391313 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb699lfcx"] Oct 14 13:28:42.391827 master-2 kubenswrapper[4762]: E1014 13:28:42.391531 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dda5b0f1-8fbe-457b-8d4e-103be985f44d" containerName="extract" Oct 14 13:28:42.391827 master-2 kubenswrapper[4762]: I1014 13:28:42.391546 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="dda5b0f1-8fbe-457b-8d4e-103be985f44d" containerName="extract" Oct 14 13:28:42.391827 master-2 kubenswrapper[4762]: E1014 13:28:42.391562 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dda5b0f1-8fbe-457b-8d4e-103be985f44d" containerName="pull" Oct 14 13:28:42.391827 master-2 
kubenswrapper[4762]: I1014 13:28:42.391571 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="dda5b0f1-8fbe-457b-8d4e-103be985f44d" containerName="pull" Oct 14 13:28:42.391827 master-2 kubenswrapper[4762]: E1014 13:28:42.391595 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dda5b0f1-8fbe-457b-8d4e-103be985f44d" containerName="util" Oct 14 13:28:42.391827 master-2 kubenswrapper[4762]: I1014 13:28:42.391603 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="dda5b0f1-8fbe-457b-8d4e-103be985f44d" containerName="util" Oct 14 13:28:42.391827 master-2 kubenswrapper[4762]: I1014 13:28:42.391727 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="dda5b0f1-8fbe-457b-8d4e-103be985f44d" containerName="extract" Oct 14 13:28:42.392743 master-2 kubenswrapper[4762]: I1014 13:28:42.392712 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb699lfcx" Oct 14 13:28:42.395556 master-2 kubenswrapper[4762]: I1014 13:28:42.395479 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-jvnh4" Oct 14 13:28:42.426297 master-2 kubenswrapper[4762]: I1014 13:28:42.426207 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb699lfcx"] Oct 14 13:28:42.457389 master-2 kubenswrapper[4762]: I1014 13:28:42.457258 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rw9fk\" (UniqueName: \"kubernetes.io/projected/a2c2a707-93be-43ba-9124-1829a7a845ea-kube-api-access-rw9fk\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb699lfcx\" (UID: \"a2c2a707-93be-43ba-9124-1829a7a845ea\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb699lfcx" Oct 14 13:28:42.457389 master-2 kubenswrapper[4762]: I1014 13:28:42.457366 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a2c2a707-93be-43ba-9124-1829a7a845ea-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb699lfcx\" (UID: \"a2c2a707-93be-43ba-9124-1829a7a845ea\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb699lfcx" Oct 14 13:28:42.457792 master-2 kubenswrapper[4762]: I1014 13:28:42.457481 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a2c2a707-93be-43ba-9124-1829a7a845ea-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb699lfcx\" (UID: \"a2c2a707-93be-43ba-9124-1829a7a845ea\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb699lfcx" Oct 14 13:28:42.559480 master-2 kubenswrapper[4762]: I1014 13:28:42.559393 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rw9fk\" (UniqueName: \"kubernetes.io/projected/a2c2a707-93be-43ba-9124-1829a7a845ea-kube-api-access-rw9fk\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb699lfcx\" (UID: \"a2c2a707-93be-43ba-9124-1829a7a845ea\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb699lfcx" Oct 14 13:28:42.559480 master-2 kubenswrapper[4762]: I1014 13:28:42.559470 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a2c2a707-93be-43ba-9124-1829a7a845ea-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb699lfcx\" (UID: \"a2c2a707-93be-43ba-9124-1829a7a845ea\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb699lfcx" Oct 14 13:28:42.559919 master-2 kubenswrapper[4762]: I1014 13:28:42.559541 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a2c2a707-93be-43ba-9124-1829a7a845ea-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb699lfcx\" (UID: \"a2c2a707-93be-43ba-9124-1829a7a845ea\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb699lfcx" Oct 14 13:28:42.560581 master-2 kubenswrapper[4762]: I1014 13:28:42.560303 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a2c2a707-93be-43ba-9124-1829a7a845ea-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb699lfcx\" (UID: \"a2c2a707-93be-43ba-9124-1829a7a845ea\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb699lfcx" Oct 14 13:28:42.560581 master-2 kubenswrapper[4762]: I1014 13:28:42.560356 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a2c2a707-93be-43ba-9124-1829a7a845ea-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb699lfcx\" (UID: \"a2c2a707-93be-43ba-9124-1829a7a845ea\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb699lfcx" Oct 14 13:28:42.593556 master-2 kubenswrapper[4762]: I1014 13:28:42.593491 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rw9fk\" (UniqueName: \"kubernetes.io/projected/a2c2a707-93be-43ba-9124-1829a7a845ea-kube-api-access-rw9fk\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb699lfcx\" (UID: \"a2c2a707-93be-43ba-9124-1829a7a845ea\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb699lfcx" Oct 14 13:28:42.714938 master-2 kubenswrapper[4762]: I1014 13:28:42.714861 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb699lfcx" Oct 14 13:28:42.823713 master-2 kubenswrapper[4762]: I1014 13:28:42.823639 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/revision-pruner-10-master-2"] Oct 14 13:28:42.827557 master-2 kubenswrapper[4762]: I1014 13:28:42.824935 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/revision-pruner-10-master-2" Oct 14 13:28:42.829050 master-2 kubenswrapper[4762]: I1014 13:28:42.828975 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd"/"installer-sa-dockercfg-xbs2c" Oct 14 13:28:42.843458 master-2 kubenswrapper[4762]: I1014 13:28:42.843404 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/revision-pruner-10-master-2"] Oct 14 13:28:42.865664 master-2 kubenswrapper[4762]: I1014 13:28:42.865442 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a0b3b3d9-4cd5-4cf5-93b6-9480f7636efe-kubelet-dir\") pod \"revision-pruner-10-master-2\" (UID: \"a0b3b3d9-4cd5-4cf5-93b6-9480f7636efe\") " pod="openshift-etcd/revision-pruner-10-master-2" Oct 14 13:28:42.865664 master-2 kubenswrapper[4762]: I1014 13:28:42.865522 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a0b3b3d9-4cd5-4cf5-93b6-9480f7636efe-kube-api-access\") pod \"revision-pruner-10-master-2\" (UID: \"a0b3b3d9-4cd5-4cf5-93b6-9480f7636efe\") " pod="openshift-etcd/revision-pruner-10-master-2" Oct 14 13:28:42.967659 master-2 kubenswrapper[4762]: I1014 13:28:42.967504 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a0b3b3d9-4cd5-4cf5-93b6-9480f7636efe-kubelet-dir\") pod \"revision-pruner-10-master-2\" (UID: \"a0b3b3d9-4cd5-4cf5-93b6-9480f7636efe\") " pod="openshift-etcd/revision-pruner-10-master-2" Oct 14 13:28:42.967659 master-2 kubenswrapper[4762]: I1014 13:28:42.967575 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a0b3b3d9-4cd5-4cf5-93b6-9480f7636efe-kube-api-access\") pod \"revision-pruner-10-master-2\" (UID: \"a0b3b3d9-4cd5-4cf5-93b6-9480f7636efe\") " pod="openshift-etcd/revision-pruner-10-master-2" Oct 14 13:28:42.967958 master-2 kubenswrapper[4762]: I1014 13:28:42.967689 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a0b3b3d9-4cd5-4cf5-93b6-9480f7636efe-kubelet-dir\") pod \"revision-pruner-10-master-2\" (UID: \"a0b3b3d9-4cd5-4cf5-93b6-9480f7636efe\") " pod="openshift-etcd/revision-pruner-10-master-2" Oct 14 13:28:42.988601 master-2 kubenswrapper[4762]: I1014 13:28:42.988542 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a0b3b3d9-4cd5-4cf5-93b6-9480f7636efe-kube-api-access\") pod \"revision-pruner-10-master-2\" (UID: \"a0b3b3d9-4cd5-4cf5-93b6-9480f7636efe\") " pod="openshift-etcd/revision-pruner-10-master-2" Oct 14 13:28:43.175292 master-2 kubenswrapper[4762]: I1014 13:28:43.175203 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/revision-pruner-10-master-2" Oct 14 13:28:43.218761 master-2 kubenswrapper[4762]: I1014 13:28:43.218621 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb699lfcx"] Oct 14 13:28:43.230793 master-2 kubenswrapper[4762]: W1014 13:28:43.230716 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2c2a707_93be_43ba_9124_1829a7a845ea.slice/crio-b03e6e42694bd496c13f64b32e10e1762e57f41ba3e6ffbfdcf0dcd758b6565e WatchSource:0}: Error finding container b03e6e42694bd496c13f64b32e10e1762e57f41ba3e6ffbfdcf0dcd758b6565e: Status 404 returned error can't find the container with id b03e6e42694bd496c13f64b32e10e1762e57f41ba3e6ffbfdcf0dcd758b6565e Oct 14 13:28:43.401221 master-2 kubenswrapper[4762]: I1014 13:28:43.401113 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb699lfcx" event={"ID":"a2c2a707-93be-43ba-9124-1829a7a845ea","Type":"ContainerStarted","Data":"3ee0796b0fd29fde045d236da9c3a5f42817969141400e79b9de5d5313ccb6ac"} Oct 14 13:28:43.401221 master-2 kubenswrapper[4762]: I1014 13:28:43.401181 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb699lfcx" event={"ID":"a2c2a707-93be-43ba-9124-1829a7a845ea","Type":"ContainerStarted","Data":"b03e6e42694bd496c13f64b32e10e1762e57f41ba3e6ffbfdcf0dcd758b6565e"} Oct 14 13:28:43.655826 master-2 kubenswrapper[4762]: I1014 13:28:43.655748 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/revision-pruner-10-master-2"] Oct 14 13:28:43.666416 master-2 kubenswrapper[4762]: W1014 13:28:43.666346 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda0b3b3d9_4cd5_4cf5_93b6_9480f7636efe.slice/crio-36d46c3ebc88d31b4ce039fd2b67951baf8fc9d4bd0d91c397a59820fc1ea59b WatchSource:0}: Error finding container 36d46c3ebc88d31b4ce039fd2b67951baf8fc9d4bd0d91c397a59820fc1ea59b: Status 404 returned error can't find the container with id 36d46c3ebc88d31b4ce039fd2b67951baf8fc9d4bd0d91c397a59820fc1ea59b Oct 14 13:28:44.373522 master-2 kubenswrapper[4762]: I1014 13:28:44.373375 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2ppm5m"] Oct 14 13:28:44.375558 master-2 kubenswrapper[4762]: I1014 13:28:44.375489 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2ppm5m" Oct 14 13:28:44.381674 master-2 kubenswrapper[4762]: I1014 13:28:44.381571 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2ppm5m"] Oct 14 13:28:44.411228 master-2 kubenswrapper[4762]: I1014 13:28:44.411181 4762 generic.go:334] "Generic (PLEG): container finished" podID="a2c2a707-93be-43ba-9124-1829a7a845ea" containerID="3ee0796b0fd29fde045d236da9c3a5f42817969141400e79b9de5d5313ccb6ac" exitCode=0 Oct 14 13:28:44.411938 master-2 kubenswrapper[4762]: I1014 13:28:44.411882 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb699lfcx" event={"ID":"a2c2a707-93be-43ba-9124-1829a7a845ea","Type":"ContainerDied","Data":"3ee0796b0fd29fde045d236da9c3a5f42817969141400e79b9de5d5313ccb6ac"} Oct 14 13:28:44.414571 master-2 kubenswrapper[4762]: I1014 13:28:44.414498 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/revision-pruner-10-master-2" event={"ID":"a0b3b3d9-4cd5-4cf5-93b6-9480f7636efe","Type":"ContainerStarted","Data":"c4a1b4f742f14c2b102947e47293a95a074527d0b6d084135b0c4e7e611eaf9a"} Oct 14 13:28:44.414799 master-2 kubenswrapper[4762]: I1014 13:28:44.414773 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/revision-pruner-10-master-2" event={"ID":"a0b3b3d9-4cd5-4cf5-93b6-9480f7636efe","Type":"ContainerStarted","Data":"36d46c3ebc88d31b4ce039fd2b67951baf8fc9d4bd0d91c397a59820fc1ea59b"} Oct 14 13:28:44.489004 master-2 kubenswrapper[4762]: I1014 13:28:44.488922 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f09ac26b-8035-43e5-9f90-667cae1a28ad-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2ppm5m\" (UID: \"f09ac26b-8035-43e5-9f90-667cae1a28ad\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2ppm5m" Oct 14 13:28:44.489300 master-2 kubenswrapper[4762]: I1014 13:28:44.489111 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pqnd\" (UniqueName: \"kubernetes.io/projected/f09ac26b-8035-43e5-9f90-667cae1a28ad-kube-api-access-9pqnd\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2ppm5m\" (UID: \"f09ac26b-8035-43e5-9f90-667cae1a28ad\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2ppm5m" Oct 14 13:28:44.489300 master-2 kubenswrapper[4762]: I1014 13:28:44.489263 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f09ac26b-8035-43e5-9f90-667cae1a28ad-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2ppm5m\" (UID: \"f09ac26b-8035-43e5-9f90-667cae1a28ad\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2ppm5m" Oct 14 13:28:44.496872 master-2 kubenswrapper[4762]: I1014 13:28:44.496677 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/revision-pruner-10-master-2" podStartSLOduration=2.4966492049999998 podStartE2EDuration="2.496649205s" podCreationTimestamp="2025-10-14 13:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-14 13:28:44.49300444 +0000 UTC m=+1353.737163599" watchObservedRunningTime="2025-10-14 13:28:44.496649205 +0000 UTC m=+1353.740808374" Oct 14 13:28:44.593293 master-2 kubenswrapper[4762]: I1014 13:28:44.591721 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f09ac26b-8035-43e5-9f90-667cae1a28ad-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2ppm5m\" (UID: \"f09ac26b-8035-43e5-9f90-667cae1a28ad\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2ppm5m" Oct 14 13:28:44.593293 master-2 kubenswrapper[4762]: I1014 13:28:44.591837 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pqnd\" (UniqueName: \"kubernetes.io/projected/f09ac26b-8035-43e5-9f90-667cae1a28ad-kube-api-access-9pqnd\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2ppm5m\" (UID: \"f09ac26b-8035-43e5-9f90-667cae1a28ad\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2ppm5m" Oct 14 13:28:44.593293 master-2 kubenswrapper[4762]: I1014 13:28:44.591888 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f09ac26b-8035-43e5-9f90-667cae1a28ad-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2ppm5m\" (UID: \"f09ac26b-8035-43e5-9f90-667cae1a28ad\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2ppm5m" Oct 14 13:28:44.594054 master-2 kubenswrapper[4762]: I1014 13:28:44.593905 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f09ac26b-8035-43e5-9f90-667cae1a28ad-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2ppm5m\" (UID: \"f09ac26b-8035-43e5-9f90-667cae1a28ad\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2ppm5m" Oct 14 13:28:44.594054 master-2 kubenswrapper[4762]: I1014 13:28:44.593995 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f09ac26b-8035-43e5-9f90-667cae1a28ad-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2ppm5m\" (UID: \"f09ac26b-8035-43e5-9f90-667cae1a28ad\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2ppm5m" Oct 14 13:28:44.614390 master-2 kubenswrapper[4762]: I1014 13:28:44.614338 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pqnd\" (UniqueName: \"kubernetes.io/projected/f09ac26b-8035-43e5-9f90-667cae1a28ad-kube-api-access-9pqnd\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2ppm5m\" (UID: \"f09ac26b-8035-43e5-9f90-667cae1a28ad\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2ppm5m" Oct 14 13:28:44.710234 master-2 kubenswrapper[4762]: I1014 13:28:44.709801 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2ppm5m" Oct 14 13:28:45.048784 master-2 kubenswrapper[4762]: I1014 13:28:45.048714 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-etcd/installer-3-master-2"] Oct 14 13:28:45.053634 master-2 kubenswrapper[4762]: I1014 13:28:45.053591 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-etcd/installer-3-master-2"] Oct 14 13:28:45.116240 master-2 kubenswrapper[4762]: I1014 13:28:45.115780 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2ppm5m"] Oct 14 13:28:45.160356 master-2 kubenswrapper[4762]: I1014 13:28:45.158681 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" Oct 14 13:28:45.161715 master-2 kubenswrapper[4762]: I1014 13:28:45.161660 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cqvvdb"] Oct 14 13:28:45.161962 master-2 kubenswrapper[4762]: E1014 13:28:45.161936 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66ea399c-a47a-41dd-91c2-cceaa9eca5bc" containerName="openshift-apiserver-check-endpoints" Oct 14 13:28:45.161962 master-2 kubenswrapper[4762]: I1014 13:28:45.161956 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="66ea399c-a47a-41dd-91c2-cceaa9eca5bc" containerName="openshift-apiserver-check-endpoints" Oct 14 13:28:45.162034 master-2 kubenswrapper[4762]: E1014 13:28:45.161970 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66ea399c-a47a-41dd-91c2-cceaa9eca5bc" containerName="openshift-apiserver" Oct 14 13:28:45.162034 master-2 kubenswrapper[4762]: I1014 13:28:45.161978 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="66ea399c-a47a-41dd-91c2-cceaa9eca5bc" containerName="openshift-apiserver" Oct 14 13:28:45.162034 master-2 kubenswrapper[4762]: E1014 13:28:45.161990 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66ea399c-a47a-41dd-91c2-cceaa9eca5bc" containerName="fix-audit-permissions" Oct 14 13:28:45.162034 master-2 kubenswrapper[4762]: I1014 13:28:45.161999 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="66ea399c-a47a-41dd-91c2-cceaa9eca5bc" containerName="fix-audit-permissions" Oct 14 13:28:45.162509 master-2 kubenswrapper[4762]: I1014 13:28:45.162128 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="66ea399c-a47a-41dd-91c2-cceaa9eca5bc" containerName="openshift-apiserver" Oct 14 13:28:45.162509 master-2 kubenswrapper[4762]: I1014 13:28:45.162138 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="66ea399c-a47a-41dd-91c2-cceaa9eca5bc" containerName="openshift-apiserver-check-endpoints" Oct 14 13:28:45.163175 master-2 kubenswrapper[4762]: I1014 13:28:45.163128 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cqvvdb" Oct 14 13:28:45.176939 master-2 kubenswrapper[4762]: I1014 13:28:45.176892 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cqvvdb"] Oct 14 13:28:45.202478 master-2 kubenswrapper[4762]: I1014 13:28:45.201556 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-encryption-config\") pod \"66ea399c-a47a-41dd-91c2-cceaa9eca5bc\" (UID: \"66ea399c-a47a-41dd-91c2-cceaa9eca5bc\") " Oct 14 13:28:45.202478 master-2 kubenswrapper[4762]: I1014 13:28:45.201607 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-serving-cert\") pod \"66ea399c-a47a-41dd-91c2-cceaa9eca5bc\" (UID: \"66ea399c-a47a-41dd-91c2-cceaa9eca5bc\") " Oct 14 13:28:45.202478 master-2 kubenswrapper[4762]: I1014 13:28:45.201693 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fp4xc\" (UniqueName: \"kubernetes.io/projected/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-kube-api-access-fp4xc\") pod \"66ea399c-a47a-41dd-91c2-cceaa9eca5bc\" (UID: \"66ea399c-a47a-41dd-91c2-cceaa9eca5bc\") " Oct 14 13:28:45.202478 master-2 kubenswrapper[4762]: I1014 13:28:45.201718 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-etcd-client\") pod \"66ea399c-a47a-41dd-91c2-cceaa9eca5bc\" (UID: \"66ea399c-a47a-41dd-91c2-cceaa9eca5bc\") " Oct 14 13:28:45.202478 master-2 kubenswrapper[4762]: I1014 13:28:45.201736 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-config\") pod \"66ea399c-a47a-41dd-91c2-cceaa9eca5bc\" (UID: \"66ea399c-a47a-41dd-91c2-cceaa9eca5bc\") " Oct 14 13:28:45.202478 master-2 kubenswrapper[4762]: I1014 13:28:45.201766 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-trusted-ca-bundle\") pod \"66ea399c-a47a-41dd-91c2-cceaa9eca5bc\" (UID: \"66ea399c-a47a-41dd-91c2-cceaa9eca5bc\") " Oct 14 13:28:45.202478 master-2 kubenswrapper[4762]: I1014 13:28:45.201790 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-audit-dir\") pod \"66ea399c-a47a-41dd-91c2-cceaa9eca5bc\" (UID: \"66ea399c-a47a-41dd-91c2-cceaa9eca5bc\") " Oct 14 13:28:45.202478 master-2 kubenswrapper[4762]: I1014 13:28:45.201810 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-node-pullsecrets\") pod \"66ea399c-a47a-41dd-91c2-cceaa9eca5bc\" (UID: \"66ea399c-a47a-41dd-91c2-cceaa9eca5bc\") " Oct 14 13:28:45.202478 master-2 kubenswrapper[4762]: I1014 13:28:45.201850 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-audit\") pod \"66ea399c-a47a-41dd-91c2-cceaa9eca5bc\" 
(UID: \"66ea399c-a47a-41dd-91c2-cceaa9eca5bc\") " Oct 14 13:28:45.202478 master-2 kubenswrapper[4762]: I1014 13:28:45.201872 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-etcd-serving-ca\") pod \"66ea399c-a47a-41dd-91c2-cceaa9eca5bc\" (UID: \"66ea399c-a47a-41dd-91c2-cceaa9eca5bc\") " Oct 14 13:28:45.202478 master-2 kubenswrapper[4762]: I1014 13:28:45.201927 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-image-import-ca\") pod \"66ea399c-a47a-41dd-91c2-cceaa9eca5bc\" (UID: \"66ea399c-a47a-41dd-91c2-cceaa9eca5bc\") " Oct 14 13:28:45.202478 master-2 kubenswrapper[4762]: I1014 13:28:45.202010 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "66ea399c-a47a-41dd-91c2-cceaa9eca5bc" (UID: "66ea399c-a47a-41dd-91c2-cceaa9eca5bc"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:28:45.202478 master-2 kubenswrapper[4762]: I1014 13:28:45.202177 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/255dc145-a476-412e-b14d-bda046fb2528-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cqvvdb\" (UID: \"255dc145-a476-412e-b14d-bda046fb2528\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cqvvdb" Oct 14 13:28:45.202478 master-2 kubenswrapper[4762]: I1014 13:28:45.202216 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/255dc145-a476-412e-b14d-bda046fb2528-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cqvvdb\" (UID: \"255dc145-a476-412e-b14d-bda046fb2528\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cqvvdb" Oct 14 13:28:45.202478 master-2 kubenswrapper[4762]: I1014 13:28:45.202241 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2wqp\" (UniqueName: \"kubernetes.io/projected/255dc145-a476-412e-b14d-bda046fb2528-kube-api-access-v2wqp\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cqvvdb\" (UID: \"255dc145-a476-412e-b14d-bda046fb2528\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cqvvdb" Oct 14 13:28:45.202478 master-2 kubenswrapper[4762]: I1014 13:28:45.202306 4762 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-node-pullsecrets\") on node \"master-2\" DevicePath \"\"" Oct 14 13:28:45.203314 master-2 kubenswrapper[4762]: I1014 13:28:45.202688 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "66ea399c-a47a-41dd-91c2-cceaa9eca5bc" (UID: "66ea399c-a47a-41dd-91c2-cceaa9eca5bc"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:28:45.203314 master-2 kubenswrapper[4762]: I1014 13:28:45.202761 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-audit" (OuterVolumeSpecName: "audit") pod "66ea399c-a47a-41dd-91c2-cceaa9eca5bc" (UID: "66ea399c-a47a-41dd-91c2-cceaa9eca5bc"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:28:45.203314 master-2 kubenswrapper[4762]: I1014 13:28:45.202781 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "66ea399c-a47a-41dd-91c2-cceaa9eca5bc" (UID: "66ea399c-a47a-41dd-91c2-cceaa9eca5bc"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:28:45.210506 master-2 kubenswrapper[4762]: I1014 13:28:45.203657 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "66ea399c-a47a-41dd-91c2-cceaa9eca5bc" (UID: "66ea399c-a47a-41dd-91c2-cceaa9eca5bc"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:28:45.210506 master-2 kubenswrapper[4762]: I1014 13:28:45.203995 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "66ea399c-a47a-41dd-91c2-cceaa9eca5bc" (UID: "66ea399c-a47a-41dd-91c2-cceaa9eca5bc"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:28:45.210506 master-2 kubenswrapper[4762]: I1014 13:28:45.208064 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-config" (OuterVolumeSpecName: "config") pod "66ea399c-a47a-41dd-91c2-cceaa9eca5bc" (UID: "66ea399c-a47a-41dd-91c2-cceaa9eca5bc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:28:45.210506 master-2 kubenswrapper[4762]: I1014 13:28:45.208477 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "66ea399c-a47a-41dd-91c2-cceaa9eca5bc" (UID: "66ea399c-a47a-41dd-91c2-cceaa9eca5bc"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:28:45.210506 master-2 kubenswrapper[4762]: I1014 13:28:45.208617 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "66ea399c-a47a-41dd-91c2-cceaa9eca5bc" (UID: "66ea399c-a47a-41dd-91c2-cceaa9eca5bc"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:28:45.210506 master-2 kubenswrapper[4762]: I1014 13:28:45.209285 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-kube-api-access-fp4xc" (OuterVolumeSpecName: "kube-api-access-fp4xc") pod "66ea399c-a47a-41dd-91c2-cceaa9eca5bc" (UID: "66ea399c-a47a-41dd-91c2-cceaa9eca5bc"). InnerVolumeSpecName "kube-api-access-fp4xc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:28:45.217249 master-2 kubenswrapper[4762]: I1014 13:28:45.214935 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "66ea399c-a47a-41dd-91c2-cceaa9eca5bc" (UID: "66ea399c-a47a-41dd-91c2-cceaa9eca5bc"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:28:45.304728 master-2 kubenswrapper[4762]: I1014 13:28:45.304628 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/255dc145-a476-412e-b14d-bda046fb2528-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cqvvdb\" (UID: \"255dc145-a476-412e-b14d-bda046fb2528\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cqvvdb" Oct 14 13:28:45.304728 master-2 kubenswrapper[4762]: I1014 13:28:45.304729 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/255dc145-a476-412e-b14d-bda046fb2528-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cqvvdb\" (UID: \"255dc145-a476-412e-b14d-bda046fb2528\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cqvvdb" Oct 14 13:28:45.305086 master-2 kubenswrapper[4762]: I1014 13:28:45.304773 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2wqp\" (UniqueName: \"kubernetes.io/projected/255dc145-a476-412e-b14d-bda046fb2528-kube-api-access-v2wqp\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cqvvdb\" (UID: \"255dc145-a476-412e-b14d-bda046fb2528\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cqvvdb" Oct 14 13:28:45.305086 master-2 kubenswrapper[4762]: I1014 13:28:45.304896 4762 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-audit\") on node \"master-2\" DevicePath \"\"" Oct 14 13:28:45.305086 master-2 kubenswrapper[4762]: I1014 13:28:45.304917 4762 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-etcd-serving-ca\") on node \"master-2\" DevicePath \"\"" Oct 14 13:28:45.305086 master-2 kubenswrapper[4762]: I1014 13:28:45.304937 4762 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-image-import-ca\") on node \"master-2\" DevicePath \"\"" Oct 14 13:28:45.305086 master-2 kubenswrapper[4762]: I1014 13:28:45.304948 4762 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-encryption-config\") on node \"master-2\" DevicePath \"\"" Oct 14 13:28:45.305086 master-2 
kubenswrapper[4762]: I1014 13:28:45.304961 4762 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-serving-cert\") on node \"master-2\" DevicePath \"\"" Oct 14 13:28:45.305086 master-2 kubenswrapper[4762]: I1014 13:28:45.304977 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fp4xc\" (UniqueName: \"kubernetes.io/projected/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-kube-api-access-fp4xc\") on node \"master-2\" DevicePath \"\"" Oct 14 13:28:45.305086 master-2 kubenswrapper[4762]: I1014 13:28:45.304989 4762 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-etcd-client\") on node \"master-2\" DevicePath \"\"" Oct 14 13:28:45.305086 master-2 kubenswrapper[4762]: I1014 13:28:45.305000 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-config\") on node \"master-2\" DevicePath \"\"" Oct 14 13:28:45.305086 master-2 kubenswrapper[4762]: I1014 13:28:45.305011 4762 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-trusted-ca-bundle\") on node \"master-2\" DevicePath \"\"" Oct 14 13:28:45.305086 master-2 kubenswrapper[4762]: I1014 13:28:45.305025 4762 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/66ea399c-a47a-41dd-91c2-cceaa9eca5bc-audit-dir\") on node \"master-2\" DevicePath \"\"" Oct 14 13:28:45.305844 master-2 kubenswrapper[4762]: I1014 13:28:45.305753 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/255dc145-a476-412e-b14d-bda046fb2528-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cqvvdb\" (UID: \"255dc145-a476-412e-b14d-bda046fb2528\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cqvvdb" Oct 14 13:28:45.306117 master-2 kubenswrapper[4762]: I1014 13:28:45.306070 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/255dc145-a476-412e-b14d-bda046fb2528-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cqvvdb\" (UID: \"255dc145-a476-412e-b14d-bda046fb2528\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cqvvdb" Oct 14 13:28:45.330949 master-2 kubenswrapper[4762]: I1014 13:28:45.330888 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2wqp\" (UniqueName: \"kubernetes.io/projected/255dc145-a476-412e-b14d-bda046fb2528-kube-api-access-v2wqp\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cqvvdb\" (UID: \"255dc145-a476-412e-b14d-bda046fb2528\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cqvvdb" Oct 14 13:28:45.422985 master-2 kubenswrapper[4762]: I1014 13:28:45.422905 4762 generic.go:334] "Generic (PLEG): container finished" podID="66ea399c-a47a-41dd-91c2-cceaa9eca5bc" containerID="e7c105b3a9512339a854f05952e24e3ae0a7fe244eabe4dbb4a9cda51aea016d" exitCode=0 Oct 14 13:28:45.422985 master-2 kubenswrapper[4762]: I1014 13:28:45.422959 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" 
event={"ID":"66ea399c-a47a-41dd-91c2-cceaa9eca5bc","Type":"ContainerDied","Data":"e7c105b3a9512339a854f05952e24e3ae0a7fe244eabe4dbb4a9cda51aea016d"} Oct 14 13:28:45.423827 master-2 kubenswrapper[4762]: I1014 13:28:45.423006 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" event={"ID":"66ea399c-a47a-41dd-91c2-cceaa9eca5bc","Type":"ContainerDied","Data":"48bf4594f83f895ff84f955852c223ad410d2d6529063ce059e62b67cab1e8b4"} Oct 14 13:28:45.423827 master-2 kubenswrapper[4762]: I1014 13:28:45.423030 4762 scope.go:117] "RemoveContainer" containerID="d89425db738bd6749394bbca910f6e699c0f8f09d446d167a2fb2d243190660a" Oct 14 13:28:45.423827 master-2 kubenswrapper[4762]: I1014 13:28:45.423045 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-5f68d4c887-s2fvb" Oct 14 13:28:45.425895 master-2 kubenswrapper[4762]: I1014 13:28:45.425852 4762 generic.go:334] "Generic (PLEG): container finished" podID="f09ac26b-8035-43e5-9f90-667cae1a28ad" containerID="b80117648796c0a645683420e0b160a9d8cb9925a8fd7d271c81e515fd69d940" exitCode=0 Oct 14 13:28:45.425999 master-2 kubenswrapper[4762]: I1014 13:28:45.425923 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2ppm5m" event={"ID":"f09ac26b-8035-43e5-9f90-667cae1a28ad","Type":"ContainerDied","Data":"b80117648796c0a645683420e0b160a9d8cb9925a8fd7d271c81e515fd69d940"} Oct 14 13:28:45.425999 master-2 kubenswrapper[4762]: I1014 13:28:45.425948 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2ppm5m" event={"ID":"f09ac26b-8035-43e5-9f90-667cae1a28ad","Type":"ContainerStarted","Data":"80bd93938d057fbfa13478bb100eab8c0497a08709a25f02114fbe8de71359cc"} Oct 14 13:28:45.427834 master-2 kubenswrapper[4762]: I1014 13:28:45.427694 4762 generic.go:334] "Generic (PLEG): container finished" podID="a0b3b3d9-4cd5-4cf5-93b6-9480f7636efe" containerID="c4a1b4f742f14c2b102947e47293a95a074527d0b6d084135b0c4e7e611eaf9a" exitCode=0 Oct 14 13:28:45.427834 master-2 kubenswrapper[4762]: I1014 13:28:45.427745 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/revision-pruner-10-master-2" event={"ID":"a0b3b3d9-4cd5-4cf5-93b6-9480f7636efe","Type":"ContainerDied","Data":"c4a1b4f742f14c2b102947e47293a95a074527d0b6d084135b0c4e7e611eaf9a"} Oct 14 13:28:45.439440 master-2 kubenswrapper[4762]: I1014 13:28:45.439386 4762 scope.go:117] "RemoveContainer" containerID="e7c105b3a9512339a854f05952e24e3ae0a7fe244eabe4dbb4a9cda51aea016d" Oct 14 13:28:45.454212 master-2 kubenswrapper[4762]: I1014 13:28:45.454120 4762 scope.go:117] "RemoveContainer" containerID="4957b14c39a7a14c5468856d5c404d05fdd96e3a24d504246d65efab9453120c" Oct 14 13:28:45.475211 master-2 kubenswrapper[4762]: I1014 13:28:45.475129 4762 scope.go:117] "RemoveContainer" containerID="d89425db738bd6749394bbca910f6e699c0f8f09d446d167a2fb2d243190660a" Oct 14 13:28:45.477359 master-2 kubenswrapper[4762]: E1014 13:28:45.477281 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d89425db738bd6749394bbca910f6e699c0f8f09d446d167a2fb2d243190660a\": container with ID starting with d89425db738bd6749394bbca910f6e699c0f8f09d446d167a2fb2d243190660a not found: ID does not exist" containerID="d89425db738bd6749394bbca910f6e699c0f8f09d446d167a2fb2d243190660a" Oct 14 
13:28:45.477359 master-2 kubenswrapper[4762]: I1014 13:28:45.477336 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d89425db738bd6749394bbca910f6e699c0f8f09d446d167a2fb2d243190660a"} err="failed to get container status \"d89425db738bd6749394bbca910f6e699c0f8f09d446d167a2fb2d243190660a\": rpc error: code = NotFound desc = could not find container \"d89425db738bd6749394bbca910f6e699c0f8f09d446d167a2fb2d243190660a\": container with ID starting with d89425db738bd6749394bbca910f6e699c0f8f09d446d167a2fb2d243190660a not found: ID does not exist" Oct 14 13:28:45.477359 master-2 kubenswrapper[4762]: I1014 13:28:45.477362 4762 scope.go:117] "RemoveContainer" containerID="e7c105b3a9512339a854f05952e24e3ae0a7fe244eabe4dbb4a9cda51aea016d" Oct 14 13:28:45.477981 master-2 kubenswrapper[4762]: E1014 13:28:45.477893 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7c105b3a9512339a854f05952e24e3ae0a7fe244eabe4dbb4a9cda51aea016d\": container with ID starting with e7c105b3a9512339a854f05952e24e3ae0a7fe244eabe4dbb4a9cda51aea016d not found: ID does not exist" containerID="e7c105b3a9512339a854f05952e24e3ae0a7fe244eabe4dbb4a9cda51aea016d" Oct 14 13:28:45.478071 master-2 kubenswrapper[4762]: I1014 13:28:45.477984 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7c105b3a9512339a854f05952e24e3ae0a7fe244eabe4dbb4a9cda51aea016d"} err="failed to get container status \"e7c105b3a9512339a854f05952e24e3ae0a7fe244eabe4dbb4a9cda51aea016d\": rpc error: code = NotFound desc = could not find container \"e7c105b3a9512339a854f05952e24e3ae0a7fe244eabe4dbb4a9cda51aea016d\": container with ID starting with e7c105b3a9512339a854f05952e24e3ae0a7fe244eabe4dbb4a9cda51aea016d not found: ID does not exist" Oct 14 13:28:45.478071 master-2 kubenswrapper[4762]: I1014 13:28:45.478029 4762 scope.go:117] "RemoveContainer" containerID="4957b14c39a7a14c5468856d5c404d05fdd96e3a24d504246d65efab9453120c" Oct 14 13:28:45.478966 master-2 kubenswrapper[4762]: E1014 13:28:45.478834 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4957b14c39a7a14c5468856d5c404d05fdd96e3a24d504246d65efab9453120c\": container with ID starting with 4957b14c39a7a14c5468856d5c404d05fdd96e3a24d504246d65efab9453120c not found: ID does not exist" containerID="4957b14c39a7a14c5468856d5c404d05fdd96e3a24d504246d65efab9453120c" Oct 14 13:28:45.479119 master-2 kubenswrapper[4762]: I1014 13:28:45.478989 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4957b14c39a7a14c5468856d5c404d05fdd96e3a24d504246d65efab9453120c"} err="failed to get container status \"4957b14c39a7a14c5468856d5c404d05fdd96e3a24d504246d65efab9453120c\": rpc error: code = NotFound desc = could not find container \"4957b14c39a7a14c5468856d5c404d05fdd96e3a24d504246d65efab9453120c\": container with ID starting with 4957b14c39a7a14c5468856d5c404d05fdd96e3a24d504246d65efab9453120c not found: ID does not exist" Oct 14 13:28:45.495342 master-2 kubenswrapper[4762]: I1014 13:28:45.495197 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-apiserver/apiserver-5f68d4c887-s2fvb"] Oct 14 13:28:45.502937 master-2 kubenswrapper[4762]: I1014 13:28:45.502865 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cqvvdb" Oct 14 13:28:45.503328 master-2 kubenswrapper[4762]: I1014 13:28:45.503271 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-apiserver/apiserver-5f68d4c887-s2fvb"] Oct 14 13:28:45.555952 master-2 kubenswrapper[4762]: I1014 13:28:45.555899 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66ea399c-a47a-41dd-91c2-cceaa9eca5bc" path="/var/lib/kubelet/pods/66ea399c-a47a-41dd-91c2-cceaa9eca5bc/volumes" Oct 14 13:28:45.556700 master-2 kubenswrapper[4762]: I1014 13:28:45.556668 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff4898af-987c-42c5-8728-033c5ede3e0f" path="/var/lib/kubelet/pods/ff4898af-987c-42c5-8728-033c5ede3e0f/volumes" Oct 14 13:28:45.906699 master-2 kubenswrapper[4762]: W1014 13:28:45.906626 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod255dc145_a476_412e_b14d_bda046fb2528.slice/crio-9e211c22712817ff2b7188f64dcb7ba2649e58c4c5b3a34c3971fedda90f02a7 WatchSource:0}: Error finding container 9e211c22712817ff2b7188f64dcb7ba2649e58c4c5b3a34c3971fedda90f02a7: Status 404 returned error can't find the container with id 9e211c22712817ff2b7188f64dcb7ba2649e58c4c5b3a34c3971fedda90f02a7 Oct 14 13:28:45.908825 master-2 kubenswrapper[4762]: I1014 13:28:45.908756 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cqvvdb"] Oct 14 13:28:46.435770 master-2 kubenswrapper[4762]: I1014 13:28:46.435705 4762 generic.go:334] "Generic (PLEG): container finished" podID="255dc145-a476-412e-b14d-bda046fb2528" containerID="6b8234b716b84365f408c2c29c477b154ef760948c58d70a3985d92d2ce0ba39" exitCode=0 Oct 14 13:28:46.435770 master-2 kubenswrapper[4762]: I1014 13:28:46.435786 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cqvvdb" event={"ID":"255dc145-a476-412e-b14d-bda046fb2528","Type":"ContainerDied","Data":"6b8234b716b84365f408c2c29c477b154ef760948c58d70a3985d92d2ce0ba39"} Oct 14 13:28:46.435770 master-2 kubenswrapper[4762]: I1014 13:28:46.435817 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cqvvdb" event={"ID":"255dc145-a476-412e-b14d-bda046fb2528","Type":"ContainerStarted","Data":"9e211c22712817ff2b7188f64dcb7ba2649e58c4c5b3a34c3971fedda90f02a7"} Oct 14 13:28:47.069252 master-2 kubenswrapper[4762]: I1014 13:28:47.069163 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/revision-pruner-10-master-2" Oct 14 13:28:47.143638 master-2 kubenswrapper[4762]: I1014 13:28:47.143567 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a0b3b3d9-4cd5-4cf5-93b6-9480f7636efe-kubelet-dir\") pod \"a0b3b3d9-4cd5-4cf5-93b6-9480f7636efe\" (UID: \"a0b3b3d9-4cd5-4cf5-93b6-9480f7636efe\") " Oct 14 13:28:47.144017 master-2 kubenswrapper[4762]: I1014 13:28:47.143739 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a0b3b3d9-4cd5-4cf5-93b6-9480f7636efe-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a0b3b3d9-4cd5-4cf5-93b6-9480f7636efe" (UID: "a0b3b3d9-4cd5-4cf5-93b6-9480f7636efe"). 
InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:28:47.144017 master-2 kubenswrapper[4762]: I1014 13:28:47.143764 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a0b3b3d9-4cd5-4cf5-93b6-9480f7636efe-kube-api-access\") pod \"a0b3b3d9-4cd5-4cf5-93b6-9480f7636efe\" (UID: \"a0b3b3d9-4cd5-4cf5-93b6-9480f7636efe\") " Oct 14 13:28:47.144521 master-2 kubenswrapper[4762]: I1014 13:28:47.144479 4762 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a0b3b3d9-4cd5-4cf5-93b6-9480f7636efe-kubelet-dir\") on node \"master-2\" DevicePath \"\"" Oct 14 13:28:47.148850 master-2 kubenswrapper[4762]: I1014 13:28:47.148786 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0b3b3d9-4cd5-4cf5-93b6-9480f7636efe-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a0b3b3d9-4cd5-4cf5-93b6-9480f7636efe" (UID: "a0b3b3d9-4cd5-4cf5-93b6-9480f7636efe"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:28:47.246463 master-2 kubenswrapper[4762]: I1014 13:28:47.246374 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a0b3b3d9-4cd5-4cf5-93b6-9480f7636efe-kube-api-access\") on node \"master-2\" DevicePath \"\"" Oct 14 13:28:47.448190 master-2 kubenswrapper[4762]: I1014 13:28:47.448064 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/revision-pruner-10-master-2" Oct 14 13:28:47.448190 master-2 kubenswrapper[4762]: I1014 13:28:47.448068 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/revision-pruner-10-master-2" event={"ID":"a0b3b3d9-4cd5-4cf5-93b6-9480f7636efe","Type":"ContainerDied","Data":"36d46c3ebc88d31b4ce039fd2b67951baf8fc9d4bd0d91c397a59820fc1ea59b"} Oct 14 13:28:47.449130 master-2 kubenswrapper[4762]: I1014 13:28:47.448232 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36d46c3ebc88d31b4ce039fd2b67951baf8fc9d4bd0d91c397a59820fc1ea59b" Oct 14 13:28:47.451467 master-2 kubenswrapper[4762]: I1014 13:28:47.451413 4762 generic.go:334] "Generic (PLEG): container finished" podID="a2c2a707-93be-43ba-9124-1829a7a845ea" containerID="38c2e49331a73bd09cc306d1056b7769d7dcf465fed7fd302c3003c2958d6644" exitCode=0 Oct 14 13:28:47.451467 master-2 kubenswrapper[4762]: I1014 13:28:47.451464 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb699lfcx" event={"ID":"a2c2a707-93be-43ba-9124-1829a7a845ea","Type":"ContainerDied","Data":"38c2e49331a73bd09cc306d1056b7769d7dcf465fed7fd302c3003c2958d6644"} Oct 14 13:28:48.463799 master-2 kubenswrapper[4762]: I1014 13:28:48.463675 4762 generic.go:334] "Generic (PLEG): container finished" podID="a2c2a707-93be-43ba-9124-1829a7a845ea" containerID="d24620d42809fad365bb4bef6385b5345bafbcab6f42f0e72e54a002d02b3236" exitCode=0 Oct 14 13:28:48.464907 master-2 kubenswrapper[4762]: I1014 13:28:48.463837 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb699lfcx" event={"ID":"a2c2a707-93be-43ba-9124-1829a7a845ea","Type":"ContainerDied","Data":"d24620d42809fad365bb4bef6385b5345bafbcab6f42f0e72e54a002d02b3236"} Oct 14 13:28:48.466905 
master-2 kubenswrapper[4762]: I1014 13:28:48.466875 4762 generic.go:334] "Generic (PLEG): container finished" podID="f09ac26b-8035-43e5-9f90-667cae1a28ad" containerID="b9ae67c291d93bbeef60596e80e9e5ba51d2d255278dd173ef1ab7a8f9c9565e" exitCode=0 Oct 14 13:28:48.466983 master-2 kubenswrapper[4762]: I1014 13:28:48.466934 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2ppm5m" event={"ID":"f09ac26b-8035-43e5-9f90-667cae1a28ad","Type":"ContainerDied","Data":"b9ae67c291d93bbeef60596e80e9e5ba51d2d255278dd173ef1ab7a8f9c9565e"} Oct 14 13:28:48.471762 master-2 kubenswrapper[4762]: I1014 13:28:48.471713 4762 generic.go:334] "Generic (PLEG): container finished" podID="255dc145-a476-412e-b14d-bda046fb2528" containerID="42cc65264fb80a46348fa50efd66a5ee5660bf96d7addf7ce012deee66c41019" exitCode=0 Oct 14 13:28:48.471889 master-2 kubenswrapper[4762]: I1014 13:28:48.471762 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cqvvdb" event={"ID":"255dc145-a476-412e-b14d-bda046fb2528","Type":"ContainerDied","Data":"42cc65264fb80a46348fa50efd66a5ee5660bf96d7addf7ce012deee66c41019"} Oct 14 13:28:49.481189 master-2 kubenswrapper[4762]: I1014 13:28:49.481100 4762 generic.go:334] "Generic (PLEG): container finished" podID="f09ac26b-8035-43e5-9f90-667cae1a28ad" containerID="08e2c87150d5dc4bedb15fbc1f378a691e9ec83ee29232eb1e2dabe5d71004bf" exitCode=0 Oct 14 13:28:49.481812 master-2 kubenswrapper[4762]: I1014 13:28:49.481209 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2ppm5m" event={"ID":"f09ac26b-8035-43e5-9f90-667cae1a28ad","Type":"ContainerDied","Data":"08e2c87150d5dc4bedb15fbc1f378a691e9ec83ee29232eb1e2dabe5d71004bf"} Oct 14 13:28:49.484539 master-2 kubenswrapper[4762]: I1014 13:28:49.484498 4762 generic.go:334] "Generic (PLEG): container finished" podID="255dc145-a476-412e-b14d-bda046fb2528" containerID="d3a996f68023fb3917be3fd5b394341ea428239d24ba16593ee0706b30d1707f" exitCode=0 Oct 14 13:28:49.484709 master-2 kubenswrapper[4762]: I1014 13:28:49.484616 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cqvvdb" event={"ID":"255dc145-a476-412e-b14d-bda046fb2528","Type":"ContainerDied","Data":"d3a996f68023fb3917be3fd5b394341ea428239d24ba16593ee0706b30d1707f"} Oct 14 13:28:49.827050 master-2 kubenswrapper[4762]: I1014 13:28:49.826988 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb699lfcx" Oct 14 13:28:49.889997 master-2 kubenswrapper[4762]: I1014 13:28:49.885197 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rw9fk\" (UniqueName: \"kubernetes.io/projected/a2c2a707-93be-43ba-9124-1829a7a845ea-kube-api-access-rw9fk\") pod \"a2c2a707-93be-43ba-9124-1829a7a845ea\" (UID: \"a2c2a707-93be-43ba-9124-1829a7a845ea\") " Oct 14 13:28:49.889997 master-2 kubenswrapper[4762]: I1014 13:28:49.885315 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a2c2a707-93be-43ba-9124-1829a7a845ea-util\") pod \"a2c2a707-93be-43ba-9124-1829a7a845ea\" (UID: \"a2c2a707-93be-43ba-9124-1829a7a845ea\") " Oct 14 13:28:49.889997 master-2 kubenswrapper[4762]: I1014 13:28:49.885445 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a2c2a707-93be-43ba-9124-1829a7a845ea-bundle\") pod \"a2c2a707-93be-43ba-9124-1829a7a845ea\" (UID: \"a2c2a707-93be-43ba-9124-1829a7a845ea\") " Oct 14 13:28:49.889997 master-2 kubenswrapper[4762]: I1014 13:28:49.887126 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2c2a707-93be-43ba-9124-1829a7a845ea-bundle" (OuterVolumeSpecName: "bundle") pod "a2c2a707-93be-43ba-9124-1829a7a845ea" (UID: "a2c2a707-93be-43ba-9124-1829a7a845ea"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:28:49.889997 master-2 kubenswrapper[4762]: I1014 13:28:49.889654 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2c2a707-93be-43ba-9124-1829a7a845ea-kube-api-access-rw9fk" (OuterVolumeSpecName: "kube-api-access-rw9fk") pod "a2c2a707-93be-43ba-9124-1829a7a845ea" (UID: "a2c2a707-93be-43ba-9124-1829a7a845ea"). InnerVolumeSpecName "kube-api-access-rw9fk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:28:49.901215 master-2 kubenswrapper[4762]: I1014 13:28:49.900743 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2c2a707-93be-43ba-9124-1829a7a845ea-util" (OuterVolumeSpecName: "util") pod "a2c2a707-93be-43ba-9124-1829a7a845ea" (UID: "a2c2a707-93be-43ba-9124-1829a7a845ea"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:28:49.988024 master-2 kubenswrapper[4762]: I1014 13:28:49.987963 4762 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a2c2a707-93be-43ba-9124-1829a7a845ea-bundle\") on node \"master-2\" DevicePath \"\"" Oct 14 13:28:49.988365 master-2 kubenswrapper[4762]: I1014 13:28:49.988347 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rw9fk\" (UniqueName: \"kubernetes.io/projected/a2c2a707-93be-43ba-9124-1829a7a845ea-kube-api-access-rw9fk\") on node \"master-2\" DevicePath \"\"" Oct 14 13:28:49.988465 master-2 kubenswrapper[4762]: I1014 13:28:49.988453 4762 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a2c2a707-93be-43ba-9124-1829a7a845ea-util\") on node \"master-2\" DevicePath \"\"" Oct 14 13:28:50.150520 master-2 kubenswrapper[4762]: I1014 13:28:50.150361 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d4wlwn"] Oct 14 13:28:50.150775 master-2 kubenswrapper[4762]: E1014 13:28:50.150667 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0b3b3d9-4cd5-4cf5-93b6-9480f7636efe" containerName="pruner" Oct 14 13:28:50.150775 master-2 kubenswrapper[4762]: I1014 13:28:50.150689 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0b3b3d9-4cd5-4cf5-93b6-9480f7636efe" containerName="pruner" Oct 14 13:28:50.150775 master-2 kubenswrapper[4762]: E1014 13:28:50.150716 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2c2a707-93be-43ba-9124-1829a7a845ea" containerName="pull" Oct 14 13:28:50.150775 master-2 kubenswrapper[4762]: I1014 13:28:50.150729 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2c2a707-93be-43ba-9124-1829a7a845ea" containerName="pull" Oct 14 13:28:50.150775 master-2 kubenswrapper[4762]: E1014 13:28:50.150759 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2c2a707-93be-43ba-9124-1829a7a845ea" containerName="util" Oct 14 13:28:50.150775 master-2 kubenswrapper[4762]: I1014 13:28:50.150772 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2c2a707-93be-43ba-9124-1829a7a845ea" containerName="util" Oct 14 13:28:50.151207 master-2 kubenswrapper[4762]: E1014 13:28:50.150788 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2c2a707-93be-43ba-9124-1829a7a845ea" containerName="extract" Oct 14 13:28:50.151207 master-2 kubenswrapper[4762]: I1014 13:28:50.150800 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2c2a707-93be-43ba-9124-1829a7a845ea" containerName="extract" Oct 14 13:28:50.151207 master-2 kubenswrapper[4762]: I1014 13:28:50.150961 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0b3b3d9-4cd5-4cf5-93b6-9480f7636efe" containerName="pruner" Oct 14 13:28:50.151207 master-2 kubenswrapper[4762]: I1014 13:28:50.150989 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2c2a707-93be-43ba-9124-1829a7a845ea" containerName="extract" Oct 14 13:28:50.152463 master-2 kubenswrapper[4762]: I1014 13:28:50.152428 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d4wlwn" Oct 14 13:28:50.165190 master-2 kubenswrapper[4762]: I1014 13:28:50.165095 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d4wlwn"] Oct 14 13:28:50.192474 master-2 kubenswrapper[4762]: I1014 13:28:50.192392 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a26cb254-5ef8-4886-89ea-fd1e27818ef0-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d4wlwn\" (UID: \"a26cb254-5ef8-4886-89ea-fd1e27818ef0\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d4wlwn" Oct 14 13:28:50.192757 master-2 kubenswrapper[4762]: I1014 13:28:50.192686 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64n8b\" (UniqueName: \"kubernetes.io/projected/a26cb254-5ef8-4886-89ea-fd1e27818ef0-kube-api-access-64n8b\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d4wlwn\" (UID: \"a26cb254-5ef8-4886-89ea-fd1e27818ef0\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d4wlwn" Oct 14 13:28:50.192757 master-2 kubenswrapper[4762]: I1014 13:28:50.192741 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a26cb254-5ef8-4886-89ea-fd1e27818ef0-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d4wlwn\" (UID: \"a26cb254-5ef8-4886-89ea-fd1e27818ef0\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d4wlwn" Oct 14 13:28:50.294069 master-2 kubenswrapper[4762]: I1014 13:28:50.293988 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64n8b\" (UniqueName: \"kubernetes.io/projected/a26cb254-5ef8-4886-89ea-fd1e27818ef0-kube-api-access-64n8b\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d4wlwn\" (UID: \"a26cb254-5ef8-4886-89ea-fd1e27818ef0\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d4wlwn" Oct 14 13:28:50.294069 master-2 kubenswrapper[4762]: I1014 13:28:50.294052 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a26cb254-5ef8-4886-89ea-fd1e27818ef0-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d4wlwn\" (UID: \"a26cb254-5ef8-4886-89ea-fd1e27818ef0\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d4wlwn" Oct 14 13:28:50.294640 master-2 kubenswrapper[4762]: I1014 13:28:50.294109 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a26cb254-5ef8-4886-89ea-fd1e27818ef0-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d4wlwn\" (UID: \"a26cb254-5ef8-4886-89ea-fd1e27818ef0\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d4wlwn" Oct 14 13:28:50.294872 master-2 kubenswrapper[4762]: I1014 13:28:50.294795 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a26cb254-5ef8-4886-89ea-fd1e27818ef0-util\") pod 
\"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d4wlwn\" (UID: \"a26cb254-5ef8-4886-89ea-fd1e27818ef0\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d4wlwn" Oct 14 13:28:50.295031 master-2 kubenswrapper[4762]: I1014 13:28:50.294977 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a26cb254-5ef8-4886-89ea-fd1e27818ef0-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d4wlwn\" (UID: \"a26cb254-5ef8-4886-89ea-fd1e27818ef0\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d4wlwn" Oct 14 13:28:50.325787 master-2 kubenswrapper[4762]: I1014 13:28:50.325684 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64n8b\" (UniqueName: \"kubernetes.io/projected/a26cb254-5ef8-4886-89ea-fd1e27818ef0-kube-api-access-64n8b\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d4wlwn\" (UID: \"a26cb254-5ef8-4886-89ea-fd1e27818ef0\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d4wlwn" Oct 14 13:28:50.481890 master-2 kubenswrapper[4762]: I1014 13:28:50.481770 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d4wlwn" Oct 14 13:28:50.495089 master-2 kubenswrapper[4762]: I1014 13:28:50.494999 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb699lfcx" event={"ID":"a2c2a707-93be-43ba-9124-1829a7a845ea","Type":"ContainerDied","Data":"b03e6e42694bd496c13f64b32e10e1762e57f41ba3e6ffbfdcf0dcd758b6565e"} Oct 14 13:28:50.495089 master-2 kubenswrapper[4762]: I1014 13:28:50.495073 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b03e6e42694bd496c13f64b32e10e1762e57f41ba3e6ffbfdcf0dcd758b6565e" Oct 14 13:28:50.495441 master-2 kubenswrapper[4762]: I1014 13:28:50.495122 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb699lfcx" Oct 14 13:28:50.988818 master-2 kubenswrapper[4762]: I1014 13:28:50.988708 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cqvvdb" Oct 14 13:28:50.994692 master-2 kubenswrapper[4762]: I1014 13:28:50.994219 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2ppm5m" Oct 14 13:28:51.085882 master-2 kubenswrapper[4762]: I1014 13:28:51.085832 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d4wlwn"] Oct 14 13:28:51.105175 master-2 kubenswrapper[4762]: I1014 13:28:51.105092 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pqnd\" (UniqueName: \"kubernetes.io/projected/f09ac26b-8035-43e5-9f90-667cae1a28ad-kube-api-access-9pqnd\") pod \"f09ac26b-8035-43e5-9f90-667cae1a28ad\" (UID: \"f09ac26b-8035-43e5-9f90-667cae1a28ad\") " Oct 14 13:28:51.105361 master-2 kubenswrapper[4762]: I1014 13:28:51.105181 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f09ac26b-8035-43e5-9f90-667cae1a28ad-util\") pod \"f09ac26b-8035-43e5-9f90-667cae1a28ad\" (UID: \"f09ac26b-8035-43e5-9f90-667cae1a28ad\") " Oct 14 13:28:51.105361 master-2 kubenswrapper[4762]: I1014 13:28:51.105262 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2wqp\" (UniqueName: \"kubernetes.io/projected/255dc145-a476-412e-b14d-bda046fb2528-kube-api-access-v2wqp\") pod \"255dc145-a476-412e-b14d-bda046fb2528\" (UID: \"255dc145-a476-412e-b14d-bda046fb2528\") " Oct 14 13:28:51.105361 master-2 kubenswrapper[4762]: I1014 13:28:51.105306 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/255dc145-a476-412e-b14d-bda046fb2528-bundle\") pod \"255dc145-a476-412e-b14d-bda046fb2528\" (UID: \"255dc145-a476-412e-b14d-bda046fb2528\") " Oct 14 13:28:51.105361 master-2 kubenswrapper[4762]: I1014 13:28:51.105331 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/255dc145-a476-412e-b14d-bda046fb2528-util\") pod \"255dc145-a476-412e-b14d-bda046fb2528\" (UID: \"255dc145-a476-412e-b14d-bda046fb2528\") " Oct 14 13:28:51.105361 master-2 kubenswrapper[4762]: I1014 13:28:51.105352 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f09ac26b-8035-43e5-9f90-667cae1a28ad-bundle\") pod \"f09ac26b-8035-43e5-9f90-667cae1a28ad\" (UID: \"f09ac26b-8035-43e5-9f90-667cae1a28ad\") " Oct 14 13:28:51.105968 master-2 kubenswrapper[4762]: I1014 13:28:51.105905 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/255dc145-a476-412e-b14d-bda046fb2528-bundle" (OuterVolumeSpecName: "bundle") pod "255dc145-a476-412e-b14d-bda046fb2528" (UID: "255dc145-a476-412e-b14d-bda046fb2528"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:28:51.106799 master-2 kubenswrapper[4762]: I1014 13:28:51.106744 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f09ac26b-8035-43e5-9f90-667cae1a28ad-bundle" (OuterVolumeSpecName: "bundle") pod "f09ac26b-8035-43e5-9f90-667cae1a28ad" (UID: "f09ac26b-8035-43e5-9f90-667cae1a28ad"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:28:51.109476 master-2 kubenswrapper[4762]: I1014 13:28:51.109389 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f09ac26b-8035-43e5-9f90-667cae1a28ad-kube-api-access-9pqnd" (OuterVolumeSpecName: "kube-api-access-9pqnd") pod "f09ac26b-8035-43e5-9f90-667cae1a28ad" (UID: "f09ac26b-8035-43e5-9f90-667cae1a28ad"). InnerVolumeSpecName "kube-api-access-9pqnd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:28:51.109981 master-2 kubenswrapper[4762]: I1014 13:28:51.109934 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/255dc145-a476-412e-b14d-bda046fb2528-kube-api-access-v2wqp" (OuterVolumeSpecName: "kube-api-access-v2wqp") pod "255dc145-a476-412e-b14d-bda046fb2528" (UID: "255dc145-a476-412e-b14d-bda046fb2528"). InnerVolumeSpecName "kube-api-access-v2wqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:28:51.117055 master-2 kubenswrapper[4762]: I1014 13:28:51.116980 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f09ac26b-8035-43e5-9f90-667cae1a28ad-util" (OuterVolumeSpecName: "util") pod "f09ac26b-8035-43e5-9f90-667cae1a28ad" (UID: "f09ac26b-8035-43e5-9f90-667cae1a28ad"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:28:51.207518 master-2 kubenswrapper[4762]: I1014 13:28:51.207438 4762 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f09ac26b-8035-43e5-9f90-667cae1a28ad-util\") on node \"master-2\" DevicePath \"\"" Oct 14 13:28:51.207518 master-2 kubenswrapper[4762]: I1014 13:28:51.207499 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2wqp\" (UniqueName: \"kubernetes.io/projected/255dc145-a476-412e-b14d-bda046fb2528-kube-api-access-v2wqp\") on node \"master-2\" DevicePath \"\"" Oct 14 13:28:51.207518 master-2 kubenswrapper[4762]: I1014 13:28:51.207525 4762 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/255dc145-a476-412e-b14d-bda046fb2528-bundle\") on node \"master-2\" DevicePath \"\"" Oct 14 13:28:51.207933 master-2 kubenswrapper[4762]: I1014 13:28:51.207542 4762 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f09ac26b-8035-43e5-9f90-667cae1a28ad-bundle\") on node \"master-2\" DevicePath \"\"" Oct 14 13:28:51.207933 master-2 kubenswrapper[4762]: I1014 13:28:51.207562 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pqnd\" (UniqueName: \"kubernetes.io/projected/f09ac26b-8035-43e5-9f90-667cae1a28ad-kube-api-access-9pqnd\") on node \"master-2\" DevicePath \"\"" Oct 14 13:28:51.305893 master-2 kubenswrapper[4762]: I1014 13:28:51.305808 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/255dc145-a476-412e-b14d-bda046fb2528-util" (OuterVolumeSpecName: "util") pod "255dc145-a476-412e-b14d-bda046fb2528" (UID: "255dc145-a476-412e-b14d-bda046fb2528"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:28:51.308757 master-2 kubenswrapper[4762]: I1014 13:28:51.308685 4762 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/255dc145-a476-412e-b14d-bda046fb2528-util\") on node \"master-2\" DevicePath \"\"" Oct 14 13:28:51.507653 master-2 kubenswrapper[4762]: I1014 13:28:51.507563 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2ppm5m" event={"ID":"f09ac26b-8035-43e5-9f90-667cae1a28ad","Type":"ContainerDied","Data":"80bd93938d057fbfa13478bb100eab8c0497a08709a25f02114fbe8de71359cc"} Oct 14 13:28:51.507653 master-2 kubenswrapper[4762]: I1014 13:28:51.507656 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80bd93938d057fbfa13478bb100eab8c0497a08709a25f02114fbe8de71359cc" Oct 14 13:28:51.508560 master-2 kubenswrapper[4762]: I1014 13:28:51.507590 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2ppm5m" Oct 14 13:28:51.509906 master-2 kubenswrapper[4762]: I1014 13:28:51.509835 4762 generic.go:334] "Generic (PLEG): container finished" podID="a26cb254-5ef8-4886-89ea-fd1e27818ef0" containerID="cfe696cfd0c3b7877604c6fc176e543de422c63953903df4da02228e081a0fce" exitCode=0 Oct 14 13:28:51.510041 master-2 kubenswrapper[4762]: I1014 13:28:51.509954 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d4wlwn" event={"ID":"a26cb254-5ef8-4886-89ea-fd1e27818ef0","Type":"ContainerDied","Data":"cfe696cfd0c3b7877604c6fc176e543de422c63953903df4da02228e081a0fce"} Oct 14 13:28:51.510108 master-2 kubenswrapper[4762]: I1014 13:28:51.510055 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d4wlwn" event={"ID":"a26cb254-5ef8-4886-89ea-fd1e27818ef0","Type":"ContainerStarted","Data":"8961a2692dd5ef8816a7053d930cfd0d0b2c2f25fe4366bd00e8e501338a9089"} Oct 14 13:28:51.513592 master-2 kubenswrapper[4762]: I1014 13:28:51.513533 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cqvvdb" event={"ID":"255dc145-a476-412e-b14d-bda046fb2528","Type":"ContainerDied","Data":"9e211c22712817ff2b7188f64dcb7ba2649e58c4c5b3a34c3971fedda90f02a7"} Oct 14 13:28:51.513592 master-2 kubenswrapper[4762]: I1014 13:28:51.513582 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cqvvdb" Oct 14 13:28:51.513738 master-2 kubenswrapper[4762]: I1014 13:28:51.513583 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e211c22712817ff2b7188f64dcb7ba2649e58c4c5b3a34c3971fedda90f02a7" Oct 14 13:28:51.612915 master-2 kubenswrapper[4762]: I1014 13:28:51.612852 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-65499f9774-b84hw"] Oct 14 13:28:51.613184 master-2 kubenswrapper[4762]: E1014 13:28:51.613133 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f09ac26b-8035-43e5-9f90-667cae1a28ad" containerName="pull" Oct 14 13:28:51.613184 master-2 kubenswrapper[4762]: I1014 13:28:51.613185 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="f09ac26b-8035-43e5-9f90-667cae1a28ad" containerName="pull" Oct 14 13:28:51.613296 master-2 kubenswrapper[4762]: E1014 13:28:51.613205 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f09ac26b-8035-43e5-9f90-667cae1a28ad" containerName="extract" Oct 14 13:28:51.613296 master-2 kubenswrapper[4762]: I1014 13:28:51.613216 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="f09ac26b-8035-43e5-9f90-667cae1a28ad" containerName="extract" Oct 14 13:28:51.613296 master-2 kubenswrapper[4762]: E1014 13:28:51.613235 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f09ac26b-8035-43e5-9f90-667cae1a28ad" containerName="util" Oct 14 13:28:51.613296 master-2 kubenswrapper[4762]: I1014 13:28:51.613245 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="f09ac26b-8035-43e5-9f90-667cae1a28ad" containerName="util" Oct 14 13:28:51.613296 master-2 kubenswrapper[4762]: E1014 13:28:51.613260 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="255dc145-a476-412e-b14d-bda046fb2528" containerName="extract" Oct 14 13:28:51.613296 master-2 kubenswrapper[4762]: I1014 13:28:51.613270 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="255dc145-a476-412e-b14d-bda046fb2528" containerName="extract" Oct 14 13:28:51.613296 master-2 kubenswrapper[4762]: E1014 13:28:51.613284 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="255dc145-a476-412e-b14d-bda046fb2528" containerName="pull" Oct 14 13:28:51.613296 master-2 kubenswrapper[4762]: I1014 13:28:51.613293 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="255dc145-a476-412e-b14d-bda046fb2528" containerName="pull" Oct 14 13:28:51.613631 master-2 kubenswrapper[4762]: E1014 13:28:51.613311 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="255dc145-a476-412e-b14d-bda046fb2528" containerName="util" Oct 14 13:28:51.613631 master-2 kubenswrapper[4762]: I1014 13:28:51.613321 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="255dc145-a476-412e-b14d-bda046fb2528" containerName="util" Oct 14 13:28:51.613631 master-2 kubenswrapper[4762]: I1014 13:28:51.613503 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="f09ac26b-8035-43e5-9f90-667cae1a28ad" containerName="extract" Oct 14 13:28:51.613631 master-2 kubenswrapper[4762]: I1014 13:28:51.613525 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="255dc145-a476-412e-b14d-bda046fb2528" containerName="extract" Oct 14 13:28:51.614605 master-2 kubenswrapper[4762]: I1014 13:28:51.614559 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-65499f9774-b84hw" Oct 14 13:28:51.622541 master-2 kubenswrapper[4762]: I1014 13:28:51.622487 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 14 13:28:51.624694 master-2 kubenswrapper[4762]: I1014 13:28:51.624648 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 14 13:28:51.624935 master-2 kubenswrapper[4762]: I1014 13:28:51.624906 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 14 13:28:51.625269 master-2 kubenswrapper[4762]: I1014 13:28:51.625240 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 14 13:28:51.625456 master-2 kubenswrapper[4762]: I1014 13:28:51.625426 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 14 13:28:51.625680 master-2 kubenswrapper[4762]: I1014 13:28:51.625652 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 14 13:28:51.626520 master-2 kubenswrapper[4762]: I1014 13:28:51.626484 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 14 13:28:51.626709 master-2 kubenswrapper[4762]: I1014 13:28:51.626678 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 14 13:28:51.626911 master-2 kubenswrapper[4762]: I1014 13:28:51.626881 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-95k8q" Oct 14 13:28:51.631246 master-2 kubenswrapper[4762]: I1014 13:28:51.631200 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 14 13:28:51.633295 master-2 kubenswrapper[4762]: I1014 13:28:51.633196 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-65499f9774-b84hw"] Oct 14 13:28:51.636174 master-2 kubenswrapper[4762]: I1014 13:28:51.636108 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Oct 14 13:28:51.713216 master-2 kubenswrapper[4762]: I1014 13:28:51.713114 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9120b19-dd4f-44d9-a928-3134ab322156-serving-cert\") pod \"apiserver-65499f9774-b84hw\" (UID: \"c9120b19-dd4f-44d9-a928-3134ab322156\") " pod="openshift-apiserver/apiserver-65499f9774-b84hw" Oct 14 13:28:51.713499 master-2 kubenswrapper[4762]: I1014 13:28:51.713234 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9120b19-dd4f-44d9-a928-3134ab322156-trusted-ca-bundle\") pod \"apiserver-65499f9774-b84hw\" (UID: \"c9120b19-dd4f-44d9-a928-3134ab322156\") " pod="openshift-apiserver/apiserver-65499f9774-b84hw" Oct 14 13:28:51.713499 master-2 kubenswrapper[4762]: I1014 13:28:51.713296 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c9120b19-dd4f-44d9-a928-3134ab322156-audit-dir\") pod \"apiserver-65499f9774-b84hw\" (UID: \"c9120b19-dd4f-44d9-a928-3134ab322156\") " 
pod="openshift-apiserver/apiserver-65499f9774-b84hw" Oct 14 13:28:51.713499 master-2 kubenswrapper[4762]: I1014 13:28:51.713335 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c9120b19-dd4f-44d9-a928-3134ab322156-etcd-client\") pod \"apiserver-65499f9774-b84hw\" (UID: \"c9120b19-dd4f-44d9-a928-3134ab322156\") " pod="openshift-apiserver/apiserver-65499f9774-b84hw" Oct 14 13:28:51.713499 master-2 kubenswrapper[4762]: I1014 13:28:51.713372 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c9120b19-dd4f-44d9-a928-3134ab322156-encryption-config\") pod \"apiserver-65499f9774-b84hw\" (UID: \"c9120b19-dd4f-44d9-a928-3134ab322156\") " pod="openshift-apiserver/apiserver-65499f9774-b84hw" Oct 14 13:28:51.713800 master-2 kubenswrapper[4762]: I1014 13:28:51.713649 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c9120b19-dd4f-44d9-a928-3134ab322156-etcd-serving-ca\") pod \"apiserver-65499f9774-b84hw\" (UID: \"c9120b19-dd4f-44d9-a928-3134ab322156\") " pod="openshift-apiserver/apiserver-65499f9774-b84hw" Oct 14 13:28:51.713800 master-2 kubenswrapper[4762]: I1014 13:28:51.713718 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c9120b19-dd4f-44d9-a928-3134ab322156-node-pullsecrets\") pod \"apiserver-65499f9774-b84hw\" (UID: \"c9120b19-dd4f-44d9-a928-3134ab322156\") " pod="openshift-apiserver/apiserver-65499f9774-b84hw" Oct 14 13:28:51.713930 master-2 kubenswrapper[4762]: I1014 13:28:51.713876 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c9120b19-dd4f-44d9-a928-3134ab322156-image-import-ca\") pod \"apiserver-65499f9774-b84hw\" (UID: \"c9120b19-dd4f-44d9-a928-3134ab322156\") " pod="openshift-apiserver/apiserver-65499f9774-b84hw" Oct 14 13:28:51.713930 master-2 kubenswrapper[4762]: I1014 13:28:51.713924 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrksg\" (UniqueName: \"kubernetes.io/projected/c9120b19-dd4f-44d9-a928-3134ab322156-kube-api-access-qrksg\") pod \"apiserver-65499f9774-b84hw\" (UID: \"c9120b19-dd4f-44d9-a928-3134ab322156\") " pod="openshift-apiserver/apiserver-65499f9774-b84hw" Oct 14 13:28:51.714060 master-2 kubenswrapper[4762]: I1014 13:28:51.713960 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c9120b19-dd4f-44d9-a928-3134ab322156-audit\") pod \"apiserver-65499f9774-b84hw\" (UID: \"c9120b19-dd4f-44d9-a928-3134ab322156\") " pod="openshift-apiserver/apiserver-65499f9774-b84hw" Oct 14 13:28:51.714127 master-2 kubenswrapper[4762]: I1014 13:28:51.714052 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9120b19-dd4f-44d9-a928-3134ab322156-config\") pod \"apiserver-65499f9774-b84hw\" (UID: \"c9120b19-dd4f-44d9-a928-3134ab322156\") " pod="openshift-apiserver/apiserver-65499f9774-b84hw" Oct 14 13:28:51.815085 master-2 kubenswrapper[4762]: I1014 13:28:51.815010 4762 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c9120b19-dd4f-44d9-a928-3134ab322156-image-import-ca\") pod \"apiserver-65499f9774-b84hw\" (UID: \"c9120b19-dd4f-44d9-a928-3134ab322156\") " pod="openshift-apiserver/apiserver-65499f9774-b84hw" Oct 14 13:28:51.815085 master-2 kubenswrapper[4762]: I1014 13:28:51.815076 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrksg\" (UniqueName: \"kubernetes.io/projected/c9120b19-dd4f-44d9-a928-3134ab322156-kube-api-access-qrksg\") pod \"apiserver-65499f9774-b84hw\" (UID: \"c9120b19-dd4f-44d9-a928-3134ab322156\") " pod="openshift-apiserver/apiserver-65499f9774-b84hw" Oct 14 13:28:51.815459 master-2 kubenswrapper[4762]: I1014 13:28:51.815108 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c9120b19-dd4f-44d9-a928-3134ab322156-audit\") pod \"apiserver-65499f9774-b84hw\" (UID: \"c9120b19-dd4f-44d9-a928-3134ab322156\") " pod="openshift-apiserver/apiserver-65499f9774-b84hw" Oct 14 13:28:51.815459 master-2 kubenswrapper[4762]: I1014 13:28:51.815145 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9120b19-dd4f-44d9-a928-3134ab322156-config\") pod \"apiserver-65499f9774-b84hw\" (UID: \"c9120b19-dd4f-44d9-a928-3134ab322156\") " pod="openshift-apiserver/apiserver-65499f9774-b84hw" Oct 14 13:28:51.815459 master-2 kubenswrapper[4762]: I1014 13:28:51.815198 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9120b19-dd4f-44d9-a928-3134ab322156-serving-cert\") pod \"apiserver-65499f9774-b84hw\" (UID: \"c9120b19-dd4f-44d9-a928-3134ab322156\") " pod="openshift-apiserver/apiserver-65499f9774-b84hw" Oct 14 13:28:51.815459 master-2 kubenswrapper[4762]: I1014 13:28:51.815226 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9120b19-dd4f-44d9-a928-3134ab322156-trusted-ca-bundle\") pod \"apiserver-65499f9774-b84hw\" (UID: \"c9120b19-dd4f-44d9-a928-3134ab322156\") " pod="openshift-apiserver/apiserver-65499f9774-b84hw" Oct 14 13:28:51.815459 master-2 kubenswrapper[4762]: I1014 13:28:51.815266 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c9120b19-dd4f-44d9-a928-3134ab322156-audit-dir\") pod \"apiserver-65499f9774-b84hw\" (UID: \"c9120b19-dd4f-44d9-a928-3134ab322156\") " pod="openshift-apiserver/apiserver-65499f9774-b84hw" Oct 14 13:28:51.815459 master-2 kubenswrapper[4762]: I1014 13:28:51.815291 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c9120b19-dd4f-44d9-a928-3134ab322156-etcd-client\") pod \"apiserver-65499f9774-b84hw\" (UID: \"c9120b19-dd4f-44d9-a928-3134ab322156\") " pod="openshift-apiserver/apiserver-65499f9774-b84hw" Oct 14 13:28:51.815459 master-2 kubenswrapper[4762]: I1014 13:28:51.815315 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c9120b19-dd4f-44d9-a928-3134ab322156-encryption-config\") pod \"apiserver-65499f9774-b84hw\" (UID: \"c9120b19-dd4f-44d9-a928-3134ab322156\") " pod="openshift-apiserver/apiserver-65499f9774-b84hw" Oct 14 13:28:51.815459 master-2 
kubenswrapper[4762]: I1014 13:28:51.815367 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c9120b19-dd4f-44d9-a928-3134ab322156-etcd-serving-ca\") pod \"apiserver-65499f9774-b84hw\" (UID: \"c9120b19-dd4f-44d9-a928-3134ab322156\") " pod="openshift-apiserver/apiserver-65499f9774-b84hw" Oct 14 13:28:51.815459 master-2 kubenswrapper[4762]: I1014 13:28:51.815392 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c9120b19-dd4f-44d9-a928-3134ab322156-node-pullsecrets\") pod \"apiserver-65499f9774-b84hw\" (UID: \"c9120b19-dd4f-44d9-a928-3134ab322156\") " pod="openshift-apiserver/apiserver-65499f9774-b84hw" Oct 14 13:28:51.815459 master-2 kubenswrapper[4762]: I1014 13:28:51.815443 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c9120b19-dd4f-44d9-a928-3134ab322156-audit-dir\") pod \"apiserver-65499f9774-b84hw\" (UID: \"c9120b19-dd4f-44d9-a928-3134ab322156\") " pod="openshift-apiserver/apiserver-65499f9774-b84hw" Oct 14 13:28:51.816261 master-2 kubenswrapper[4762]: I1014 13:28:51.815492 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c9120b19-dd4f-44d9-a928-3134ab322156-node-pullsecrets\") pod \"apiserver-65499f9774-b84hw\" (UID: \"c9120b19-dd4f-44d9-a928-3134ab322156\") " pod="openshift-apiserver/apiserver-65499f9774-b84hw" Oct 14 13:28:51.816261 master-2 kubenswrapper[4762]: I1014 13:28:51.815915 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c9120b19-dd4f-44d9-a928-3134ab322156-image-import-ca\") pod \"apiserver-65499f9774-b84hw\" (UID: \"c9120b19-dd4f-44d9-a928-3134ab322156\") " pod="openshift-apiserver/apiserver-65499f9774-b84hw" Oct 14 13:28:51.816261 master-2 kubenswrapper[4762]: I1014 13:28:51.816072 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c9120b19-dd4f-44d9-a928-3134ab322156-audit\") pod \"apiserver-65499f9774-b84hw\" (UID: \"c9120b19-dd4f-44d9-a928-3134ab322156\") " pod="openshift-apiserver/apiserver-65499f9774-b84hw" Oct 14 13:28:51.816562 master-2 kubenswrapper[4762]: I1014 13:28:51.816380 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c9120b19-dd4f-44d9-a928-3134ab322156-etcd-serving-ca\") pod \"apiserver-65499f9774-b84hw\" (UID: \"c9120b19-dd4f-44d9-a928-3134ab322156\") " pod="openshift-apiserver/apiserver-65499f9774-b84hw" Oct 14 13:28:51.816666 master-2 kubenswrapper[4762]: I1014 13:28:51.816605 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9120b19-dd4f-44d9-a928-3134ab322156-config\") pod \"apiserver-65499f9774-b84hw\" (UID: \"c9120b19-dd4f-44d9-a928-3134ab322156\") " pod="openshift-apiserver/apiserver-65499f9774-b84hw" Oct 14 13:28:51.817142 master-2 kubenswrapper[4762]: I1014 13:28:51.817074 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9120b19-dd4f-44d9-a928-3134ab322156-trusted-ca-bundle\") pod \"apiserver-65499f9774-b84hw\" (UID: \"c9120b19-dd4f-44d9-a928-3134ab322156\") " pod="openshift-apiserver/apiserver-65499f9774-b84hw" Oct 
14 13:28:51.819171 master-2 kubenswrapper[4762]: I1014 13:28:51.819107 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9120b19-dd4f-44d9-a928-3134ab322156-serving-cert\") pod \"apiserver-65499f9774-b84hw\" (UID: \"c9120b19-dd4f-44d9-a928-3134ab322156\") " pod="openshift-apiserver/apiserver-65499f9774-b84hw" Oct 14 13:28:51.820245 master-2 kubenswrapper[4762]: I1014 13:28:51.820181 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c9120b19-dd4f-44d9-a928-3134ab322156-etcd-client\") pod \"apiserver-65499f9774-b84hw\" (UID: \"c9120b19-dd4f-44d9-a928-3134ab322156\") " pod="openshift-apiserver/apiserver-65499f9774-b84hw" Oct 14 13:28:51.820957 master-2 kubenswrapper[4762]: I1014 13:28:51.820885 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c9120b19-dd4f-44d9-a928-3134ab322156-encryption-config\") pod \"apiserver-65499f9774-b84hw\" (UID: \"c9120b19-dd4f-44d9-a928-3134ab322156\") " pod="openshift-apiserver/apiserver-65499f9774-b84hw" Oct 14 13:28:51.848640 master-2 kubenswrapper[4762]: I1014 13:28:51.848587 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrksg\" (UniqueName: \"kubernetes.io/projected/c9120b19-dd4f-44d9-a928-3134ab322156-kube-api-access-qrksg\") pod \"apiserver-65499f9774-b84hw\" (UID: \"c9120b19-dd4f-44d9-a928-3134ab322156\") " pod="openshift-apiserver/apiserver-65499f9774-b84hw" Oct 14 13:28:51.941205 master-2 kubenswrapper[4762]: I1014 13:28:51.941048 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-65499f9774-b84hw" Oct 14 13:28:52.358381 master-2 kubenswrapper[4762]: I1014 13:28:52.358315 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-65499f9774-b84hw"] Oct 14 13:28:52.365178 master-2 kubenswrapper[4762]: W1014 13:28:52.365071 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9120b19_dd4f_44d9_a928_3134ab322156.slice/crio-1c486eed6f551e5ace422ba7e380460f49a49b442d51777829c34dc3b1131454 WatchSource:0}: Error finding container 1c486eed6f551e5ace422ba7e380460f49a49b442d51777829c34dc3b1131454: Status 404 returned error can't find the container with id 1c486eed6f551e5ace422ba7e380460f49a49b442d51777829c34dc3b1131454 Oct 14 13:28:52.522815 master-2 kubenswrapper[4762]: I1014 13:28:52.522718 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-65499f9774-b84hw" event={"ID":"c9120b19-dd4f-44d9-a928-3134ab322156","Type":"ContainerStarted","Data":"1c486eed6f551e5ace422ba7e380460f49a49b442d51777829c34dc3b1131454"} Oct 14 13:28:53.538642 master-2 kubenswrapper[4762]: I1014 13:28:53.538546 4762 generic.go:334] "Generic (PLEG): container finished" podID="a26cb254-5ef8-4886-89ea-fd1e27818ef0" containerID="44eccb333fd11344e28c736f2abde04329544f49a7bd4d0c207ee495790ef38d" exitCode=0 Oct 14 13:28:53.538642 master-2 kubenswrapper[4762]: I1014 13:28:53.538605 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d4wlwn" event={"ID":"a26cb254-5ef8-4886-89ea-fd1e27818ef0","Type":"ContainerDied","Data":"44eccb333fd11344e28c736f2abde04329544f49a7bd4d0c207ee495790ef38d"} Oct 14 13:28:53.541121 master-2 
kubenswrapper[4762]: I1014 13:28:53.541050 4762 generic.go:334] "Generic (PLEG): container finished" podID="c9120b19-dd4f-44d9-a928-3134ab322156" containerID="0c4cdea1410f477b124c014df5f19b32342f7985ecbec14eb9e0d421248c71ae" exitCode=0 Oct 14 13:28:53.541296 master-2 kubenswrapper[4762]: I1014 13:28:53.541121 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-65499f9774-b84hw" event={"ID":"c9120b19-dd4f-44d9-a928-3134ab322156","Type":"ContainerDied","Data":"0c4cdea1410f477b124c014df5f19b32342f7985ecbec14eb9e0d421248c71ae"} Oct 14 13:28:54.551094 master-2 kubenswrapper[4762]: I1014 13:28:54.550988 4762 generic.go:334] "Generic (PLEG): container finished" podID="a26cb254-5ef8-4886-89ea-fd1e27818ef0" containerID="ca1e7c71f518165ef8bdee96778fa1ca0092fed872d672df41e50243eb84b83d" exitCode=0 Oct 14 13:28:54.551727 master-2 kubenswrapper[4762]: I1014 13:28:54.551119 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d4wlwn" event={"ID":"a26cb254-5ef8-4886-89ea-fd1e27818ef0","Type":"ContainerDied","Data":"ca1e7c71f518165ef8bdee96778fa1ca0092fed872d672df41e50243eb84b83d"} Oct 14 13:28:54.554086 master-2 kubenswrapper[4762]: I1014 13:28:54.554021 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-65499f9774-b84hw" event={"ID":"c9120b19-dd4f-44d9-a928-3134ab322156","Type":"ContainerStarted","Data":"0690a7c47e85a6fa108bf7bf91d77ec7cf1339643a90e6b2519abfe02faa8c3a"} Oct 14 13:28:54.554086 master-2 kubenswrapper[4762]: I1014 13:28:54.554086 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-65499f9774-b84hw" event={"ID":"c9120b19-dd4f-44d9-a928-3134ab322156","Type":"ContainerStarted","Data":"a58fdcc69b34525f8ee968b97cb9dec5c12dbf0caa7eaf0e388b3fd7caae500d"} Oct 14 13:28:54.630429 master-2 kubenswrapper[4762]: I1014 13:28:54.630345 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-65499f9774-b84hw" podStartSLOduration=120.630329849 podStartE2EDuration="2m0.630329849s" podCreationTimestamp="2025-10-14 13:26:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:28:54.627889062 +0000 UTC m=+1363.872048221" watchObservedRunningTime="2025-10-14 13:28:54.630329849 +0000 UTC m=+1363.874489008" Oct 14 13:28:55.869935 master-2 kubenswrapper[4762]: I1014 13:28:55.869605 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d4wlwn" Oct 14 13:28:55.980101 master-2 kubenswrapper[4762]: I1014 13:28:55.980015 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a26cb254-5ef8-4886-89ea-fd1e27818ef0-bundle\") pod \"a26cb254-5ef8-4886-89ea-fd1e27818ef0\" (UID: \"a26cb254-5ef8-4886-89ea-fd1e27818ef0\") " Oct 14 13:28:55.980101 master-2 kubenswrapper[4762]: I1014 13:28:55.980092 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64n8b\" (UniqueName: \"kubernetes.io/projected/a26cb254-5ef8-4886-89ea-fd1e27818ef0-kube-api-access-64n8b\") pod \"a26cb254-5ef8-4886-89ea-fd1e27818ef0\" (UID: \"a26cb254-5ef8-4886-89ea-fd1e27818ef0\") " Oct 14 13:28:55.980651 master-2 kubenswrapper[4762]: I1014 13:28:55.980170 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a26cb254-5ef8-4886-89ea-fd1e27818ef0-util\") pod \"a26cb254-5ef8-4886-89ea-fd1e27818ef0\" (UID: \"a26cb254-5ef8-4886-89ea-fd1e27818ef0\") " Oct 14 13:28:55.995465 master-2 kubenswrapper[4762]: I1014 13:28:55.985313 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a26cb254-5ef8-4886-89ea-fd1e27818ef0-bundle" (OuterVolumeSpecName: "bundle") pod "a26cb254-5ef8-4886-89ea-fd1e27818ef0" (UID: "a26cb254-5ef8-4886-89ea-fd1e27818ef0"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:28:55.995811 master-2 kubenswrapper[4762]: I1014 13:28:55.995548 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a26cb254-5ef8-4886-89ea-fd1e27818ef0-util" (OuterVolumeSpecName: "util") pod "a26cb254-5ef8-4886-89ea-fd1e27818ef0" (UID: "a26cb254-5ef8-4886-89ea-fd1e27818ef0"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:28:55.996343 master-2 kubenswrapper[4762]: I1014 13:28:55.996286 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a26cb254-5ef8-4886-89ea-fd1e27818ef0-kube-api-access-64n8b" (OuterVolumeSpecName: "kube-api-access-64n8b") pod "a26cb254-5ef8-4886-89ea-fd1e27818ef0" (UID: "a26cb254-5ef8-4886-89ea-fd1e27818ef0"). InnerVolumeSpecName "kube-api-access-64n8b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:28:56.081236 master-2 kubenswrapper[4762]: I1014 13:28:56.081111 4762 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a26cb254-5ef8-4886-89ea-fd1e27818ef0-bundle\") on node \"master-2\" DevicePath \"\"" Oct 14 13:28:56.081236 master-2 kubenswrapper[4762]: I1014 13:28:56.081187 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64n8b\" (UniqueName: \"kubernetes.io/projected/a26cb254-5ef8-4886-89ea-fd1e27818ef0-kube-api-access-64n8b\") on node \"master-2\" DevicePath \"\"" Oct 14 13:28:56.081236 master-2 kubenswrapper[4762]: I1014 13:28:56.081198 4762 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a26cb254-5ef8-4886-89ea-fd1e27818ef0-util\") on node \"master-2\" DevicePath \"\"" Oct 14 13:28:56.568972 master-2 kubenswrapper[4762]: I1014 13:28:56.568883 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d4wlwn" event={"ID":"a26cb254-5ef8-4886-89ea-fd1e27818ef0","Type":"ContainerDied","Data":"8961a2692dd5ef8816a7053d930cfd0d0b2c2f25fe4366bd00e8e501338a9089"} Oct 14 13:28:56.568972 master-2 kubenswrapper[4762]: I1014 13:28:56.568934 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d4wlwn" Oct 14 13:28:56.569404 master-2 kubenswrapper[4762]: I1014 13:28:56.568940 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8961a2692dd5ef8816a7053d930cfd0d0b2c2f25fe4366bd00e8e501338a9089" Oct 14 13:28:56.941692 master-2 kubenswrapper[4762]: I1014 13:28:56.941597 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-65499f9774-b84hw" Oct 14 13:28:56.941692 master-2 kubenswrapper[4762]: I1014 13:28:56.941686 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-65499f9774-b84hw" Oct 14 13:28:56.953476 master-2 kubenswrapper[4762]: I1014 13:28:56.953396 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-65499f9774-b84hw" Oct 14 13:28:57.582348 master-2 kubenswrapper[4762]: I1014 13:28:57.582277 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-65499f9774-b84hw" Oct 14 13:29:12.866625 master-2 kubenswrapper[4762]: I1014 13:29:12.866549 4762 scope.go:117] "RemoveContainer" containerID="1a5565e2e0f073ac96e5a71342aa7bc9343588f6835cc15141df600320b35e4b" Oct 14 13:30:03.629112 master-2 kubenswrapper[4762]: I1014 13:30:03.626674 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-2pxml"] Oct 14 13:30:03.629112 master-2 kubenswrapper[4762]: E1014 13:30:03.626889 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a26cb254-5ef8-4886-89ea-fd1e27818ef0" containerName="extract" Oct 14 13:30:03.629112 master-2 kubenswrapper[4762]: I1014 13:30:03.626900 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="a26cb254-5ef8-4886-89ea-fd1e27818ef0" containerName="extract" Oct 14 13:30:03.629112 master-2 kubenswrapper[4762]: E1014 13:30:03.626926 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a26cb254-5ef8-4886-89ea-fd1e27818ef0" containerName="pull" Oct 14 13:30:03.629112 master-2 
kubenswrapper[4762]: I1014 13:30:03.626932 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="a26cb254-5ef8-4886-89ea-fd1e27818ef0" containerName="pull" Oct 14 13:30:03.629112 master-2 kubenswrapper[4762]: E1014 13:30:03.626945 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a26cb254-5ef8-4886-89ea-fd1e27818ef0" containerName="util" Oct 14 13:30:03.629112 master-2 kubenswrapper[4762]: I1014 13:30:03.626951 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="a26cb254-5ef8-4886-89ea-fd1e27818ef0" containerName="util" Oct 14 13:30:03.629112 master-2 kubenswrapper[4762]: I1014 13:30:03.627038 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="a26cb254-5ef8-4886-89ea-fd1e27818ef0" containerName="extract" Oct 14 13:30:03.633322 master-2 kubenswrapper[4762]: I1014 13:30:03.633282 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-2pxml" Oct 14 13:30:03.636254 master-2 kubenswrapper[4762]: I1014 13:30:03.635935 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 14 13:30:03.636401 master-2 kubenswrapper[4762]: I1014 13:30:03.636370 4762 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 14 13:30:03.636557 master-2 kubenswrapper[4762]: I1014 13:30:03.636538 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 14 13:30:03.638055 master-2 kubenswrapper[4762]: I1014 13:30:03.637016 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 14 13:30:03.663182 master-2 kubenswrapper[4762]: I1014 13:30:03.661910 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/3f514207-4fde-4312-bc50-75fda0edcdfe-reloader\") pod \"frr-k8s-2pxml\" (UID: \"3f514207-4fde-4312-bc50-75fda0edcdfe\") " pod="metallb-system/frr-k8s-2pxml" Oct 14 13:30:03.663182 master-2 kubenswrapper[4762]: I1014 13:30:03.661970 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/3f514207-4fde-4312-bc50-75fda0edcdfe-frr-conf\") pod \"frr-k8s-2pxml\" (UID: \"3f514207-4fde-4312-bc50-75fda0edcdfe\") " pod="metallb-system/frr-k8s-2pxml" Oct 14 13:30:03.663182 master-2 kubenswrapper[4762]: I1014 13:30:03.662003 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/3f514207-4fde-4312-bc50-75fda0edcdfe-frr-startup\") pod \"frr-k8s-2pxml\" (UID: \"3f514207-4fde-4312-bc50-75fda0edcdfe\") " pod="metallb-system/frr-k8s-2pxml" Oct 14 13:30:03.663182 master-2 kubenswrapper[4762]: I1014 13:30:03.662211 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3f514207-4fde-4312-bc50-75fda0edcdfe-metrics-certs\") pod \"frr-k8s-2pxml\" (UID: \"3f514207-4fde-4312-bc50-75fda0edcdfe\") " pod="metallb-system/frr-k8s-2pxml" Oct 14 13:30:03.663182 master-2 kubenswrapper[4762]: I1014 13:30:03.662273 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gf7t7\" (UniqueName: \"kubernetes.io/projected/3f514207-4fde-4312-bc50-75fda0edcdfe-kube-api-access-gf7t7\") pod \"frr-k8s-2pxml\" 
(UID: \"3f514207-4fde-4312-bc50-75fda0edcdfe\") " pod="metallb-system/frr-k8s-2pxml" Oct 14 13:30:03.663182 master-2 kubenswrapper[4762]: I1014 13:30:03.662381 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/3f514207-4fde-4312-bc50-75fda0edcdfe-frr-sockets\") pod \"frr-k8s-2pxml\" (UID: \"3f514207-4fde-4312-bc50-75fda0edcdfe\") " pod="metallb-system/frr-k8s-2pxml" Oct 14 13:30:03.663182 master-2 kubenswrapper[4762]: I1014 13:30:03.662435 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/3f514207-4fde-4312-bc50-75fda0edcdfe-metrics\") pod \"frr-k8s-2pxml\" (UID: \"3f514207-4fde-4312-bc50-75fda0edcdfe\") " pod="metallb-system/frr-k8s-2pxml" Oct 14 13:30:03.757405 master-2 kubenswrapper[4762]: I1014 13:30:03.757337 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-kp26f"] Oct 14 13:30:03.758709 master-2 kubenswrapper[4762]: I1014 13:30:03.758620 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-kp26f" Oct 14 13:30:03.761461 master-2 kubenswrapper[4762]: I1014 13:30:03.761415 4762 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 14 13:30:03.762079 master-2 kubenswrapper[4762]: I1014 13:30:03.762052 4762 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 14 13:30:03.762404 master-2 kubenswrapper[4762]: I1014 13:30:03.762380 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 14 13:30:03.763775 master-2 kubenswrapper[4762]: I1014 13:30:03.763698 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gf7t7\" (UniqueName: \"kubernetes.io/projected/3f514207-4fde-4312-bc50-75fda0edcdfe-kube-api-access-gf7t7\") pod \"frr-k8s-2pxml\" (UID: \"3f514207-4fde-4312-bc50-75fda0edcdfe\") " pod="metallb-system/frr-k8s-2pxml" Oct 14 13:30:03.763865 master-2 kubenswrapper[4762]: I1014 13:30:03.763782 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/3f514207-4fde-4312-bc50-75fda0edcdfe-frr-sockets\") pod \"frr-k8s-2pxml\" (UID: \"3f514207-4fde-4312-bc50-75fda0edcdfe\") " pod="metallb-system/frr-k8s-2pxml" Oct 14 13:30:03.763865 master-2 kubenswrapper[4762]: I1014 13:30:03.763820 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/3f514207-4fde-4312-bc50-75fda0edcdfe-metrics\") pod \"frr-k8s-2pxml\" (UID: \"3f514207-4fde-4312-bc50-75fda0edcdfe\") " pod="metallb-system/frr-k8s-2pxml" Oct 14 13:30:03.763951 master-2 kubenswrapper[4762]: I1014 13:30:03.763891 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/3f514207-4fde-4312-bc50-75fda0edcdfe-reloader\") pod \"frr-k8s-2pxml\" (UID: \"3f514207-4fde-4312-bc50-75fda0edcdfe\") " pod="metallb-system/frr-k8s-2pxml" Oct 14 13:30:03.763951 master-2 kubenswrapper[4762]: I1014 13:30:03.763915 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/3f514207-4fde-4312-bc50-75fda0edcdfe-frr-conf\") pod \"frr-k8s-2pxml\" (UID: 
\"3f514207-4fde-4312-bc50-75fda0edcdfe\") " pod="metallb-system/frr-k8s-2pxml" Oct 14 13:30:03.763951 master-2 kubenswrapper[4762]: I1014 13:30:03.763944 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/3f514207-4fde-4312-bc50-75fda0edcdfe-frr-startup\") pod \"frr-k8s-2pxml\" (UID: \"3f514207-4fde-4312-bc50-75fda0edcdfe\") " pod="metallb-system/frr-k8s-2pxml" Oct 14 13:30:03.764080 master-2 kubenswrapper[4762]: I1014 13:30:03.763997 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3f514207-4fde-4312-bc50-75fda0edcdfe-metrics-certs\") pod \"frr-k8s-2pxml\" (UID: \"3f514207-4fde-4312-bc50-75fda0edcdfe\") " pod="metallb-system/frr-k8s-2pxml" Oct 14 13:30:03.764574 master-2 kubenswrapper[4762]: I1014 13:30:03.764528 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/3f514207-4fde-4312-bc50-75fda0edcdfe-reloader\") pod \"frr-k8s-2pxml\" (UID: \"3f514207-4fde-4312-bc50-75fda0edcdfe\") " pod="metallb-system/frr-k8s-2pxml" Oct 14 13:30:03.764646 master-2 kubenswrapper[4762]: I1014 13:30:03.764583 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/3f514207-4fde-4312-bc50-75fda0edcdfe-frr-conf\") pod \"frr-k8s-2pxml\" (UID: \"3f514207-4fde-4312-bc50-75fda0edcdfe\") " pod="metallb-system/frr-k8s-2pxml" Oct 14 13:30:03.764827 master-2 kubenswrapper[4762]: I1014 13:30:03.764718 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/3f514207-4fde-4312-bc50-75fda0edcdfe-metrics\") pod \"frr-k8s-2pxml\" (UID: \"3f514207-4fde-4312-bc50-75fda0edcdfe\") " pod="metallb-system/frr-k8s-2pxml" Oct 14 13:30:03.764827 master-2 kubenswrapper[4762]: I1014 13:30:03.764721 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/3f514207-4fde-4312-bc50-75fda0edcdfe-frr-sockets\") pod \"frr-k8s-2pxml\" (UID: \"3f514207-4fde-4312-bc50-75fda0edcdfe\") " pod="metallb-system/frr-k8s-2pxml" Oct 14 13:30:03.766489 master-2 kubenswrapper[4762]: I1014 13:30:03.766450 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/3f514207-4fde-4312-bc50-75fda0edcdfe-frr-startup\") pod \"frr-k8s-2pxml\" (UID: \"3f514207-4fde-4312-bc50-75fda0edcdfe\") " pod="metallb-system/frr-k8s-2pxml" Oct 14 13:30:03.771064 master-2 kubenswrapper[4762]: I1014 13:30:03.769907 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3f514207-4fde-4312-bc50-75fda0edcdfe-metrics-certs\") pod \"frr-k8s-2pxml\" (UID: \"3f514207-4fde-4312-bc50-75fda0edcdfe\") " pod="metallb-system/frr-k8s-2pxml" Oct 14 13:30:03.804459 master-2 kubenswrapper[4762]: I1014 13:30:03.804360 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gf7t7\" (UniqueName: \"kubernetes.io/projected/3f514207-4fde-4312-bc50-75fda0edcdfe-kube-api-access-gf7t7\") pod \"frr-k8s-2pxml\" (UID: \"3f514207-4fde-4312-bc50-75fda0edcdfe\") " pod="metallb-system/frr-k8s-2pxml" Oct 14 13:30:03.865805 master-2 kubenswrapper[4762]: I1014 13:30:03.865740 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" 
(UniqueName: \"kubernetes.io/configmap/49013d18-7b86-4d86-ac2a-54a004e15932-metallb-excludel2\") pod \"speaker-kp26f\" (UID: \"49013d18-7b86-4d86-ac2a-54a004e15932\") " pod="metallb-system/speaker-kp26f" Oct 14 13:30:03.865805 master-2 kubenswrapper[4762]: I1014 13:30:03.865795 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/49013d18-7b86-4d86-ac2a-54a004e15932-memberlist\") pod \"speaker-kp26f\" (UID: \"49013d18-7b86-4d86-ac2a-54a004e15932\") " pod="metallb-system/speaker-kp26f" Oct 14 13:30:03.866267 master-2 kubenswrapper[4762]: I1014 13:30:03.865847 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/49013d18-7b86-4d86-ac2a-54a004e15932-metrics-certs\") pod \"speaker-kp26f\" (UID: \"49013d18-7b86-4d86-ac2a-54a004e15932\") " pod="metallb-system/speaker-kp26f" Oct 14 13:30:03.866267 master-2 kubenswrapper[4762]: I1014 13:30:03.865871 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtnnp\" (UniqueName: \"kubernetes.io/projected/49013d18-7b86-4d86-ac2a-54a004e15932-kube-api-access-jtnnp\") pod \"speaker-kp26f\" (UID: \"49013d18-7b86-4d86-ac2a-54a004e15932\") " pod="metallb-system/speaker-kp26f" Oct 14 13:30:03.948171 master-2 kubenswrapper[4762]: I1014 13:30:03.948057 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-2pxml" Oct 14 13:30:03.966849 master-2 kubenswrapper[4762]: I1014 13:30:03.966796 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/49013d18-7b86-4d86-ac2a-54a004e15932-metrics-certs\") pod \"speaker-kp26f\" (UID: \"49013d18-7b86-4d86-ac2a-54a004e15932\") " pod="metallb-system/speaker-kp26f" Oct 14 13:30:03.966974 master-2 kubenswrapper[4762]: I1014 13:30:03.966863 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtnnp\" (UniqueName: \"kubernetes.io/projected/49013d18-7b86-4d86-ac2a-54a004e15932-kube-api-access-jtnnp\") pod \"speaker-kp26f\" (UID: \"49013d18-7b86-4d86-ac2a-54a004e15932\") " pod="metallb-system/speaker-kp26f" Oct 14 13:30:03.966974 master-2 kubenswrapper[4762]: I1014 13:30:03.966936 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/49013d18-7b86-4d86-ac2a-54a004e15932-metallb-excludel2\") pod \"speaker-kp26f\" (UID: \"49013d18-7b86-4d86-ac2a-54a004e15932\") " pod="metallb-system/speaker-kp26f" Oct 14 13:30:03.966974 master-2 kubenswrapper[4762]: I1014 13:30:03.966958 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/49013d18-7b86-4d86-ac2a-54a004e15932-memberlist\") pod \"speaker-kp26f\" (UID: \"49013d18-7b86-4d86-ac2a-54a004e15932\") " pod="metallb-system/speaker-kp26f" Oct 14 13:30:03.967174 master-2 kubenswrapper[4762]: E1014 13:30:03.967127 4762 secret.go:189] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 14 13:30:03.967240 master-2 kubenswrapper[4762]: E1014 13:30:03.967228 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49013d18-7b86-4d86-ac2a-54a004e15932-memberlist podName:49013d18-7b86-4d86-ac2a-54a004e15932 nodeName:}" failed. 
No retries permitted until 2025-10-14 13:30:04.467207564 +0000 UTC m=+1433.711366733 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/49013d18-7b86-4d86-ac2a-54a004e15932-memberlist") pod "speaker-kp26f" (UID: "49013d18-7b86-4d86-ac2a-54a004e15932") : secret "metallb-memberlist" not found Oct 14 13:30:03.968310 master-2 kubenswrapper[4762]: I1014 13:30:03.968266 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/49013d18-7b86-4d86-ac2a-54a004e15932-metallb-excludel2\") pod \"speaker-kp26f\" (UID: \"49013d18-7b86-4d86-ac2a-54a004e15932\") " pod="metallb-system/speaker-kp26f" Oct 14 13:30:03.971578 master-2 kubenswrapper[4762]: I1014 13:30:03.971545 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/49013d18-7b86-4d86-ac2a-54a004e15932-metrics-certs\") pod \"speaker-kp26f\" (UID: \"49013d18-7b86-4d86-ac2a-54a004e15932\") " pod="metallb-system/speaker-kp26f" Oct 14 13:30:03.988225 master-2 kubenswrapper[4762]: I1014 13:30:03.988184 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtnnp\" (UniqueName: \"kubernetes.io/projected/49013d18-7b86-4d86-ac2a-54a004e15932-kube-api-access-jtnnp\") pod \"speaker-kp26f\" (UID: \"49013d18-7b86-4d86-ac2a-54a004e15932\") " pod="metallb-system/speaker-kp26f" Oct 14 13:30:04.473073 master-2 kubenswrapper[4762]: I1014 13:30:04.472991 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/49013d18-7b86-4d86-ac2a-54a004e15932-memberlist\") pod \"speaker-kp26f\" (UID: \"49013d18-7b86-4d86-ac2a-54a004e15932\") " pod="metallb-system/speaker-kp26f" Oct 14 13:30:04.473362 master-2 kubenswrapper[4762]: E1014 13:30:04.473245 4762 secret.go:189] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 14 13:30:04.473416 master-2 kubenswrapper[4762]: E1014 13:30:04.473369 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49013d18-7b86-4d86-ac2a-54a004e15932-memberlist podName:49013d18-7b86-4d86-ac2a-54a004e15932 nodeName:}" failed. No retries permitted until 2025-10-14 13:30:05.473342146 +0000 UTC m=+1434.717501345 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/49013d18-7b86-4d86-ac2a-54a004e15932-memberlist") pod "speaker-kp26f" (UID: "49013d18-7b86-4d86-ac2a-54a004e15932") : secret "metallb-memberlist" not found Oct 14 13:30:05.094745 master-2 kubenswrapper[4762]: I1014 13:30:05.094643 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2pxml" event={"ID":"3f514207-4fde-4312-bc50-75fda0edcdfe","Type":"ContainerStarted","Data":"f4b80548579b60c47719133707f35844fe8c4997c8b45c64e3507bf44a19be8c"} Oct 14 13:30:05.487674 master-2 kubenswrapper[4762]: I1014 13:30:05.487557 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/49013d18-7b86-4d86-ac2a-54a004e15932-memberlist\") pod \"speaker-kp26f\" (UID: \"49013d18-7b86-4d86-ac2a-54a004e15932\") " pod="metallb-system/speaker-kp26f" Oct 14 13:30:05.491926 master-2 kubenswrapper[4762]: I1014 13:30:05.491643 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/49013d18-7b86-4d86-ac2a-54a004e15932-memberlist\") pod \"speaker-kp26f\" (UID: \"49013d18-7b86-4d86-ac2a-54a004e15932\") " pod="metallb-system/speaker-kp26f" Oct 14 13:30:05.594674 master-2 kubenswrapper[4762]: I1014 13:30:05.594600 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-kp26f" Oct 14 13:30:05.624495 master-2 kubenswrapper[4762]: W1014 13:30:05.624409 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49013d18_7b86_4d86_ac2a_54a004e15932.slice/crio-8f3922a9fc67b6d767a35483112dd3e60a3896f958ca701d2239a014d087c4bb WatchSource:0}: Error finding container 8f3922a9fc67b6d767a35483112dd3e60a3896f958ca701d2239a014d087c4bb: Status 404 returned error can't find the container with id 8f3922a9fc67b6d767a35483112dd3e60a3896f958ca701d2239a014d087c4bb Oct 14 13:30:06.102719 master-2 kubenswrapper[4762]: I1014 13:30:06.102659 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-kp26f" event={"ID":"49013d18-7b86-4d86-ac2a-54a004e15932","Type":"ContainerStarted","Data":"8f3922a9fc67b6d767a35483112dd3e60a3896f958ca701d2239a014d087c4bb"} Oct 14 13:30:08.634300 master-2 kubenswrapper[4762]: I1014 13:30:08.633619 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-g87gn"] Oct 14 13:30:08.635189 master-2 kubenswrapper[4762]: I1014 13:30:08.634771 4762 util.go:30] "No sandbox for pod can be found. 
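
The two MountVolume.SetUp failures above show the kubelet backing off on a missing Secret: the retry delay doubles from 500ms to 1s until the "metallb-memberlist" Secret finally mounts at 13:30:05.491. As a minimal sketch (not part of the log), the missing object could be checked with the Kubernetes Python client; the kubeconfig and client usage here are assumptions, only the Secret name and namespace come from the entries above.

    # Sketch: check for the Secret the kubelet reported as missing above.
    from kubernetes import client, config
    from kubernetes.client.rest import ApiException

    config.load_kube_config()          # assumes a reachable kubeconfig
    v1 = client.CoreV1Api()
    try:
        v1.read_namespaced_secret("metallb-memberlist", "metallb-system")
        print("secret present; the next kubelet retry should succeed")
    except ApiException as exc:
        if exc.status == 404:
            print("secret still missing; the kubelet keeps retrying with backoff")
        else:
            raise
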
Need to start a new one" pod="openshift-nmstate/nmstate-handler-g87gn" Oct 14 13:30:08.641344 master-2 kubenswrapper[4762]: I1014 13:30:08.641310 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Oct 14 13:30:08.641344 master-2 kubenswrapper[4762]: I1014 13:30:08.641331 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Oct 14 13:30:08.736178 master-2 kubenswrapper[4762]: I1014 13:30:08.736106 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/b4c2af67-dbd9-4cd1-8214-2579c1851f1e-dbus-socket\") pod \"nmstate-handler-g87gn\" (UID: \"b4c2af67-dbd9-4cd1-8214-2579c1851f1e\") " pod="openshift-nmstate/nmstate-handler-g87gn" Oct 14 13:30:08.736459 master-2 kubenswrapper[4762]: I1014 13:30:08.736207 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/b4c2af67-dbd9-4cd1-8214-2579c1851f1e-ovs-socket\") pod \"nmstate-handler-g87gn\" (UID: \"b4c2af67-dbd9-4cd1-8214-2579c1851f1e\") " pod="openshift-nmstate/nmstate-handler-g87gn" Oct 14 13:30:08.736459 master-2 kubenswrapper[4762]: I1014 13:30:08.736245 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsb4h\" (UniqueName: \"kubernetes.io/projected/b4c2af67-dbd9-4cd1-8214-2579c1851f1e-kube-api-access-hsb4h\") pod \"nmstate-handler-g87gn\" (UID: \"b4c2af67-dbd9-4cd1-8214-2579c1851f1e\") " pod="openshift-nmstate/nmstate-handler-g87gn" Oct 14 13:30:08.736459 master-2 kubenswrapper[4762]: I1014 13:30:08.736306 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/b4c2af67-dbd9-4cd1-8214-2579c1851f1e-nmstate-lock\") pod \"nmstate-handler-g87gn\" (UID: \"b4c2af67-dbd9-4cd1-8214-2579c1851f1e\") " pod="openshift-nmstate/nmstate-handler-g87gn" Oct 14 13:30:08.837482 master-2 kubenswrapper[4762]: I1014 13:30:08.837351 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/b4c2af67-dbd9-4cd1-8214-2579c1851f1e-nmstate-lock\") pod \"nmstate-handler-g87gn\" (UID: \"b4c2af67-dbd9-4cd1-8214-2579c1851f1e\") " pod="openshift-nmstate/nmstate-handler-g87gn" Oct 14 13:30:08.837482 master-2 kubenswrapper[4762]: I1014 13:30:08.837431 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/b4c2af67-dbd9-4cd1-8214-2579c1851f1e-dbus-socket\") pod \"nmstate-handler-g87gn\" (UID: \"b4c2af67-dbd9-4cd1-8214-2579c1851f1e\") " pod="openshift-nmstate/nmstate-handler-g87gn" Oct 14 13:30:08.837482 master-2 kubenswrapper[4762]: I1014 13:30:08.837459 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/b4c2af67-dbd9-4cd1-8214-2579c1851f1e-nmstate-lock\") pod \"nmstate-handler-g87gn\" (UID: \"b4c2af67-dbd9-4cd1-8214-2579c1851f1e\") " pod="openshift-nmstate/nmstate-handler-g87gn" Oct 14 13:30:08.837482 master-2 kubenswrapper[4762]: I1014 13:30:08.837479 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/b4c2af67-dbd9-4cd1-8214-2579c1851f1e-ovs-socket\") pod \"nmstate-handler-g87gn\" 
(UID: \"b4c2af67-dbd9-4cd1-8214-2579c1851f1e\") " pod="openshift-nmstate/nmstate-handler-g87gn" Oct 14 13:30:08.838009 master-2 kubenswrapper[4762]: I1014 13:30:08.837510 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/b4c2af67-dbd9-4cd1-8214-2579c1851f1e-ovs-socket\") pod \"nmstate-handler-g87gn\" (UID: \"b4c2af67-dbd9-4cd1-8214-2579c1851f1e\") " pod="openshift-nmstate/nmstate-handler-g87gn" Oct 14 13:30:08.838009 master-2 kubenswrapper[4762]: I1014 13:30:08.837529 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsb4h\" (UniqueName: \"kubernetes.io/projected/b4c2af67-dbd9-4cd1-8214-2579c1851f1e-kube-api-access-hsb4h\") pod \"nmstate-handler-g87gn\" (UID: \"b4c2af67-dbd9-4cd1-8214-2579c1851f1e\") " pod="openshift-nmstate/nmstate-handler-g87gn" Oct 14 13:30:08.838009 master-2 kubenswrapper[4762]: I1014 13:30:08.837879 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/b4c2af67-dbd9-4cd1-8214-2579c1851f1e-dbus-socket\") pod \"nmstate-handler-g87gn\" (UID: \"b4c2af67-dbd9-4cd1-8214-2579c1851f1e\") " pod="openshift-nmstate/nmstate-handler-g87gn" Oct 14 13:30:08.864963 master-2 kubenswrapper[4762]: I1014 13:30:08.864911 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsb4h\" (UniqueName: \"kubernetes.io/projected/b4c2af67-dbd9-4cd1-8214-2579c1851f1e-kube-api-access-hsb4h\") pod \"nmstate-handler-g87gn\" (UID: \"b4c2af67-dbd9-4cd1-8214-2579c1851f1e\") " pod="openshift-nmstate/nmstate-handler-g87gn" Oct 14 13:30:08.956243 master-2 kubenswrapper[4762]: I1014 13:30:08.956107 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-g87gn" Oct 14 13:30:11.403353 master-2 kubenswrapper[4762]: W1014 13:30:11.403293 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4c2af67_dbd9_4cd1_8214_2579c1851f1e.slice/crio-fba7a7ccab3c2c87f2a0f12c8d3a8ff8f7b0f3a322c7ea5a349656dc8c4c9c18 WatchSource:0}: Error finding container fba7a7ccab3c2c87f2a0f12c8d3a8ff8f7b0f3a322c7ea5a349656dc8c4c9c18: Status 404 returned error can't find the container with id fba7a7ccab3c2c87f2a0f12c8d3a8ff8f7b0f3a322c7ea5a349656dc8c4c9c18 Oct 14 13:30:12.145652 master-2 kubenswrapper[4762]: I1014 13:30:12.145539 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-g87gn" event={"ID":"b4c2af67-dbd9-4cd1-8214-2579c1851f1e","Type":"ContainerStarted","Data":"fba7a7ccab3c2c87f2a0f12c8d3a8ff8f7b0f3a322c7ea5a349656dc8c4c9c18"} Oct 14 13:30:12.148011 master-2 kubenswrapper[4762]: I1014 13:30:12.147903 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-kp26f" event={"ID":"49013d18-7b86-4d86-ac2a-54a004e15932","Type":"ContainerStarted","Data":"1b1c93338ccb331c0e7efe66d39663196b8daaff8465cc0b8114ff77e192ba44"} Oct 14 13:30:12.150336 master-2 kubenswrapper[4762]: I1014 13:30:12.150219 4762 generic.go:334] "Generic (PLEG): container finished" podID="3f514207-4fde-4312-bc50-75fda0edcdfe" containerID="131ab3c1db8469709002482cf9b9694d358329b645853ad757a0354fb66ce7dc" exitCode=0 Oct 14 13:30:12.150336 master-2 kubenswrapper[4762]: I1014 13:30:12.150277 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2pxml" event={"ID":"3f514207-4fde-4312-bc50-75fda0edcdfe","Type":"ContainerDied","Data":"131ab3c1db8469709002482cf9b9694d358329b645853ad757a0354fb66ce7dc"} Oct 14 13:30:13.163714 master-2 kubenswrapper[4762]: I1014 13:30:13.163478 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-kp26f" event={"ID":"49013d18-7b86-4d86-ac2a-54a004e15932","Type":"ContainerStarted","Data":"e748dcaea58a246b70112dd95bb4ea52735e55f039dd9387433b255170fb2362"} Oct 14 13:30:13.166731 master-2 kubenswrapper[4762]: I1014 13:30:13.164951 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-kp26f" Oct 14 13:30:13.172104 master-2 kubenswrapper[4762]: I1014 13:30:13.172016 4762 generic.go:334] "Generic (PLEG): container finished" podID="3f514207-4fde-4312-bc50-75fda0edcdfe" containerID="4f7c1dad3f2bd7bcfb8304c94bb48d1336a03321293ebf067108c2795f22190c" exitCode=0 Oct 14 13:30:13.172104 master-2 kubenswrapper[4762]: I1014 13:30:13.172090 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2pxml" event={"ID":"3f514207-4fde-4312-bc50-75fda0edcdfe","Type":"ContainerDied","Data":"4f7c1dad3f2bd7bcfb8304c94bb48d1336a03321293ebf067108c2795f22190c"} Oct 14 13:30:13.352667 master-2 kubenswrapper[4762]: I1014 13:30:13.352562 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-kp26f" podStartSLOduration=3.240831497 podStartE2EDuration="10.352535488s" podCreationTimestamp="2025-10-14 13:30:03 +0000 UTC" firstStartedPulling="2025-10-14 13:30:05.626774765 +0000 UTC m=+1434.870933964" lastFinishedPulling="2025-10-14 13:30:12.738478796 +0000 UTC m=+1441.982637955" observedRunningTime="2025-10-14 13:30:13.300920652 +0000 UTC m=+1442.545079881" watchObservedRunningTime="2025-10-14 13:30:13.352535488 +0000 UTC 
m=+1442.596694677" Oct 14 13:30:14.183178 master-2 kubenswrapper[4762]: I1014 13:30:14.183107 4762 generic.go:334] "Generic (PLEG): container finished" podID="3f514207-4fde-4312-bc50-75fda0edcdfe" containerID="5a97686c7d2fa78a05d2f3c9daf11a0216d54490f02c9a282252fcf771f52c2f" exitCode=0 Oct 14 13:30:14.183966 master-2 kubenswrapper[4762]: I1014 13:30:14.183231 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2pxml" event={"ID":"3f514207-4fde-4312-bc50-75fda0edcdfe","Type":"ContainerDied","Data":"5a97686c7d2fa78a05d2f3c9daf11a0216d54490f02c9a282252fcf771f52c2f"} Oct 14 13:30:15.193995 master-2 kubenswrapper[4762]: I1014 13:30:15.193951 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-g87gn" event={"ID":"b4c2af67-dbd9-4cd1-8214-2579c1851f1e","Type":"ContainerStarted","Data":"25eb91c6b7de5d40bdb5aae28aa62418cd5d06627fa2227dcf3812021ffcb1a0"} Oct 14 13:30:15.194465 master-2 kubenswrapper[4762]: I1014 13:30:15.194106 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-g87gn" Oct 14 13:30:15.197744 master-2 kubenswrapper[4762]: I1014 13:30:15.197708 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2pxml" event={"ID":"3f514207-4fde-4312-bc50-75fda0edcdfe","Type":"ContainerStarted","Data":"e1ec5afa1be994ebde5374f4471e03e49a69c0ed41ee1fb8a9c6780edd98f061"} Oct 14 13:30:15.197823 master-2 kubenswrapper[4762]: I1014 13:30:15.197752 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2pxml" event={"ID":"3f514207-4fde-4312-bc50-75fda0edcdfe","Type":"ContainerStarted","Data":"0bcbd34f50b50536b62cebdbf4d15b7996964b60ce240f4fe6346bb57e3897e6"} Oct 14 13:30:15.197823 master-2 kubenswrapper[4762]: I1014 13:30:15.197767 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2pxml" event={"ID":"3f514207-4fde-4312-bc50-75fda0edcdfe","Type":"ContainerStarted","Data":"3b43bd64c6156b5443a9b91f4c8a28104b80c37e7af8d1bc8c9edfb7f8072e33"} Oct 14 13:30:15.197823 master-2 kubenswrapper[4762]: I1014 13:30:15.197779 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2pxml" event={"ID":"3f514207-4fde-4312-bc50-75fda0edcdfe","Type":"ContainerStarted","Data":"246697e51fea43ccc3f93e278ef0b6b43b27235e7c5a96e1d52df0fa2e92ea55"} Oct 14 13:30:15.221762 master-2 kubenswrapper[4762]: I1014 13:30:15.221699 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-g87gn" podStartSLOduration=4.355113494 podStartE2EDuration="7.221683982s" podCreationTimestamp="2025-10-14 13:30:08 +0000 UTC" firstStartedPulling="2025-10-14 13:30:11.406477097 +0000 UTC m=+1440.650636256" lastFinishedPulling="2025-10-14 13:30:14.273047585 +0000 UTC m=+1443.517206744" observedRunningTime="2025-10-14 13:30:15.21529529 +0000 UTC m=+1444.459454529" watchObservedRunningTime="2025-10-14 13:30:15.221683982 +0000 UTC m=+1444.465843141" Oct 14 13:30:16.215816 master-2 kubenswrapper[4762]: I1014 13:30:16.215699 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2pxml" event={"ID":"3f514207-4fde-4312-bc50-75fda0edcdfe","Type":"ContainerStarted","Data":"d752239c703c0d780e29032af66255e13b2887d3993ab91a3c54c7bdafe5f85e"} Oct 14 13:30:16.216903 master-2 kubenswrapper[4762]: I1014 13:30:16.215839 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-2pxml" Oct 14 13:30:16.216903 master-2 
kubenswrapper[4762]: I1014 13:30:16.215876 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2pxml" event={"ID":"3f514207-4fde-4312-bc50-75fda0edcdfe","Type":"ContainerStarted","Data":"dfafbb4507a3c75b3315f9010c8f3da49132388bac8f2ca88b0aaf3da8028ce2"} Oct 14 13:30:16.262978 master-2 kubenswrapper[4762]: I1014 13:30:16.262859 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-2pxml" podStartSLOduration=5.913721297 podStartE2EDuration="13.262831062s" podCreationTimestamp="2025-10-14 13:30:03 +0000 UTC" firstStartedPulling="2025-10-14 13:30:04.086361951 +0000 UTC m=+1433.330521140" lastFinishedPulling="2025-10-14 13:30:11.435471746 +0000 UTC m=+1440.679630905" observedRunningTime="2025-10-14 13:30:16.257686309 +0000 UTC m=+1445.501845558" watchObservedRunningTime="2025-10-14 13:30:16.262831062 +0000 UTC m=+1445.506990261" Oct 14 13:30:18.952694 master-2 kubenswrapper[4762]: I1014 13:30:18.952589 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-2pxml" Oct 14 13:30:19.014877 master-2 kubenswrapper[4762]: I1014 13:30:19.014760 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-2pxml" Oct 14 13:30:20.009142 master-2 kubenswrapper[4762]: I1014 13:30:20.009028 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-564c479f-7bglk"] Oct 14 13:30:23.981770 master-2 kubenswrapper[4762]: I1014 13:30:23.981676 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-g87gn" Oct 14 13:30:25.600902 master-2 kubenswrapper[4762]: I1014 13:30:25.600808 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-kp26f" Oct 14 13:30:33.957225 master-2 kubenswrapper[4762]: I1014 13:30:33.955868 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-2pxml" Oct 14 13:30:36.572334 master-2 kubenswrapper[4762]: I1014 13:30:36.572255 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-storage/vg-manager-zvnk6"] Oct 14 13:30:36.573263 master-2 kubenswrapper[4762]: I1014 13:30:36.573222 4762 util.go:30] "No sandbox for pod can be found. 
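
The pod_startup_latency_tracker entries above tie their two figures together through image-pull time: for frr-k8s-2pxml, podStartE2EDuration minus the pull window (lastFinishedPulling minus firstStartedPulling, taken from the monotonic m= offsets) reproduces podStartSLOduration exactly. The reading that the SLO figure simply excludes pull time is an inference from these numbers, not something the log states; a quick check with the logged values:

    # Reproduce podStartSLOduration for frr-k8s-2pxml from the values logged above.
    # podStartE2EDuration itself matches watchObservedRunningTime - podCreationTimestamp
    # (13:30:16.262831062 - 13:30:03 = 13.262831062s).
    e2e = 13.262831062                       # podStartE2EDuration, seconds
    pull = 1440.679630905 - 1433.330521140   # lastFinishedPulling - firstStartedPulling (m= offsets)
    print(round(e2e - pull, 9))              # 5.913721297, matching podStartSLOduration
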
Need to start a new one" pod="openshift-storage/vg-manager-zvnk6" Oct 14 13:30:36.577002 master-2 kubenswrapper[4762]: I1014 13:30:36.576946 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-storage"/"openshift-service-ca.crt" Oct 14 13:30:36.577144 master-2 kubenswrapper[4762]: I1014 13:30:36.577088 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"vg-manager-metrics-cert" Oct 14 13:30:36.577346 master-2 kubenswrapper[4762]: I1014 13:30:36.577296 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-storage"/"kube-root-ca.crt" Oct 14 13:30:36.690325 master-2 kubenswrapper[4762]: I1014 13:30:36.690248 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/vg-manager-zvnk6"] Oct 14 13:30:36.712922 master-2 kubenswrapper[4762]: I1014 13:30:36.712828 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f5008117-98eb-4d13-aa1c-e832d3a09c0b-registration-dir\") pod \"vg-manager-zvnk6\" (UID: \"f5008117-98eb-4d13-aa1c-e832d3a09c0b\") " pod="openshift-storage/vg-manager-zvnk6" Oct 14 13:30:36.712922 master-2 kubenswrapper[4762]: I1014 13:30:36.712918 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f5008117-98eb-4d13-aa1c-e832d3a09c0b-device-dir\") pod \"vg-manager-zvnk6\" (UID: \"f5008117-98eb-4d13-aa1c-e832d3a09c0b\") " pod="openshift-storage/vg-manager-zvnk6" Oct 14 13:30:36.713248 master-2 kubenswrapper[4762]: I1014 13:30:36.713028 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/f5008117-98eb-4d13-aa1c-e832d3a09c0b-metrics-cert\") pod \"vg-manager-zvnk6\" (UID: \"f5008117-98eb-4d13-aa1c-e832d3a09c0b\") " pod="openshift-storage/vg-manager-zvnk6" Oct 14 13:30:36.713248 master-2 kubenswrapper[4762]: I1014 13:30:36.713074 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-udev\" (UniqueName: \"kubernetes.io/host-path/f5008117-98eb-4d13-aa1c-e832d3a09c0b-run-udev\") pod \"vg-manager-zvnk6\" (UID: \"f5008117-98eb-4d13-aa1c-e832d3a09c0b\") " pod="openshift-storage/vg-manager-zvnk6" Oct 14 13:30:36.713248 master-2 kubenswrapper[4762]: I1014 13:30:36.713101 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lvmd-config\" (UniqueName: \"kubernetes.io/host-path/f5008117-98eb-4d13-aa1c-e832d3a09c0b-lvmd-config\") pod \"vg-manager-zvnk6\" (UID: \"f5008117-98eb-4d13-aa1c-e832d3a09c0b\") " pod="openshift-storage/vg-manager-zvnk6" Oct 14 13:30:36.713248 master-2 kubenswrapper[4762]: I1014 13:30:36.713193 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls5pc\" (UniqueName: \"kubernetes.io/projected/f5008117-98eb-4d13-aa1c-e832d3a09c0b-kube-api-access-ls5pc\") pod \"vg-manager-zvnk6\" (UID: \"f5008117-98eb-4d13-aa1c-e832d3a09c0b\") " pod="openshift-storage/vg-manager-zvnk6" Oct 14 13:30:36.713248 master-2 kubenswrapper[4762]: I1014 13:30:36.713223 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-volumes-dir\" (UniqueName: \"kubernetes.io/host-path/f5008117-98eb-4d13-aa1c-e832d3a09c0b-pod-volumes-dir\") pod \"vg-manager-zvnk6\" (UID: 
\"f5008117-98eb-4d13-aa1c-e832d3a09c0b\") " pod="openshift-storage/vg-manager-zvnk6" Oct 14 13:30:36.713248 master-2 kubenswrapper[4762]: I1014 13:30:36.713250 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"file-lock-dir\" (UniqueName: \"kubernetes.io/host-path/f5008117-98eb-4d13-aa1c-e832d3a09c0b-file-lock-dir\") pod \"vg-manager-zvnk6\" (UID: \"f5008117-98eb-4d13-aa1c-e832d3a09c0b\") " pod="openshift-storage/vg-manager-zvnk6" Oct 14 13:30:36.713503 master-2 kubenswrapper[4762]: I1014 13:30:36.713281 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f5008117-98eb-4d13-aa1c-e832d3a09c0b-sys\") pod \"vg-manager-zvnk6\" (UID: \"f5008117-98eb-4d13-aa1c-e832d3a09c0b\") " pod="openshift-storage/vg-manager-zvnk6" Oct 14 13:30:36.713503 master-2 kubenswrapper[4762]: I1014 13:30:36.713308 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/f5008117-98eb-4d13-aa1c-e832d3a09c0b-csi-plugin-dir\") pod \"vg-manager-zvnk6\" (UID: \"f5008117-98eb-4d13-aa1c-e832d3a09c0b\") " pod="openshift-storage/vg-manager-zvnk6" Oct 14 13:30:36.713503 master-2 kubenswrapper[4762]: I1014 13:30:36.713334 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/f5008117-98eb-4d13-aa1c-e832d3a09c0b-node-plugin-dir\") pod \"vg-manager-zvnk6\" (UID: \"f5008117-98eb-4d13-aa1c-e832d3a09c0b\") " pod="openshift-storage/vg-manager-zvnk6" Oct 14 13:30:36.814749 master-2 kubenswrapper[4762]: I1014 13:30:36.814670 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"file-lock-dir\" (UniqueName: \"kubernetes.io/host-path/f5008117-98eb-4d13-aa1c-e832d3a09c0b-file-lock-dir\") pod \"vg-manager-zvnk6\" (UID: \"f5008117-98eb-4d13-aa1c-e832d3a09c0b\") " pod="openshift-storage/vg-manager-zvnk6" Oct 14 13:30:36.814749 master-2 kubenswrapper[4762]: I1014 13:30:36.814723 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f5008117-98eb-4d13-aa1c-e832d3a09c0b-sys\") pod \"vg-manager-zvnk6\" (UID: \"f5008117-98eb-4d13-aa1c-e832d3a09c0b\") " pod="openshift-storage/vg-manager-zvnk6" Oct 14 13:30:36.814749 master-2 kubenswrapper[4762]: I1014 13:30:36.814747 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/f5008117-98eb-4d13-aa1c-e832d3a09c0b-csi-plugin-dir\") pod \"vg-manager-zvnk6\" (UID: \"f5008117-98eb-4d13-aa1c-e832d3a09c0b\") " pod="openshift-storage/vg-manager-zvnk6" Oct 14 13:30:36.814749 master-2 kubenswrapper[4762]: I1014 13:30:36.814764 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/f5008117-98eb-4d13-aa1c-e832d3a09c0b-node-plugin-dir\") pod \"vg-manager-zvnk6\" (UID: \"f5008117-98eb-4d13-aa1c-e832d3a09c0b\") " pod="openshift-storage/vg-manager-zvnk6" Oct 14 13:30:36.816403 master-2 kubenswrapper[4762]: I1014 13:30:36.814789 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f5008117-98eb-4d13-aa1c-e832d3a09c0b-registration-dir\") pod \"vg-manager-zvnk6\" (UID: \"f5008117-98eb-4d13-aa1c-e832d3a09c0b\") " 
pod="openshift-storage/vg-manager-zvnk6" Oct 14 13:30:36.816403 master-2 kubenswrapper[4762]: I1014 13:30:36.814806 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f5008117-98eb-4d13-aa1c-e832d3a09c0b-device-dir\") pod \"vg-manager-zvnk6\" (UID: \"f5008117-98eb-4d13-aa1c-e832d3a09c0b\") " pod="openshift-storage/vg-manager-zvnk6" Oct 14 13:30:36.816403 master-2 kubenswrapper[4762]: I1014 13:30:36.814848 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/f5008117-98eb-4d13-aa1c-e832d3a09c0b-metrics-cert\") pod \"vg-manager-zvnk6\" (UID: \"f5008117-98eb-4d13-aa1c-e832d3a09c0b\") " pod="openshift-storage/vg-manager-zvnk6" Oct 14 13:30:36.816403 master-2 kubenswrapper[4762]: I1014 13:30:36.814869 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-udev\" (UniqueName: \"kubernetes.io/host-path/f5008117-98eb-4d13-aa1c-e832d3a09c0b-run-udev\") pod \"vg-manager-zvnk6\" (UID: \"f5008117-98eb-4d13-aa1c-e832d3a09c0b\") " pod="openshift-storage/vg-manager-zvnk6" Oct 14 13:30:36.816403 master-2 kubenswrapper[4762]: I1014 13:30:36.814885 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lvmd-config\" (UniqueName: \"kubernetes.io/host-path/f5008117-98eb-4d13-aa1c-e832d3a09c0b-lvmd-config\") pod \"vg-manager-zvnk6\" (UID: \"f5008117-98eb-4d13-aa1c-e832d3a09c0b\") " pod="openshift-storage/vg-manager-zvnk6" Oct 14 13:30:36.816403 master-2 kubenswrapper[4762]: I1014 13:30:36.814919 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ls5pc\" (UniqueName: \"kubernetes.io/projected/f5008117-98eb-4d13-aa1c-e832d3a09c0b-kube-api-access-ls5pc\") pod \"vg-manager-zvnk6\" (UID: \"f5008117-98eb-4d13-aa1c-e832d3a09c0b\") " pod="openshift-storage/vg-manager-zvnk6" Oct 14 13:30:36.816403 master-2 kubenswrapper[4762]: I1014 13:30:36.814937 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-volumes-dir\" (UniqueName: \"kubernetes.io/host-path/f5008117-98eb-4d13-aa1c-e832d3a09c0b-pod-volumes-dir\") pod \"vg-manager-zvnk6\" (UID: \"f5008117-98eb-4d13-aa1c-e832d3a09c0b\") " pod="openshift-storage/vg-manager-zvnk6" Oct 14 13:30:36.816403 master-2 kubenswrapper[4762]: I1014 13:30:36.814958 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f5008117-98eb-4d13-aa1c-e832d3a09c0b-sys\") pod \"vg-manager-zvnk6\" (UID: \"f5008117-98eb-4d13-aa1c-e832d3a09c0b\") " pod="openshift-storage/vg-manager-zvnk6" Oct 14 13:30:36.816403 master-2 kubenswrapper[4762]: I1014 13:30:36.815002 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f5008117-98eb-4d13-aa1c-e832d3a09c0b-device-dir\") pod \"vg-manager-zvnk6\" (UID: \"f5008117-98eb-4d13-aa1c-e832d3a09c0b\") " pod="openshift-storage/vg-manager-zvnk6" Oct 14 13:30:36.816403 master-2 kubenswrapper[4762]: I1014 13:30:36.815089 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-volumes-dir\" (UniqueName: \"kubernetes.io/host-path/f5008117-98eb-4d13-aa1c-e832d3a09c0b-pod-volumes-dir\") pod \"vg-manager-zvnk6\" (UID: \"f5008117-98eb-4d13-aa1c-e832d3a09c0b\") " pod="openshift-storage/vg-manager-zvnk6" Oct 14 13:30:36.816403 master-2 kubenswrapper[4762]: I1014 13:30:36.815117 4762 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"run-udev\" (UniqueName: \"kubernetes.io/host-path/f5008117-98eb-4d13-aa1c-e832d3a09c0b-run-udev\") pod \"vg-manager-zvnk6\" (UID: \"f5008117-98eb-4d13-aa1c-e832d3a09c0b\") " pod="openshift-storage/vg-manager-zvnk6" Oct 14 13:30:36.816403 master-2 kubenswrapper[4762]: I1014 13:30:36.815368 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lvmd-config\" (UniqueName: \"kubernetes.io/host-path/f5008117-98eb-4d13-aa1c-e832d3a09c0b-lvmd-config\") pod \"vg-manager-zvnk6\" (UID: \"f5008117-98eb-4d13-aa1c-e832d3a09c0b\") " pod="openshift-storage/vg-manager-zvnk6" Oct 14 13:30:36.816403 master-2 kubenswrapper[4762]: I1014 13:30:36.815399 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f5008117-98eb-4d13-aa1c-e832d3a09c0b-registration-dir\") pod \"vg-manager-zvnk6\" (UID: \"f5008117-98eb-4d13-aa1c-e832d3a09c0b\") " pod="openshift-storage/vg-manager-zvnk6" Oct 14 13:30:36.816403 master-2 kubenswrapper[4762]: I1014 13:30:36.815445 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"file-lock-dir\" (UniqueName: \"kubernetes.io/host-path/f5008117-98eb-4d13-aa1c-e832d3a09c0b-file-lock-dir\") pod \"vg-manager-zvnk6\" (UID: \"f5008117-98eb-4d13-aa1c-e832d3a09c0b\") " pod="openshift-storage/vg-manager-zvnk6" Oct 14 13:30:36.816403 master-2 kubenswrapper[4762]: I1014 13:30:36.815539 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/f5008117-98eb-4d13-aa1c-e832d3a09c0b-node-plugin-dir\") pod \"vg-manager-zvnk6\" (UID: \"f5008117-98eb-4d13-aa1c-e832d3a09c0b\") " pod="openshift-storage/vg-manager-zvnk6" Oct 14 13:30:36.817628 master-2 kubenswrapper[4762]: I1014 13:30:36.817600 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/f5008117-98eb-4d13-aa1c-e832d3a09c0b-csi-plugin-dir\") pod \"vg-manager-zvnk6\" (UID: \"f5008117-98eb-4d13-aa1c-e832d3a09c0b\") " pod="openshift-storage/vg-manager-zvnk6" Oct 14 13:30:36.819288 master-2 kubenswrapper[4762]: I1014 13:30:36.819225 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/f5008117-98eb-4d13-aa1c-e832d3a09c0b-metrics-cert\") pod \"vg-manager-zvnk6\" (UID: \"f5008117-98eb-4d13-aa1c-e832d3a09c0b\") " pod="openshift-storage/vg-manager-zvnk6" Oct 14 13:30:36.849912 master-2 kubenswrapper[4762]: I1014 13:30:36.849404 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ls5pc\" (UniqueName: \"kubernetes.io/projected/f5008117-98eb-4d13-aa1c-e832d3a09c0b-kube-api-access-ls5pc\") pod \"vg-manager-zvnk6\" (UID: \"f5008117-98eb-4d13-aa1c-e832d3a09c0b\") " pod="openshift-storage/vg-manager-zvnk6" Oct 14 13:30:36.898854 master-2 kubenswrapper[4762]: I1014 13:30:36.898779 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-storage/vg-manager-zvnk6" Oct 14 13:30:37.392966 master-2 kubenswrapper[4762]: I1014 13:30:37.392841 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/vg-manager-zvnk6"] Oct 14 13:30:37.394735 master-2 kubenswrapper[4762]: W1014 13:30:37.394642 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5008117_98eb_4d13_aa1c_e832d3a09c0b.slice/crio-ab0e7d933f3049eb07a72b4e17ccdc415f9f7114ca11a7314ff4145a52777476 WatchSource:0}: Error finding container ab0e7d933f3049eb07a72b4e17ccdc415f9f7114ca11a7314ff4145a52777476: Status 404 returned error can't find the container with id ab0e7d933f3049eb07a72b4e17ccdc415f9f7114ca11a7314ff4145a52777476 Oct 14 13:30:38.405817 master-2 kubenswrapper[4762]: I1014 13:30:38.405720 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-zvnk6" event={"ID":"f5008117-98eb-4d13-aa1c-e832d3a09c0b","Type":"ContainerStarted","Data":"ab0e7d933f3049eb07a72b4e17ccdc415f9f7114ca11a7314ff4145a52777476"} Oct 14 13:30:43.453445 master-2 kubenswrapper[4762]: I1014 13:30:43.453354 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-zvnk6" event={"ID":"f5008117-98eb-4d13-aa1c-e832d3a09c0b","Type":"ContainerStarted","Data":"65076ab4be265a7d2c4b13fe4345806ff687e8d5d886478e3520b514bc89cfa7"} Oct 14 13:30:43.487756 master-2 kubenswrapper[4762]: I1014 13:30:43.487611 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-storage/vg-manager-zvnk6" podStartSLOduration=1.8714138070000002 podStartE2EDuration="7.487147743s" podCreationTimestamp="2025-10-14 13:30:36 +0000 UTC" firstStartedPulling="2025-10-14 13:30:37.397784806 +0000 UTC m=+1466.641943995" lastFinishedPulling="2025-10-14 13:30:43.013518762 +0000 UTC m=+1472.257677931" observedRunningTime="2025-10-14 13:30:43.482891188 +0000 UTC m=+1472.727050387" watchObservedRunningTime="2025-10-14 13:30:43.487147743 +0000 UTC m=+1472.731306932" Oct 14 13:30:45.046763 master-2 kubenswrapper[4762]: I1014 13:30:45.046601 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-564c479f-7bglk" podUID="9fd6eab3-bc4f-437c-ab20-8db15e2ec157" containerName="console" containerID="cri-o://398c6089491af1099fcbbae03fb7e441a6ad3c7a03213abbb48b07dd3ca883b9" gracePeriod=15 Oct 14 13:30:45.458561 master-2 kubenswrapper[4762]: I1014 13:30:45.458283 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-564c479f-7bglk_9fd6eab3-bc4f-437c-ab20-8db15e2ec157/console/0.log" Oct 14 13:30:45.458561 master-2 kubenswrapper[4762]: I1014 13:30:45.458367 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-564c479f-7bglk" Oct 14 13:30:45.478956 master-2 kubenswrapper[4762]: I1014 13:30:45.478904 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-storage_vg-manager-zvnk6_f5008117-98eb-4d13-aa1c-e832d3a09c0b/vg-manager/0.log" Oct 14 13:30:45.479226 master-2 kubenswrapper[4762]: I1014 13:30:45.478990 4762 generic.go:334] "Generic (PLEG): container finished" podID="f5008117-98eb-4d13-aa1c-e832d3a09c0b" containerID="65076ab4be265a7d2c4b13fe4345806ff687e8d5d886478e3520b514bc89cfa7" exitCode=1 Oct 14 13:30:45.479226 master-2 kubenswrapper[4762]: I1014 13:30:45.479058 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-zvnk6" event={"ID":"f5008117-98eb-4d13-aa1c-e832d3a09c0b","Type":"ContainerDied","Data":"65076ab4be265a7d2c4b13fe4345806ff687e8d5d886478e3520b514bc89cfa7"} Oct 14 13:30:45.479926 master-2 kubenswrapper[4762]: I1014 13:30:45.479864 4762 scope.go:117] "RemoveContainer" containerID="65076ab4be265a7d2c4b13fe4345806ff687e8d5d886478e3520b514bc89cfa7" Oct 14 13:30:45.482325 master-2 kubenswrapper[4762]: I1014 13:30:45.482294 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-564c479f-7bglk_9fd6eab3-bc4f-437c-ab20-8db15e2ec157/console/0.log" Oct 14 13:30:45.482618 master-2 kubenswrapper[4762]: I1014 13:30:45.482331 4762 generic.go:334] "Generic (PLEG): container finished" podID="9fd6eab3-bc4f-437c-ab20-8db15e2ec157" containerID="398c6089491af1099fcbbae03fb7e441a6ad3c7a03213abbb48b07dd3ca883b9" exitCode=2 Oct 14 13:30:45.482618 master-2 kubenswrapper[4762]: I1014 13:30:45.482353 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-564c479f-7bglk" event={"ID":"9fd6eab3-bc4f-437c-ab20-8db15e2ec157","Type":"ContainerDied","Data":"398c6089491af1099fcbbae03fb7e441a6ad3c7a03213abbb48b07dd3ca883b9"} Oct 14 13:30:45.482618 master-2 kubenswrapper[4762]: I1014 13:30:45.482380 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-564c479f-7bglk" event={"ID":"9fd6eab3-bc4f-437c-ab20-8db15e2ec157","Type":"ContainerDied","Data":"641c438d98ddbe736332955ba68acf2c69e0edf0cb6be1aaf160c54a6e90e65e"} Oct 14 13:30:45.482618 master-2 kubenswrapper[4762]: I1014 13:30:45.482396 4762 scope.go:117] "RemoveContainer" containerID="398c6089491af1099fcbbae03fb7e441a6ad3c7a03213abbb48b07dd3ca883b9" Oct 14 13:30:45.482618 master-2 kubenswrapper[4762]: I1014 13:30:45.482421 4762 util.go:48] "No ready sandbox for pod can be found. 
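
Two different container exits are interleaved above: the console container finishes with exitCode=2 while its pod is being deleted, whereas the vg-manager container finishes with exitCode=1 and the kubelet removes it and starts a replacement about a second later (ContainerStarted at 13:30:46). A hedged sketch of reading back the surviving pod's restart count and last termination state with the Kubernetes Python client; the kubeconfig is an assumption, the pod name and namespace come from the log.

    # Sketch: why did vg-manager's container last exit, and how often has it restarted?
    from kubernetes import client, config

    config.load_kube_config()          # assumes a reachable kubeconfig
    v1 = client.CoreV1Api()
    pod = v1.read_namespaced_pod("vg-manager-zvnk6", "openshift-storage")
    for cs in pod.status.container_statuses or []:
        term = cs.last_state.terminated
        if term:
            print(cs.name, "restarts:", cs.restart_count, "last exit code:", term.exit_code)
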
Need to start a new one" pod="openshift-console/console-564c479f-7bglk" Oct 14 13:30:45.524369 master-2 kubenswrapper[4762]: I1014 13:30:45.524282 4762 scope.go:117] "RemoveContainer" containerID="398c6089491af1099fcbbae03fb7e441a6ad3c7a03213abbb48b07dd3ca883b9" Oct 14 13:30:45.525092 master-2 kubenswrapper[4762]: E1014 13:30:45.525047 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"398c6089491af1099fcbbae03fb7e441a6ad3c7a03213abbb48b07dd3ca883b9\": container with ID starting with 398c6089491af1099fcbbae03fb7e441a6ad3c7a03213abbb48b07dd3ca883b9 not found: ID does not exist" containerID="398c6089491af1099fcbbae03fb7e441a6ad3c7a03213abbb48b07dd3ca883b9" Oct 14 13:30:45.525246 master-2 kubenswrapper[4762]: I1014 13:30:45.525213 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"398c6089491af1099fcbbae03fb7e441a6ad3c7a03213abbb48b07dd3ca883b9"} err="failed to get container status \"398c6089491af1099fcbbae03fb7e441a6ad3c7a03213abbb48b07dd3ca883b9\": rpc error: code = NotFound desc = could not find container \"398c6089491af1099fcbbae03fb7e441a6ad3c7a03213abbb48b07dd3ca883b9\": container with ID starting with 398c6089491af1099fcbbae03fb7e441a6ad3c7a03213abbb48b07dd3ca883b9 not found: ID does not exist" Oct 14 13:30:45.576232 master-2 kubenswrapper[4762]: I1014 13:30:45.576094 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9fd6eab3-bc4f-437c-ab20-8db15e2ec157-console-oauth-config\") pod \"9fd6eab3-bc4f-437c-ab20-8db15e2ec157\" (UID: \"9fd6eab3-bc4f-437c-ab20-8db15e2ec157\") " Oct 14 13:30:45.576232 master-2 kubenswrapper[4762]: I1014 13:30:45.576169 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9fd6eab3-bc4f-437c-ab20-8db15e2ec157-oauth-serving-cert\") pod \"9fd6eab3-bc4f-437c-ab20-8db15e2ec157\" (UID: \"9fd6eab3-bc4f-437c-ab20-8db15e2ec157\") " Oct 14 13:30:45.576464 master-2 kubenswrapper[4762]: I1014 13:30:45.576263 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9fd6eab3-bc4f-437c-ab20-8db15e2ec157-console-config\") pod \"9fd6eab3-bc4f-437c-ab20-8db15e2ec157\" (UID: \"9fd6eab3-bc4f-437c-ab20-8db15e2ec157\") " Oct 14 13:30:45.576464 master-2 kubenswrapper[4762]: I1014 13:30:45.576282 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcwq2\" (UniqueName: \"kubernetes.io/projected/9fd6eab3-bc4f-437c-ab20-8db15e2ec157-kube-api-access-dcwq2\") pod \"9fd6eab3-bc4f-437c-ab20-8db15e2ec157\" (UID: \"9fd6eab3-bc4f-437c-ab20-8db15e2ec157\") " Oct 14 13:30:45.576932 master-2 kubenswrapper[4762]: I1014 13:30:45.576895 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9fd6eab3-bc4f-437c-ab20-8db15e2ec157-trusted-ca-bundle\") pod \"9fd6eab3-bc4f-437c-ab20-8db15e2ec157\" (UID: \"9fd6eab3-bc4f-437c-ab20-8db15e2ec157\") " Oct 14 13:30:45.577001 master-2 kubenswrapper[4762]: I1014 13:30:45.576940 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9fd6eab3-bc4f-437c-ab20-8db15e2ec157-service-ca\") pod \"9fd6eab3-bc4f-437c-ab20-8db15e2ec157\" (UID: 
\"9fd6eab3-bc4f-437c-ab20-8db15e2ec157\") " Oct 14 13:30:45.577001 master-2 kubenswrapper[4762]: I1014 13:30:45.576963 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9fd6eab3-bc4f-437c-ab20-8db15e2ec157-console-serving-cert\") pod \"9fd6eab3-bc4f-437c-ab20-8db15e2ec157\" (UID: \"9fd6eab3-bc4f-437c-ab20-8db15e2ec157\") " Oct 14 13:30:45.577093 master-2 kubenswrapper[4762]: I1014 13:30:45.577031 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fd6eab3-bc4f-437c-ab20-8db15e2ec157-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "9fd6eab3-bc4f-437c-ab20-8db15e2ec157" (UID: "9fd6eab3-bc4f-437c-ab20-8db15e2ec157"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:30:45.577410 master-2 kubenswrapper[4762]: I1014 13:30:45.577367 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fd6eab3-bc4f-437c-ab20-8db15e2ec157-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "9fd6eab3-bc4f-437c-ab20-8db15e2ec157" (UID: "9fd6eab3-bc4f-437c-ab20-8db15e2ec157"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:30:45.577483 master-2 kubenswrapper[4762]: I1014 13:30:45.577409 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fd6eab3-bc4f-437c-ab20-8db15e2ec157-service-ca" (OuterVolumeSpecName: "service-ca") pod "9fd6eab3-bc4f-437c-ab20-8db15e2ec157" (UID: "9fd6eab3-bc4f-437c-ab20-8db15e2ec157"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:30:45.577559 master-2 kubenswrapper[4762]: I1014 13:30:45.577532 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fd6eab3-bc4f-437c-ab20-8db15e2ec157-console-config" (OuterVolumeSpecName: "console-config") pod "9fd6eab3-bc4f-437c-ab20-8db15e2ec157" (UID: "9fd6eab3-bc4f-437c-ab20-8db15e2ec157"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:30:45.577794 master-2 kubenswrapper[4762]: I1014 13:30:45.577771 4762 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9fd6eab3-bc4f-437c-ab20-8db15e2ec157-oauth-serving-cert\") on node \"master-2\" DevicePath \"\"" Oct 14 13:30:45.577898 master-2 kubenswrapper[4762]: I1014 13:30:45.577883 4762 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9fd6eab3-bc4f-437c-ab20-8db15e2ec157-console-config\") on node \"master-2\" DevicePath \"\"" Oct 14 13:30:45.577984 master-2 kubenswrapper[4762]: I1014 13:30:45.577970 4762 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9fd6eab3-bc4f-437c-ab20-8db15e2ec157-trusted-ca-bundle\") on node \"master-2\" DevicePath \"\"" Oct 14 13:30:45.578078 master-2 kubenswrapper[4762]: I1014 13:30:45.578063 4762 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9fd6eab3-bc4f-437c-ab20-8db15e2ec157-service-ca\") on node \"master-2\" DevicePath \"\"" Oct 14 13:30:45.579346 master-2 kubenswrapper[4762]: I1014 13:30:45.579210 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fd6eab3-bc4f-437c-ab20-8db15e2ec157-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "9fd6eab3-bc4f-437c-ab20-8db15e2ec157" (UID: "9fd6eab3-bc4f-437c-ab20-8db15e2ec157"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:30:45.579683 master-2 kubenswrapper[4762]: I1014 13:30:45.579628 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fd6eab3-bc4f-437c-ab20-8db15e2ec157-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "9fd6eab3-bc4f-437c-ab20-8db15e2ec157" (UID: "9fd6eab3-bc4f-437c-ab20-8db15e2ec157"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:30:45.579855 master-2 kubenswrapper[4762]: I1014 13:30:45.579815 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fd6eab3-bc4f-437c-ab20-8db15e2ec157-kube-api-access-dcwq2" (OuterVolumeSpecName: "kube-api-access-dcwq2") pod "9fd6eab3-bc4f-437c-ab20-8db15e2ec157" (UID: "9fd6eab3-bc4f-437c-ab20-8db15e2ec157"). InnerVolumeSpecName "kube-api-access-dcwq2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:30:45.678984 master-2 kubenswrapper[4762]: I1014 13:30:45.678936 4762 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9fd6eab3-bc4f-437c-ab20-8db15e2ec157-console-serving-cert\") on node \"master-2\" DevicePath \"\"" Oct 14 13:30:45.678984 master-2 kubenswrapper[4762]: I1014 13:30:45.678972 4762 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9fd6eab3-bc4f-437c-ab20-8db15e2ec157-console-oauth-config\") on node \"master-2\" DevicePath \"\"" Oct 14 13:30:45.678984 master-2 kubenswrapper[4762]: I1014 13:30:45.678983 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcwq2\" (UniqueName: \"kubernetes.io/projected/9fd6eab3-bc4f-437c-ab20-8db15e2ec157-kube-api-access-dcwq2\") on node \"master-2\" DevicePath \"\"" Oct 14 13:30:45.802809 master-2 kubenswrapper[4762]: I1014 13:30:45.802750 4762 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/topolvm.io-reg.sock" Oct 14 13:30:45.848806 master-2 kubenswrapper[4762]: I1014 13:30:45.848674 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-564c479f-7bglk"] Oct 14 13:30:45.856924 master-2 kubenswrapper[4762]: I1014 13:30:45.856873 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-564c479f-7bglk"] Oct 14 13:30:46.479798 master-2 kubenswrapper[4762]: I1014 13:30:46.479671 4762 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/topolvm.io-reg.sock","Timestamp":"2025-10-14T13:30:45.802790959Z","Handler":null,"Name":""} Oct 14 13:30:46.484034 master-2 kubenswrapper[4762]: I1014 13:30:46.484016 4762 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: topolvm.io endpoint: /var/lib/kubelet/plugins/topolvm.io/node/csi-topolvm.sock versions: 1.0.0 Oct 14 13:30:46.484239 master-2 kubenswrapper[4762]: I1014 13:30:46.484224 4762 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: topolvm.io at endpoint: /var/lib/kubelet/plugins/topolvm.io/node/csi-topolvm.sock Oct 14 13:30:46.494693 master-2 kubenswrapper[4762]: I1014 13:30:46.494137 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-storage_vg-manager-zvnk6_f5008117-98eb-4d13-aa1c-e832d3a09c0b/vg-manager/0.log" Oct 14 13:30:46.494693 master-2 kubenswrapper[4762]: I1014 13:30:46.494264 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-zvnk6" event={"ID":"f5008117-98eb-4d13-aa1c-e832d3a09c0b","Type":"ContainerStarted","Data":"97ee2968a141d0b2d7676cc0accdd15e99ad9eaa37e50aef20f9b548a7c2466b"} Oct 14 13:30:46.899055 master-2 kubenswrapper[4762]: I1014 13:30:46.898899 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-storage/vg-manager-zvnk6" Oct 14 13:30:47.556343 master-2 kubenswrapper[4762]: I1014 13:30:47.556290 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fd6eab3-bc4f-437c-ab20-8db15e2ec157" path="/var/lib/kubelet/pods/9fd6eab3-bc4f-437c-ab20-8db15e2ec157/volumes" Oct 14 13:30:56.905384 master-2 kubenswrapper[4762]: I1014 13:30:56.905289 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-storage/vg-manager-zvnk6" Oct 14 13:30:56.906258 master-2 
kubenswrapper[4762]: I1014 13:30:56.906184 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-storage/vg-manager-zvnk6" Oct 14 13:30:56.907487 master-2 kubenswrapper[4762]: I1014 13:30:56.907424 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-storage/vg-manager-zvnk6" Oct 14 13:31:03.710704 master-2 kubenswrapper[4762]: I1014 13:31:03.710566 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-etcd/installer-5-master-2"] Oct 14 13:31:03.816223 master-2 kubenswrapper[4762]: I1014 13:31:03.816104 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-etcd/installer-5-master-2"] Oct 14 13:31:05.560869 master-2 kubenswrapper[4762]: I1014 13:31:05.560780 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08b313e4-ea57-4f9c-ad72-1f640ef21c52" path="/var/lib/kubelet/pods/08b313e4-ea57-4f9c-ad72-1f640ef21c52/volumes" Oct 14 13:31:12.953858 master-2 kubenswrapper[4762]: I1014 13:31:12.953780 4762 scope.go:117] "RemoveContainer" containerID="d7d08b38f7e0af5214dbb27e0ea5e13ef41ff6f6d36bc1e8272e35547c0ac516" Oct 14 13:31:28.356499 master-2 kubenswrapper[4762]: I1014 13:31:28.356416 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036ajskqr"] Oct 14 13:31:28.360454 master-2 kubenswrapper[4762]: E1014 13:31:28.356753 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fd6eab3-bc4f-437c-ab20-8db15e2ec157" containerName="console" Oct 14 13:31:28.360454 master-2 kubenswrapper[4762]: I1014 13:31:28.356775 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fd6eab3-bc4f-437c-ab20-8db15e2ec157" containerName="console" Oct 14 13:31:28.360454 master-2 kubenswrapper[4762]: I1014 13:31:28.356958 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fd6eab3-bc4f-437c-ab20-8db15e2ec157" containerName="console" Oct 14 13:31:28.360454 master-2 kubenswrapper[4762]: I1014 13:31:28.358382 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036ajskqr" Oct 14 13:31:28.362003 master-2 kubenswrapper[4762]: I1014 13:31:28.361953 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 14 13:31:28.362983 master-2 kubenswrapper[4762]: I1014 13:31:28.362941 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 14 13:31:28.379584 master-2 kubenswrapper[4762]: I1014 13:31:28.379501 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036ajskqr"] Oct 14 13:31:28.450705 master-2 kubenswrapper[4762]: I1014 13:31:28.450628 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/35a6c4c9-85b7-4742-a4bd-3b67b0b31608-bundle\") pod \"32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036ajskqr\" (UID: \"35a6c4c9-85b7-4742-a4bd-3b67b0b31608\") " pod="openstack-operators/32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036ajskqr" Oct 14 13:31:28.450705 master-2 kubenswrapper[4762]: I1014 13:31:28.450678 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/35a6c4c9-85b7-4742-a4bd-3b67b0b31608-util\") pod \"32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036ajskqr\" (UID: \"35a6c4c9-85b7-4742-a4bd-3b67b0b31608\") " pod="openstack-operators/32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036ajskqr" Oct 14 13:31:28.450705 master-2 kubenswrapper[4762]: I1014 13:31:28.450734 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2lzf\" (UniqueName: \"kubernetes.io/projected/35a6c4c9-85b7-4742-a4bd-3b67b0b31608-kube-api-access-q2lzf\") pod \"32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036ajskqr\" (UID: \"35a6c4c9-85b7-4742-a4bd-3b67b0b31608\") " pod="openstack-operators/32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036ajskqr" Oct 14 13:31:28.551927 master-2 kubenswrapper[4762]: I1014 13:31:28.551799 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2lzf\" (UniqueName: \"kubernetes.io/projected/35a6c4c9-85b7-4742-a4bd-3b67b0b31608-kube-api-access-q2lzf\") pod \"32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036ajskqr\" (UID: \"35a6c4c9-85b7-4742-a4bd-3b67b0b31608\") " pod="openstack-operators/32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036ajskqr" Oct 14 13:31:28.552424 master-2 kubenswrapper[4762]: I1014 13:31:28.551993 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/35a6c4c9-85b7-4742-a4bd-3b67b0b31608-bundle\") pod \"32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036ajskqr\" (UID: \"35a6c4c9-85b7-4742-a4bd-3b67b0b31608\") " pod="openstack-operators/32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036ajskqr" Oct 14 13:31:28.552424 master-2 kubenswrapper[4762]: I1014 13:31:28.552062 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/35a6c4c9-85b7-4742-a4bd-3b67b0b31608-util\") pod \"32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036ajskqr\" (UID: \"35a6c4c9-85b7-4742-a4bd-3b67b0b31608\") " 
pod="openstack-operators/32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036ajskqr" Oct 14 13:31:28.553414 master-2 kubenswrapper[4762]: I1014 13:31:28.552915 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/35a6c4c9-85b7-4742-a4bd-3b67b0b31608-util\") pod \"32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036ajskqr\" (UID: \"35a6c4c9-85b7-4742-a4bd-3b67b0b31608\") " pod="openstack-operators/32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036ajskqr" Oct 14 13:31:28.553414 master-2 kubenswrapper[4762]: I1014 13:31:28.553404 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/35a6c4c9-85b7-4742-a4bd-3b67b0b31608-bundle\") pod \"32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036ajskqr\" (UID: \"35a6c4c9-85b7-4742-a4bd-3b67b0b31608\") " pod="openstack-operators/32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036ajskqr" Oct 14 13:31:28.576120 master-2 kubenswrapper[4762]: I1014 13:31:28.576024 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2lzf\" (UniqueName: \"kubernetes.io/projected/35a6c4c9-85b7-4742-a4bd-3b67b0b31608-kube-api-access-q2lzf\") pod \"32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036ajskqr\" (UID: \"35a6c4c9-85b7-4742-a4bd-3b67b0b31608\") " pod="openstack-operators/32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036ajskqr" Oct 14 13:31:28.688585 master-2 kubenswrapper[4762]: I1014 13:31:28.688431 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036ajskqr" Oct 14 13:31:29.170446 master-2 kubenswrapper[4762]: I1014 13:31:29.170383 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036ajskqr"] Oct 14 13:31:29.849936 master-2 kubenswrapper[4762]: I1014 13:31:29.849723 4762 generic.go:334] "Generic (PLEG): container finished" podID="35a6c4c9-85b7-4742-a4bd-3b67b0b31608" containerID="42cd28a8ca68c52856b222df522337f456d120da1320a548d4944c2e40b61a87" exitCode=0 Oct 14 13:31:29.849936 master-2 kubenswrapper[4762]: I1014 13:31:29.849815 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036ajskqr" event={"ID":"35a6c4c9-85b7-4742-a4bd-3b67b0b31608","Type":"ContainerDied","Data":"42cd28a8ca68c52856b222df522337f456d120da1320a548d4944c2e40b61a87"} Oct 14 13:31:29.849936 master-2 kubenswrapper[4762]: I1014 13:31:29.849885 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036ajskqr" event={"ID":"35a6c4c9-85b7-4742-a4bd-3b67b0b31608","Type":"ContainerStarted","Data":"dfa82213581b2594b4635a0b35fa90ba9f4d4b8d721ae68d199ff481bfcdeec3"} Oct 14 13:31:31.868616 master-2 kubenswrapper[4762]: I1014 13:31:31.868542 4762 generic.go:334] "Generic (PLEG): container finished" podID="35a6c4c9-85b7-4742-a4bd-3b67b0b31608" containerID="7634a42105698089e97e77f4c1712a76347b7f91f0d1109c25e8af504993d8e6" exitCode=0 Oct 14 13:31:31.868616 master-2 kubenswrapper[4762]: I1014 13:31:31.868586 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036ajskqr" 
event={"ID":"35a6c4c9-85b7-4742-a4bd-3b67b0b31608","Type":"ContainerDied","Data":"7634a42105698089e97e77f4c1712a76347b7f91f0d1109c25e8af504993d8e6"} Oct 14 13:31:32.880485 master-2 kubenswrapper[4762]: I1014 13:31:32.880386 4762 generic.go:334] "Generic (PLEG): container finished" podID="35a6c4c9-85b7-4742-a4bd-3b67b0b31608" containerID="8b9b662ba1f17b9e7c0054addaa79cbd138290a94456612a9528ff1a5744e5ff" exitCode=0 Oct 14 13:31:32.880485 master-2 kubenswrapper[4762]: I1014 13:31:32.880485 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036ajskqr" event={"ID":"35a6c4c9-85b7-4742-a4bd-3b67b0b31608","Type":"ContainerDied","Data":"8b9b662ba1f17b9e7c0054addaa79cbd138290a94456612a9528ff1a5744e5ff"} Oct 14 13:31:34.234650 master-2 kubenswrapper[4762]: I1014 13:31:34.234530 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036ajskqr" Oct 14 13:31:34.245586 master-2 kubenswrapper[4762]: I1014 13:31:34.245512 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2lzf\" (UniqueName: \"kubernetes.io/projected/35a6c4c9-85b7-4742-a4bd-3b67b0b31608-kube-api-access-q2lzf\") pod \"35a6c4c9-85b7-4742-a4bd-3b67b0b31608\" (UID: \"35a6c4c9-85b7-4742-a4bd-3b67b0b31608\") " Oct 14 13:31:34.245828 master-2 kubenswrapper[4762]: I1014 13:31:34.245610 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/35a6c4c9-85b7-4742-a4bd-3b67b0b31608-util\") pod \"35a6c4c9-85b7-4742-a4bd-3b67b0b31608\" (UID: \"35a6c4c9-85b7-4742-a4bd-3b67b0b31608\") " Oct 14 13:31:34.245828 master-2 kubenswrapper[4762]: I1014 13:31:34.245659 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/35a6c4c9-85b7-4742-a4bd-3b67b0b31608-bundle\") pod \"35a6c4c9-85b7-4742-a4bd-3b67b0b31608\" (UID: \"35a6c4c9-85b7-4742-a4bd-3b67b0b31608\") " Oct 14 13:31:34.246658 master-2 kubenswrapper[4762]: I1014 13:31:34.246596 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35a6c4c9-85b7-4742-a4bd-3b67b0b31608-bundle" (OuterVolumeSpecName: "bundle") pod "35a6c4c9-85b7-4742-a4bd-3b67b0b31608" (UID: "35a6c4c9-85b7-4742-a4bd-3b67b0b31608"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:31:34.250120 master-2 kubenswrapper[4762]: I1014 13:31:34.250043 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35a6c4c9-85b7-4742-a4bd-3b67b0b31608-kube-api-access-q2lzf" (OuterVolumeSpecName: "kube-api-access-q2lzf") pod "35a6c4c9-85b7-4742-a4bd-3b67b0b31608" (UID: "35a6c4c9-85b7-4742-a4bd-3b67b0b31608"). InnerVolumeSpecName "kube-api-access-q2lzf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:31:34.285044 master-2 kubenswrapper[4762]: I1014 13:31:34.284771 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35a6c4c9-85b7-4742-a4bd-3b67b0b31608-util" (OuterVolumeSpecName: "util") pod "35a6c4c9-85b7-4742-a4bd-3b67b0b31608" (UID: "35a6c4c9-85b7-4742-a4bd-3b67b0b31608"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:31:34.347438 master-2 kubenswrapper[4762]: I1014 13:31:34.347335 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2lzf\" (UniqueName: \"kubernetes.io/projected/35a6c4c9-85b7-4742-a4bd-3b67b0b31608-kube-api-access-q2lzf\") on node \"master-2\" DevicePath \"\"" Oct 14 13:31:34.347438 master-2 kubenswrapper[4762]: I1014 13:31:34.347410 4762 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/35a6c4c9-85b7-4742-a4bd-3b67b0b31608-util\") on node \"master-2\" DevicePath \"\"" Oct 14 13:31:34.347438 master-2 kubenswrapper[4762]: I1014 13:31:34.347430 4762 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/35a6c4c9-85b7-4742-a4bd-3b67b0b31608-bundle\") on node \"master-2\" DevicePath \"\"" Oct 14 13:31:34.909531 master-2 kubenswrapper[4762]: I1014 13:31:34.909489 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036ajskqr" event={"ID":"35a6c4c9-85b7-4742-a4bd-3b67b0b31608","Type":"ContainerDied","Data":"dfa82213581b2594b4635a0b35fa90ba9f4d4b8d721ae68d199ff481bfcdeec3"} Oct 14 13:31:34.909943 master-2 kubenswrapper[4762]: I1014 13:31:34.909926 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dfa82213581b2594b4635a0b35fa90ba9f4d4b8d721ae68d199ff481bfcdeec3" Oct 14 13:31:34.910038 master-2 kubenswrapper[4762]: I1014 13:31:34.909596 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/32da80840a2017f27ed4ad61f02adc64a25aa18e8dad0409953372036ajskqr" Oct 14 13:32:30.147483 master-2 kubenswrapper[4762]: I1014 13:32:30.147407 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ck2g8"] Oct 14 13:32:30.148335 master-2 kubenswrapper[4762]: E1014 13:32:30.147738 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35a6c4c9-85b7-4742-a4bd-3b67b0b31608" containerName="pull" Oct 14 13:32:30.148335 master-2 kubenswrapper[4762]: I1014 13:32:30.147756 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="35a6c4c9-85b7-4742-a4bd-3b67b0b31608" containerName="pull" Oct 14 13:32:30.148335 master-2 kubenswrapper[4762]: E1014 13:32:30.147775 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35a6c4c9-85b7-4742-a4bd-3b67b0b31608" containerName="extract" Oct 14 13:32:30.148335 master-2 kubenswrapper[4762]: I1014 13:32:30.147784 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="35a6c4c9-85b7-4742-a4bd-3b67b0b31608" containerName="extract" Oct 14 13:32:30.148335 master-2 kubenswrapper[4762]: E1014 13:32:30.147797 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35a6c4c9-85b7-4742-a4bd-3b67b0b31608" containerName="util" Oct 14 13:32:30.148335 master-2 kubenswrapper[4762]: I1014 13:32:30.147806 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="35a6c4c9-85b7-4742-a4bd-3b67b0b31608" containerName="util" Oct 14 13:32:30.148335 master-2 kubenswrapper[4762]: I1014 13:32:30.147968 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="35a6c4c9-85b7-4742-a4bd-3b67b0b31608" containerName="extract" Oct 14 13:32:30.149062 master-2 kubenswrapper[4762]: I1014 13:32:30.149035 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ck2g8" Oct 14 13:32:30.185407 master-2 kubenswrapper[4762]: I1014 13:32:30.185344 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ck2g8"] Oct 14 13:32:30.199300 master-2 kubenswrapper[4762]: I1014 13:32:30.196276 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25e93cdc-4bc4-4a8d-8a84-958ab56db170-utilities\") pod \"redhat-marketplace-ck2g8\" (UID: \"25e93cdc-4bc4-4a8d-8a84-958ab56db170\") " pod="openshift-marketplace/redhat-marketplace-ck2g8" Oct 14 13:32:30.199300 master-2 kubenswrapper[4762]: I1014 13:32:30.196371 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twlcr\" (UniqueName: \"kubernetes.io/projected/25e93cdc-4bc4-4a8d-8a84-958ab56db170-kube-api-access-twlcr\") pod \"redhat-marketplace-ck2g8\" (UID: \"25e93cdc-4bc4-4a8d-8a84-958ab56db170\") " pod="openshift-marketplace/redhat-marketplace-ck2g8" Oct 14 13:32:30.199300 master-2 kubenswrapper[4762]: I1014 13:32:30.196466 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25e93cdc-4bc4-4a8d-8a84-958ab56db170-catalog-content\") pod \"redhat-marketplace-ck2g8\" (UID: \"25e93cdc-4bc4-4a8d-8a84-958ab56db170\") " pod="openshift-marketplace/redhat-marketplace-ck2g8" Oct 14 13:32:30.297319 master-2 kubenswrapper[4762]: I1014 13:32:30.297268 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25e93cdc-4bc4-4a8d-8a84-958ab56db170-catalog-content\") pod \"redhat-marketplace-ck2g8\" (UID: \"25e93cdc-4bc4-4a8d-8a84-958ab56db170\") " pod="openshift-marketplace/redhat-marketplace-ck2g8" Oct 14 13:32:30.297561 master-2 kubenswrapper[4762]: I1014 13:32:30.297366 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25e93cdc-4bc4-4a8d-8a84-958ab56db170-utilities\") pod \"redhat-marketplace-ck2g8\" (UID: \"25e93cdc-4bc4-4a8d-8a84-958ab56db170\") " pod="openshift-marketplace/redhat-marketplace-ck2g8" Oct 14 13:32:30.297561 master-2 kubenswrapper[4762]: I1014 13:32:30.297409 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twlcr\" (UniqueName: \"kubernetes.io/projected/25e93cdc-4bc4-4a8d-8a84-958ab56db170-kube-api-access-twlcr\") pod \"redhat-marketplace-ck2g8\" (UID: \"25e93cdc-4bc4-4a8d-8a84-958ab56db170\") " pod="openshift-marketplace/redhat-marketplace-ck2g8" Oct 14 13:32:30.298288 master-2 kubenswrapper[4762]: I1014 13:32:30.298254 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25e93cdc-4bc4-4a8d-8a84-958ab56db170-catalog-content\") pod \"redhat-marketplace-ck2g8\" (UID: \"25e93cdc-4bc4-4a8d-8a84-958ab56db170\") " pod="openshift-marketplace/redhat-marketplace-ck2g8" Oct 14 13:32:30.298574 master-2 kubenswrapper[4762]: I1014 13:32:30.298550 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25e93cdc-4bc4-4a8d-8a84-958ab56db170-utilities\") pod \"redhat-marketplace-ck2g8\" (UID: \"25e93cdc-4bc4-4a8d-8a84-958ab56db170\") " pod="openshift-marketplace/redhat-marketplace-ck2g8" Oct 14 
13:32:30.324989 master-2 kubenswrapper[4762]: I1014 13:32:30.324945 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twlcr\" (UniqueName: \"kubernetes.io/projected/25e93cdc-4bc4-4a8d-8a84-958ab56db170-kube-api-access-twlcr\") pod \"redhat-marketplace-ck2g8\" (UID: \"25e93cdc-4bc4-4a8d-8a84-958ab56db170\") " pod="openshift-marketplace/redhat-marketplace-ck2g8" Oct 14 13:32:30.466303 master-2 kubenswrapper[4762]: I1014 13:32:30.466249 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ck2g8" Oct 14 13:32:30.885819 master-2 kubenswrapper[4762]: I1014 13:32:30.885758 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ck2g8"] Oct 14 13:32:31.335763 master-2 kubenswrapper[4762]: I1014 13:32:31.335707 4762 generic.go:334] "Generic (PLEG): container finished" podID="25e93cdc-4bc4-4a8d-8a84-958ab56db170" containerID="f70a1edabc3b31732d90c9a1d0780022c76b7c3334f20158f7fb35386407f938" exitCode=0 Oct 14 13:32:31.335763 master-2 kubenswrapper[4762]: I1014 13:32:31.335762 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ck2g8" event={"ID":"25e93cdc-4bc4-4a8d-8a84-958ab56db170","Type":"ContainerDied","Data":"f70a1edabc3b31732d90c9a1d0780022c76b7c3334f20158f7fb35386407f938"} Oct 14 13:32:31.336452 master-2 kubenswrapper[4762]: I1014 13:32:31.335794 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ck2g8" event={"ID":"25e93cdc-4bc4-4a8d-8a84-958ab56db170","Type":"ContainerStarted","Data":"7c6f93e01199e08b2673c37a3cee978d7fab2e0bf9b568e91f87f60c5f3d092a"} Oct 14 13:32:33.354284 master-2 kubenswrapper[4762]: I1014 13:32:33.354210 4762 generic.go:334] "Generic (PLEG): container finished" podID="25e93cdc-4bc4-4a8d-8a84-958ab56db170" containerID="7d4ea971c8a6df4f1b2db7d2a4349147eac4dbd950b4373b29d2ea0809b7b8b2" exitCode=0 Oct 14 13:32:33.354284 master-2 kubenswrapper[4762]: I1014 13:32:33.354276 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ck2g8" event={"ID":"25e93cdc-4bc4-4a8d-8a84-958ab56db170","Type":"ContainerDied","Data":"7d4ea971c8a6df4f1b2db7d2a4349147eac4dbd950b4373b29d2ea0809b7b8b2"} Oct 14 13:32:34.366715 master-2 kubenswrapper[4762]: I1014 13:32:34.366630 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ck2g8" event={"ID":"25e93cdc-4bc4-4a8d-8a84-958ab56db170","Type":"ContainerStarted","Data":"c21b8fa21bd779470ce86fd58cfe97aec88422e238858493d5ffbba0ac555f91"} Oct 14 13:32:40.466800 master-2 kubenswrapper[4762]: I1014 13:32:40.466693 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ck2g8" Oct 14 13:32:40.466800 master-2 kubenswrapper[4762]: I1014 13:32:40.466787 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ck2g8" Oct 14 13:32:40.511135 master-2 kubenswrapper[4762]: I1014 13:32:40.511046 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ck2g8" Oct 14 13:32:40.544385 master-2 kubenswrapper[4762]: I1014 13:32:40.544246 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ck2g8" podStartSLOduration=8.987682668 podStartE2EDuration="11.544211289s" 
podCreationTimestamp="2025-10-14 13:32:29 +0000 UTC" firstStartedPulling="2025-10-14 13:32:31.338235099 +0000 UTC m=+1580.582394268" lastFinishedPulling="2025-10-14 13:32:33.89476372 +0000 UTC m=+1583.138922889" observedRunningTime="2025-10-14 13:32:34.397228757 +0000 UTC m=+1583.641387946" watchObservedRunningTime="2025-10-14 13:32:40.544211289 +0000 UTC m=+1589.788370498" Oct 14 13:32:41.478981 master-2 kubenswrapper[4762]: I1014 13:32:41.478838 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ck2g8" Oct 14 13:32:41.562131 master-2 kubenswrapper[4762]: I1014 13:32:41.562087 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ck2g8"] Oct 14 13:32:43.435057 master-2 kubenswrapper[4762]: I1014 13:32:43.434952 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ck2g8" podUID="25e93cdc-4bc4-4a8d-8a84-958ab56db170" containerName="registry-server" containerID="cri-o://c21b8fa21bd779470ce86fd58cfe97aec88422e238858493d5ffbba0ac555f91" gracePeriod=2 Oct 14 13:32:43.914777 master-2 kubenswrapper[4762]: I1014 13:32:43.914741 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ck2g8" Oct 14 13:32:44.097396 master-2 kubenswrapper[4762]: I1014 13:32:44.097311 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25e93cdc-4bc4-4a8d-8a84-958ab56db170-catalog-content\") pod \"25e93cdc-4bc4-4a8d-8a84-958ab56db170\" (UID: \"25e93cdc-4bc4-4a8d-8a84-958ab56db170\") " Oct 14 13:32:44.097644 master-2 kubenswrapper[4762]: I1014 13:32:44.097447 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twlcr\" (UniqueName: \"kubernetes.io/projected/25e93cdc-4bc4-4a8d-8a84-958ab56db170-kube-api-access-twlcr\") pod \"25e93cdc-4bc4-4a8d-8a84-958ab56db170\" (UID: \"25e93cdc-4bc4-4a8d-8a84-958ab56db170\") " Oct 14 13:32:44.097751 master-2 kubenswrapper[4762]: I1014 13:32:44.097711 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25e93cdc-4bc4-4a8d-8a84-958ab56db170-utilities\") pod \"25e93cdc-4bc4-4a8d-8a84-958ab56db170\" (UID: \"25e93cdc-4bc4-4a8d-8a84-958ab56db170\") " Oct 14 13:32:44.099629 master-2 kubenswrapper[4762]: I1014 13:32:44.099546 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25e93cdc-4bc4-4a8d-8a84-958ab56db170-utilities" (OuterVolumeSpecName: "utilities") pod "25e93cdc-4bc4-4a8d-8a84-958ab56db170" (UID: "25e93cdc-4bc4-4a8d-8a84-958ab56db170"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:32:44.101990 master-2 kubenswrapper[4762]: I1014 13:32:44.101918 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e93cdc-4bc4-4a8d-8a84-958ab56db170-kube-api-access-twlcr" (OuterVolumeSpecName: "kube-api-access-twlcr") pod "25e93cdc-4bc4-4a8d-8a84-958ab56db170" (UID: "25e93cdc-4bc4-4a8d-8a84-958ab56db170"). InnerVolumeSpecName "kube-api-access-twlcr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:32:44.115131 master-2 kubenswrapper[4762]: I1014 13:32:44.114982 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25e93cdc-4bc4-4a8d-8a84-958ab56db170-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "25e93cdc-4bc4-4a8d-8a84-958ab56db170" (UID: "25e93cdc-4bc4-4a8d-8a84-958ab56db170"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:32:44.200078 master-2 kubenswrapper[4762]: I1014 13:32:44.199993 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25e93cdc-4bc4-4a8d-8a84-958ab56db170-utilities\") on node \"master-2\" DevicePath \"\"" Oct 14 13:32:44.200078 master-2 kubenswrapper[4762]: I1014 13:32:44.200061 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25e93cdc-4bc4-4a8d-8a84-958ab56db170-catalog-content\") on node \"master-2\" DevicePath \"\"" Oct 14 13:32:44.200078 master-2 kubenswrapper[4762]: I1014 13:32:44.200084 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twlcr\" (UniqueName: \"kubernetes.io/projected/25e93cdc-4bc4-4a8d-8a84-958ab56db170-kube-api-access-twlcr\") on node \"master-2\" DevicePath \"\"" Oct 14 13:32:44.447003 master-2 kubenswrapper[4762]: I1014 13:32:44.446942 4762 generic.go:334] "Generic (PLEG): container finished" podID="25e93cdc-4bc4-4a8d-8a84-958ab56db170" containerID="c21b8fa21bd779470ce86fd58cfe97aec88422e238858493d5ffbba0ac555f91" exitCode=0 Oct 14 13:32:44.447913 master-2 kubenswrapper[4762]: I1014 13:32:44.447066 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ck2g8" Oct 14 13:32:44.448237 master-2 kubenswrapper[4762]: I1014 13:32:44.447044 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ck2g8" event={"ID":"25e93cdc-4bc4-4a8d-8a84-958ab56db170","Type":"ContainerDied","Data":"c21b8fa21bd779470ce86fd58cfe97aec88422e238858493d5ffbba0ac555f91"} Oct 14 13:32:44.448391 master-2 kubenswrapper[4762]: I1014 13:32:44.448281 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ck2g8" event={"ID":"25e93cdc-4bc4-4a8d-8a84-958ab56db170","Type":"ContainerDied","Data":"7c6f93e01199e08b2673c37a3cee978d7fab2e0bf9b568e91f87f60c5f3d092a"} Oct 14 13:32:44.448391 master-2 kubenswrapper[4762]: I1014 13:32:44.448338 4762 scope.go:117] "RemoveContainer" containerID="c21b8fa21bd779470ce86fd58cfe97aec88422e238858493d5ffbba0ac555f91" Oct 14 13:32:44.481878 master-2 kubenswrapper[4762]: I1014 13:32:44.481819 4762 scope.go:117] "RemoveContainer" containerID="7d4ea971c8a6df4f1b2db7d2a4349147eac4dbd950b4373b29d2ea0809b7b8b2" Oct 14 13:32:44.513674 master-2 kubenswrapper[4762]: I1014 13:32:44.513519 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ck2g8"] Oct 14 13:32:44.520281 master-2 kubenswrapper[4762]: I1014 13:32:44.520212 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ck2g8"] Oct 14 13:32:44.529833 master-2 kubenswrapper[4762]: I1014 13:32:44.529793 4762 scope.go:117] "RemoveContainer" containerID="f70a1edabc3b31732d90c9a1d0780022c76b7c3334f20158f7fb35386407f938" Oct 14 13:32:44.549333 master-2 kubenswrapper[4762]: I1014 13:32:44.549292 4762 scope.go:117] 
"RemoveContainer" containerID="c21b8fa21bd779470ce86fd58cfe97aec88422e238858493d5ffbba0ac555f91" Oct 14 13:32:44.549900 master-2 kubenswrapper[4762]: E1014 13:32:44.549866 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c21b8fa21bd779470ce86fd58cfe97aec88422e238858493d5ffbba0ac555f91\": container with ID starting with c21b8fa21bd779470ce86fd58cfe97aec88422e238858493d5ffbba0ac555f91 not found: ID does not exist" containerID="c21b8fa21bd779470ce86fd58cfe97aec88422e238858493d5ffbba0ac555f91" Oct 14 13:32:44.550410 master-2 kubenswrapper[4762]: I1014 13:32:44.549909 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c21b8fa21bd779470ce86fd58cfe97aec88422e238858493d5ffbba0ac555f91"} err="failed to get container status \"c21b8fa21bd779470ce86fd58cfe97aec88422e238858493d5ffbba0ac555f91\": rpc error: code = NotFound desc = could not find container \"c21b8fa21bd779470ce86fd58cfe97aec88422e238858493d5ffbba0ac555f91\": container with ID starting with c21b8fa21bd779470ce86fd58cfe97aec88422e238858493d5ffbba0ac555f91 not found: ID does not exist" Oct 14 13:32:44.550410 master-2 kubenswrapper[4762]: I1014 13:32:44.549957 4762 scope.go:117] "RemoveContainer" containerID="7d4ea971c8a6df4f1b2db7d2a4349147eac4dbd950b4373b29d2ea0809b7b8b2" Oct 14 13:32:44.550906 master-2 kubenswrapper[4762]: E1014 13:32:44.550476 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d4ea971c8a6df4f1b2db7d2a4349147eac4dbd950b4373b29d2ea0809b7b8b2\": container with ID starting with 7d4ea971c8a6df4f1b2db7d2a4349147eac4dbd950b4373b29d2ea0809b7b8b2 not found: ID does not exist" containerID="7d4ea971c8a6df4f1b2db7d2a4349147eac4dbd950b4373b29d2ea0809b7b8b2" Oct 14 13:32:44.550906 master-2 kubenswrapper[4762]: I1014 13:32:44.550501 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d4ea971c8a6df4f1b2db7d2a4349147eac4dbd950b4373b29d2ea0809b7b8b2"} err="failed to get container status \"7d4ea971c8a6df4f1b2db7d2a4349147eac4dbd950b4373b29d2ea0809b7b8b2\": rpc error: code = NotFound desc = could not find container \"7d4ea971c8a6df4f1b2db7d2a4349147eac4dbd950b4373b29d2ea0809b7b8b2\": container with ID starting with 7d4ea971c8a6df4f1b2db7d2a4349147eac4dbd950b4373b29d2ea0809b7b8b2 not found: ID does not exist" Oct 14 13:32:44.550906 master-2 kubenswrapper[4762]: I1014 13:32:44.550517 4762 scope.go:117] "RemoveContainer" containerID="f70a1edabc3b31732d90c9a1d0780022c76b7c3334f20158f7fb35386407f938" Oct 14 13:32:44.551261 master-2 kubenswrapper[4762]: E1014 13:32:44.550933 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f70a1edabc3b31732d90c9a1d0780022c76b7c3334f20158f7fb35386407f938\": container with ID starting with f70a1edabc3b31732d90c9a1d0780022c76b7c3334f20158f7fb35386407f938 not found: ID does not exist" containerID="f70a1edabc3b31732d90c9a1d0780022c76b7c3334f20158f7fb35386407f938" Oct 14 13:32:44.551261 master-2 kubenswrapper[4762]: I1014 13:32:44.550960 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f70a1edabc3b31732d90c9a1d0780022c76b7c3334f20158f7fb35386407f938"} err="failed to get container status \"f70a1edabc3b31732d90c9a1d0780022c76b7c3334f20158f7fb35386407f938\": rpc error: code = NotFound desc = could not find container 
\"f70a1edabc3b31732d90c9a1d0780022c76b7c3334f20158f7fb35386407f938\": container with ID starting with f70a1edabc3b31732d90c9a1d0780022c76b7c3334f20158f7fb35386407f938 not found: ID does not exist" Oct 14 13:32:45.559228 master-2 kubenswrapper[4762]: I1014 13:32:45.558271 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e93cdc-4bc4-4a8d-8a84-958ab56db170" path="/var/lib/kubelet/pods/25e93cdc-4bc4-4a8d-8a84-958ab56db170/volumes" Oct 14 13:32:51.263963 master-2 kubenswrapper[4762]: I1014 13:32:51.263899 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5zr4r"] Oct 14 13:32:51.265324 master-2 kubenswrapper[4762]: E1014 13:32:51.264279 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25e93cdc-4bc4-4a8d-8a84-958ab56db170" containerName="extract-content" Oct 14 13:32:51.265324 master-2 kubenswrapper[4762]: I1014 13:32:51.264302 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="25e93cdc-4bc4-4a8d-8a84-958ab56db170" containerName="extract-content" Oct 14 13:32:51.265324 master-2 kubenswrapper[4762]: E1014 13:32:51.264325 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25e93cdc-4bc4-4a8d-8a84-958ab56db170" containerName="registry-server" Oct 14 13:32:51.265324 master-2 kubenswrapper[4762]: I1014 13:32:51.264337 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="25e93cdc-4bc4-4a8d-8a84-958ab56db170" containerName="registry-server" Oct 14 13:32:51.265324 master-2 kubenswrapper[4762]: E1014 13:32:51.264363 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25e93cdc-4bc4-4a8d-8a84-958ab56db170" containerName="extract-utilities" Oct 14 13:32:51.265324 master-2 kubenswrapper[4762]: I1014 13:32:51.264376 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="25e93cdc-4bc4-4a8d-8a84-958ab56db170" containerName="extract-utilities" Oct 14 13:32:51.265324 master-2 kubenswrapper[4762]: I1014 13:32:51.264566 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="25e93cdc-4bc4-4a8d-8a84-958ab56db170" containerName="registry-server" Oct 14 13:32:51.266143 master-2 kubenswrapper[4762]: I1014 13:32:51.266085 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5zr4r" Oct 14 13:32:51.301026 master-2 kubenswrapper[4762]: I1014 13:32:51.300957 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5zr4r"] Oct 14 13:32:51.404948 master-2 kubenswrapper[4762]: I1014 13:32:51.404886 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25dcb722-c38a-4865-b817-c7f4b4711198-utilities\") pod \"community-operators-5zr4r\" (UID: \"25dcb722-c38a-4865-b817-c7f4b4711198\") " pod="openshift-marketplace/community-operators-5zr4r" Oct 14 13:32:51.405262 master-2 kubenswrapper[4762]: I1014 13:32:51.405005 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dklwq\" (UniqueName: \"kubernetes.io/projected/25dcb722-c38a-4865-b817-c7f4b4711198-kube-api-access-dklwq\") pod \"community-operators-5zr4r\" (UID: \"25dcb722-c38a-4865-b817-c7f4b4711198\") " pod="openshift-marketplace/community-operators-5zr4r" Oct 14 13:32:51.405262 master-2 kubenswrapper[4762]: I1014 13:32:51.405042 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25dcb722-c38a-4865-b817-c7f4b4711198-catalog-content\") pod \"community-operators-5zr4r\" (UID: \"25dcb722-c38a-4865-b817-c7f4b4711198\") " pod="openshift-marketplace/community-operators-5zr4r" Oct 14 13:32:51.506823 master-2 kubenswrapper[4762]: I1014 13:32:51.506761 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25dcb722-c38a-4865-b817-c7f4b4711198-utilities\") pod \"community-operators-5zr4r\" (UID: \"25dcb722-c38a-4865-b817-c7f4b4711198\") " pod="openshift-marketplace/community-operators-5zr4r" Oct 14 13:32:51.507061 master-2 kubenswrapper[4762]: I1014 13:32:51.506999 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dklwq\" (UniqueName: \"kubernetes.io/projected/25dcb722-c38a-4865-b817-c7f4b4711198-kube-api-access-dklwq\") pod \"community-operators-5zr4r\" (UID: \"25dcb722-c38a-4865-b817-c7f4b4711198\") " pod="openshift-marketplace/community-operators-5zr4r" Oct 14 13:32:51.507061 master-2 kubenswrapper[4762]: I1014 13:32:51.507038 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25dcb722-c38a-4865-b817-c7f4b4711198-catalog-content\") pod \"community-operators-5zr4r\" (UID: \"25dcb722-c38a-4865-b817-c7f4b4711198\") " pod="openshift-marketplace/community-operators-5zr4r" Oct 14 13:32:51.507498 master-2 kubenswrapper[4762]: I1014 13:32:51.507386 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25dcb722-c38a-4865-b817-c7f4b4711198-utilities\") pod \"community-operators-5zr4r\" (UID: \"25dcb722-c38a-4865-b817-c7f4b4711198\") " pod="openshift-marketplace/community-operators-5zr4r" Oct 14 13:32:51.507571 master-2 kubenswrapper[4762]: I1014 13:32:51.507513 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25dcb722-c38a-4865-b817-c7f4b4711198-catalog-content\") pod \"community-operators-5zr4r\" (UID: \"25dcb722-c38a-4865-b817-c7f4b4711198\") " 
pod="openshift-marketplace/community-operators-5zr4r" Oct 14 13:32:51.547417 master-2 kubenswrapper[4762]: I1014 13:32:51.547284 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dklwq\" (UniqueName: \"kubernetes.io/projected/25dcb722-c38a-4865-b817-c7f4b4711198-kube-api-access-dklwq\") pod \"community-operators-5zr4r\" (UID: \"25dcb722-c38a-4865-b817-c7f4b4711198\") " pod="openshift-marketplace/community-operators-5zr4r" Oct 14 13:32:51.583231 master-2 kubenswrapper[4762]: I1014 13:32:51.583075 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5zr4r" Oct 14 13:32:52.035813 master-2 kubenswrapper[4762]: W1014 13:32:52.035760 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25dcb722_c38a_4865_b817_c7f4b4711198.slice/crio-41a48d1e334a4774564cf394ed62e260e02188e7984f8a9044efe670fe3e27d0 WatchSource:0}: Error finding container 41a48d1e334a4774564cf394ed62e260e02188e7984f8a9044efe670fe3e27d0: Status 404 returned error can't find the container with id 41a48d1e334a4774564cf394ed62e260e02188e7984f8a9044efe670fe3e27d0 Oct 14 13:32:52.088888 master-2 kubenswrapper[4762]: I1014 13:32:52.088817 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5zr4r"] Oct 14 13:32:52.514759 master-2 kubenswrapper[4762]: I1014 13:32:52.514684 4762 generic.go:334] "Generic (PLEG): container finished" podID="25dcb722-c38a-4865-b817-c7f4b4711198" containerID="7390cb709ef9964bad2d97e82a5da8874ae248f1736449abf67b18f982bba754" exitCode=0 Oct 14 13:32:52.514759 master-2 kubenswrapper[4762]: I1014 13:32:52.514743 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5zr4r" event={"ID":"25dcb722-c38a-4865-b817-c7f4b4711198","Type":"ContainerDied","Data":"7390cb709ef9964bad2d97e82a5da8874ae248f1736449abf67b18f982bba754"} Oct 14 13:32:52.515445 master-2 kubenswrapper[4762]: I1014 13:32:52.514778 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5zr4r" event={"ID":"25dcb722-c38a-4865-b817-c7f4b4711198","Type":"ContainerStarted","Data":"41a48d1e334a4774564cf394ed62e260e02188e7984f8a9044efe670fe3e27d0"} Oct 14 13:32:54.538748 master-2 kubenswrapper[4762]: I1014 13:32:54.535332 4762 generic.go:334] "Generic (PLEG): container finished" podID="25dcb722-c38a-4865-b817-c7f4b4711198" containerID="cef60c01d1296839ec43004b55465ce26cd274669e83f09323613ab5162e5b13" exitCode=0 Oct 14 13:32:54.538748 master-2 kubenswrapper[4762]: I1014 13:32:54.535408 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5zr4r" event={"ID":"25dcb722-c38a-4865-b817-c7f4b4711198","Type":"ContainerDied","Data":"cef60c01d1296839ec43004b55465ce26cd274669e83f09323613ab5162e5b13"} Oct 14 13:32:55.545642 master-2 kubenswrapper[4762]: I1014 13:32:55.545598 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5zr4r" event={"ID":"25dcb722-c38a-4865-b817-c7f4b4711198","Type":"ContainerStarted","Data":"db70f0d28de399fc7a7ae793a6f25bcd3f9bf84e8cbf09db4211489032944162"} Oct 14 13:32:55.583827 master-2 kubenswrapper[4762]: I1014 13:32:55.583737 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5zr4r" podStartSLOduration=2.151376612 podStartE2EDuration="4.583714575s" 
podCreationTimestamp="2025-10-14 13:32:51 +0000 UTC" firstStartedPulling="2025-10-14 13:32:52.516901531 +0000 UTC m=+1601.761060690" lastFinishedPulling="2025-10-14 13:32:54.949239494 +0000 UTC m=+1604.193398653" observedRunningTime="2025-10-14 13:32:55.582423014 +0000 UTC m=+1604.826582203" watchObservedRunningTime="2025-10-14 13:32:55.583714575 +0000 UTC m=+1604.827873724" Oct 14 13:33:01.583812 master-2 kubenswrapper[4762]: I1014 13:33:01.583720 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5zr4r" Oct 14 13:33:01.584898 master-2 kubenswrapper[4762]: I1014 13:33:01.584695 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5zr4r" Oct 14 13:33:01.625384 master-2 kubenswrapper[4762]: I1014 13:33:01.625321 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5zr4r" Oct 14 13:33:02.642379 master-2 kubenswrapper[4762]: I1014 13:33:02.642300 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5zr4r" Oct 14 13:33:05.335885 master-2 kubenswrapper[4762]: I1014 13:33:05.335821 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6b498574d4-tcqkg"] Oct 14 13:33:05.336931 master-2 kubenswrapper[4762]: I1014 13:33:05.336899 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6b498574d4-tcqkg" Oct 14 13:33:05.339448 master-2 kubenswrapper[4762]: I1014 13:33:05.339396 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 14 13:33:05.339709 master-2 kubenswrapper[4762]: I1014 13:33:05.339638 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 14 13:33:05.363699 master-2 kubenswrapper[4762]: I1014 13:33:05.363641 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6b498574d4-tcqkg"] Oct 14 13:33:05.393521 master-2 kubenswrapper[4762]: I1014 13:33:05.393460 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d78f57554-t6sj6"] Oct 14 13:33:05.394679 master-2 kubenswrapper[4762]: I1014 13:33:05.394650 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6d78f57554-t6sj6" Oct 14 13:33:05.420837 master-2 kubenswrapper[4762]: I1014 13:33:05.420740 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d78f57554-t6sj6"] Oct 14 13:33:05.441916 master-2 kubenswrapper[4762]: I1014 13:33:05.439765 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrdh2\" (UniqueName: \"kubernetes.io/projected/70c2f56d-0890-4306-acda-44d6bba8a4b6-kube-api-access-rrdh2\") pod \"ironic-operator-controller-manager-6b498574d4-tcqkg\" (UID: \"70c2f56d-0890-4306-acda-44d6bba8a4b6\") " pod="openstack-operators/ironic-operator-controller-manager-6b498574d4-tcqkg" Oct 14 13:33:05.441916 master-2 kubenswrapper[4762]: I1014 13:33:05.439907 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98fkd\" (UniqueName: \"kubernetes.io/projected/71265a42-d499-4805-b432-826285d2adca-kube-api-access-98fkd\") pod \"manila-operator-controller-manager-6d78f57554-t6sj6\" (UID: \"71265a42-d499-4805-b432-826285d2adca\") " pod="openstack-operators/manila-operator-controller-manager-6d78f57554-t6sj6" Oct 14 13:33:05.502421 master-2 kubenswrapper[4762]: I1014 13:33:05.502350 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-7c95684bcc-qn2dm"] Oct 14 13:33:05.504421 master-2 kubenswrapper[4762]: I1014 13:33:05.503870 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-7c95684bcc-qn2dm" Oct 14 13:33:05.526291 master-2 kubenswrapper[4762]: I1014 13:33:05.526249 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-7c95684bcc-qn2dm"] Oct 14 13:33:05.541691 master-2 kubenswrapper[4762]: I1014 13:33:05.541622 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrdh2\" (UniqueName: \"kubernetes.io/projected/70c2f56d-0890-4306-acda-44d6bba8a4b6-kube-api-access-rrdh2\") pod \"ironic-operator-controller-manager-6b498574d4-tcqkg\" (UID: \"70c2f56d-0890-4306-acda-44d6bba8a4b6\") " pod="openstack-operators/ironic-operator-controller-manager-6b498574d4-tcqkg" Oct 14 13:33:05.541902 master-2 kubenswrapper[4762]: I1014 13:33:05.541743 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98fkd\" (UniqueName: \"kubernetes.io/projected/71265a42-d499-4805-b432-826285d2adca-kube-api-access-98fkd\") pod \"manila-operator-controller-manager-6d78f57554-t6sj6\" (UID: \"71265a42-d499-4805-b432-826285d2adca\") " pod="openstack-operators/manila-operator-controller-manager-6d78f57554-t6sj6" Oct 14 13:33:05.588141 master-2 kubenswrapper[4762]: I1014 13:33:05.588056 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98fkd\" (UniqueName: \"kubernetes.io/projected/71265a42-d499-4805-b432-826285d2adca-kube-api-access-98fkd\") pod \"manila-operator-controller-manager-6d78f57554-t6sj6\" (UID: \"71265a42-d499-4805-b432-826285d2adca\") " pod="openstack-operators/manila-operator-controller-manager-6d78f57554-t6sj6" Oct 14 13:33:05.591833 master-2 kubenswrapper[4762]: I1014 13:33:05.591805 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrdh2\" (UniqueName: 
\"kubernetes.io/projected/70c2f56d-0890-4306-acda-44d6bba8a4b6-kube-api-access-rrdh2\") pod \"ironic-operator-controller-manager-6b498574d4-tcqkg\" (UID: \"70c2f56d-0890-4306-acda-44d6bba8a4b6\") " pod="openstack-operators/ironic-operator-controller-manager-6b498574d4-tcqkg" Oct 14 13:33:05.605816 master-2 kubenswrapper[4762]: I1014 13:33:05.605770 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-69958697d76f9td"] Oct 14 13:33:05.607859 master-2 kubenswrapper[4762]: I1014 13:33:05.607839 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-69958697d76f9td" Oct 14 13:33:05.610956 master-2 kubenswrapper[4762]: I1014 13:33:05.610914 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Oct 14 13:33:05.643411 master-2 kubenswrapper[4762]: I1014 13:33:05.643357 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2jjr\" (UniqueName: \"kubernetes.io/projected/a5b17b54-b33c-4bed-bbd7-33e42a901d01-kube-api-access-w2jjr\") pod \"neutron-operator-controller-manager-7c95684bcc-qn2dm\" (UID: \"a5b17b54-b33c-4bed-bbd7-33e42a901d01\") " pod="openstack-operators/neutron-operator-controller-manager-7c95684bcc-qn2dm" Oct 14 13:33:05.653363 master-2 kubenswrapper[4762]: I1014 13:33:05.653303 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6b498574d4-tcqkg" Oct 14 13:33:05.667728 master-2 kubenswrapper[4762]: I1014 13:33:05.667670 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-69958697d76f9td"] Oct 14 13:33:05.711583 master-2 kubenswrapper[4762]: I1014 13:33:05.710888 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6d78f57554-t6sj6" Oct 14 13:33:05.745220 master-2 kubenswrapper[4762]: I1014 13:33:05.745121 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wlzj\" (UniqueName: \"kubernetes.io/projected/e1c7f086-7d85-4a69-b495-13a1d0f55342-kube-api-access-8wlzj\") pod \"openstack-baremetal-operator-controller-manager-69958697d76f9td\" (UID: \"e1c7f086-7d85-4a69-b495-13a1d0f55342\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-69958697d76f9td" Oct 14 13:33:05.745220 master-2 kubenswrapper[4762]: I1014 13:33:05.745192 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1c7f086-7d85-4a69-b495-13a1d0f55342-cert\") pod \"openstack-baremetal-operator-controller-manager-69958697d76f9td\" (UID: \"e1c7f086-7d85-4a69-b495-13a1d0f55342\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-69958697d76f9td" Oct 14 13:33:05.745518 master-2 kubenswrapper[4762]: I1014 13:33:05.745314 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2jjr\" (UniqueName: \"kubernetes.io/projected/a5b17b54-b33c-4bed-bbd7-33e42a901d01-kube-api-access-w2jjr\") pod \"neutron-operator-controller-manager-7c95684bcc-qn2dm\" (UID: \"a5b17b54-b33c-4bed-bbd7-33e42a901d01\") " pod="openstack-operators/neutron-operator-controller-manager-7c95684bcc-qn2dm" Oct 14 13:33:05.805908 master-2 kubenswrapper[4762]: I1014 13:33:05.804375 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-565dfd7bb9-c6fnn"] Oct 14 13:33:05.805908 master-2 kubenswrapper[4762]: I1014 13:33:05.805425 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-565dfd7bb9-c6fnn" Oct 14 13:33:05.805908 master-2 kubenswrapper[4762]: I1014 13:33:05.805483 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2jjr\" (UniqueName: \"kubernetes.io/projected/a5b17b54-b33c-4bed-bbd7-33e42a901d01-kube-api-access-w2jjr\") pod \"neutron-operator-controller-manager-7c95684bcc-qn2dm\" (UID: \"a5b17b54-b33c-4bed-bbd7-33e42a901d01\") " pod="openstack-operators/neutron-operator-controller-manager-7c95684bcc-qn2dm" Oct 14 13:33:05.824506 master-2 kubenswrapper[4762]: I1014 13:33:05.824442 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-7c95684bcc-qn2dm" Oct 14 13:33:05.844962 master-2 kubenswrapper[4762]: I1014 13:33:05.843257 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-565dfd7bb9-c6fnn"] Oct 14 13:33:05.846591 master-2 kubenswrapper[4762]: I1014 13:33:05.846539 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wlzj\" (UniqueName: \"kubernetes.io/projected/e1c7f086-7d85-4a69-b495-13a1d0f55342-kube-api-access-8wlzj\") pod \"openstack-baremetal-operator-controller-manager-69958697d76f9td\" (UID: \"e1c7f086-7d85-4a69-b495-13a1d0f55342\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-69958697d76f9td" Oct 14 13:33:05.846675 master-2 kubenswrapper[4762]: I1014 13:33:05.846585 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1c7f086-7d85-4a69-b495-13a1d0f55342-cert\") pod \"openstack-baremetal-operator-controller-manager-69958697d76f9td\" (UID: \"e1c7f086-7d85-4a69-b495-13a1d0f55342\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-69958697d76f9td" Oct 14 13:33:05.846895 master-2 kubenswrapper[4762]: E1014 13:33:05.846854 4762 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 14 13:33:05.847015 master-2 kubenswrapper[4762]: E1014 13:33:05.846953 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1c7f086-7d85-4a69-b495-13a1d0f55342-cert podName:e1c7f086-7d85-4a69-b495-13a1d0f55342 nodeName:}" failed. No retries permitted until 2025-10-14 13:33:06.346913404 +0000 UTC m=+1615.591072563 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e1c7f086-7d85-4a69-b495-13a1d0f55342-cert") pod "openstack-baremetal-operator-controller-manager-69958697d76f9td" (UID: "e1c7f086-7d85-4a69-b495-13a1d0f55342") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 14 13:33:05.878740 master-2 kubenswrapper[4762]: I1014 13:33:05.878655 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wlzj\" (UniqueName: \"kubernetes.io/projected/e1c7f086-7d85-4a69-b495-13a1d0f55342-kube-api-access-8wlzj\") pod \"openstack-baremetal-operator-controller-manager-69958697d76f9td\" (UID: \"e1c7f086-7d85-4a69-b495-13a1d0f55342\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-69958697d76f9td" Oct 14 13:33:05.948291 master-2 kubenswrapper[4762]: I1014 13:33:05.948248 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64fwl\" (UniqueName: \"kubernetes.io/projected/7ecf4d52-9004-4c21-96f0-f50233a7aad2-kube-api-access-64fwl\") pod \"test-operator-controller-manager-565dfd7bb9-c6fnn\" (UID: \"7ecf4d52-9004-4c21-96f0-f50233a7aad2\") " pod="openstack-operators/test-operator-controller-manager-565dfd7bb9-c6fnn" Oct 14 13:33:06.013061 master-2 kubenswrapper[4762]: I1014 13:33:06.012552 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5zr4r"] Oct 14 13:33:06.013061 master-2 kubenswrapper[4762]: I1014 13:33:06.012916 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5zr4r" podUID="25dcb722-c38a-4865-b817-c7f4b4711198" containerName="registry-server" containerID="cri-o://db70f0d28de399fc7a7ae793a6f25bcd3f9bf84e8cbf09db4211489032944162" gracePeriod=2 Oct 14 13:33:06.049180 master-2 kubenswrapper[4762]: I1014 13:33:06.049124 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64fwl\" (UniqueName: \"kubernetes.io/projected/7ecf4d52-9004-4c21-96f0-f50233a7aad2-kube-api-access-64fwl\") pod \"test-operator-controller-manager-565dfd7bb9-c6fnn\" (UID: \"7ecf4d52-9004-4c21-96f0-f50233a7aad2\") " pod="openstack-operators/test-operator-controller-manager-565dfd7bb9-c6fnn" Oct 14 13:33:06.066019 master-2 kubenswrapper[4762]: I1014 13:33:06.065943 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6566ff98d5-wbc89"] Oct 14 13:33:06.067514 master-2 kubenswrapper[4762]: I1014 13:33:06.067480 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6566ff98d5-wbc89" Oct 14 13:33:06.070308 master-2 kubenswrapper[4762]: I1014 13:33:06.070264 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Oct 14 13:33:06.074746 master-2 kubenswrapper[4762]: I1014 13:33:06.074704 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64fwl\" (UniqueName: \"kubernetes.io/projected/7ecf4d52-9004-4c21-96f0-f50233a7aad2-kube-api-access-64fwl\") pod \"test-operator-controller-manager-565dfd7bb9-c6fnn\" (UID: \"7ecf4d52-9004-4c21-96f0-f50233a7aad2\") " pod="openstack-operators/test-operator-controller-manager-565dfd7bb9-c6fnn" Oct 14 13:33:06.089258 master-2 kubenswrapper[4762]: I1014 13:33:06.089204 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6566ff98d5-wbc89"] Oct 14 13:33:06.116717 master-2 kubenswrapper[4762]: I1014 13:33:06.116667 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6b498574d4-tcqkg"] Oct 14 13:33:06.131524 master-2 kubenswrapper[4762]: W1014 13:33:06.130615 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70c2f56d_0890_4306_acda_44d6bba8a4b6.slice/crio-7d860742f69feb98e39b73ef04c11017a7fad34777f3cfdf6863e8c30f5aaa05 WatchSource:0}: Error finding container 7d860742f69feb98e39b73ef04c11017a7fad34777f3cfdf6863e8c30f5aaa05: Status 404 returned error can't find the container with id 7d860742f69feb98e39b73ef04c11017a7fad34777f3cfdf6863e8c30f5aaa05 Oct 14 13:33:06.135688 master-2 kubenswrapper[4762]: I1014 13:33:06.135652 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-565dfd7bb9-c6fnn" Oct 14 13:33:06.150128 master-2 kubenswrapper[4762]: I1014 13:33:06.150076 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/71a2ab80-19dc-4e7c-9257-ff70d1c5b898-cert\") pod \"openstack-operator-controller-manager-6566ff98d5-wbc89\" (UID: \"71a2ab80-19dc-4e7c-9257-ff70d1c5b898\") " pod="openstack-operators/openstack-operator-controller-manager-6566ff98d5-wbc89" Oct 14 13:33:06.150593 master-2 kubenswrapper[4762]: I1014 13:33:06.150206 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kkqd\" (UniqueName: \"kubernetes.io/projected/71a2ab80-19dc-4e7c-9257-ff70d1c5b898-kube-api-access-9kkqd\") pod \"openstack-operator-controller-manager-6566ff98d5-wbc89\" (UID: \"71a2ab80-19dc-4e7c-9257-ff70d1c5b898\") " pod="openstack-operators/openstack-operator-controller-manager-6566ff98d5-wbc89" Oct 14 13:33:06.215053 master-2 kubenswrapper[4762]: I1014 13:33:06.215002 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d78f57554-t6sj6"] Oct 14 13:33:06.251689 master-2 kubenswrapper[4762]: I1014 13:33:06.251631 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/71a2ab80-19dc-4e7c-9257-ff70d1c5b898-cert\") pod \"openstack-operator-controller-manager-6566ff98d5-wbc89\" (UID: \"71a2ab80-19dc-4e7c-9257-ff70d1c5b898\") " pod="openstack-operators/openstack-operator-controller-manager-6566ff98d5-wbc89" Oct 14 13:33:06.251843 master-2 kubenswrapper[4762]: I1014 13:33:06.251720 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kkqd\" (UniqueName: \"kubernetes.io/projected/71a2ab80-19dc-4e7c-9257-ff70d1c5b898-kube-api-access-9kkqd\") pod \"openstack-operator-controller-manager-6566ff98d5-wbc89\" (UID: \"71a2ab80-19dc-4e7c-9257-ff70d1c5b898\") " pod="openstack-operators/openstack-operator-controller-manager-6566ff98d5-wbc89" Oct 14 13:33:06.252177 master-2 kubenswrapper[4762]: E1014 13:33:06.252002 4762 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 14 13:33:06.252257 master-2 kubenswrapper[4762]: E1014 13:33:06.252203 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71a2ab80-19dc-4e7c-9257-ff70d1c5b898-cert podName:71a2ab80-19dc-4e7c-9257-ff70d1c5b898 nodeName:}" failed. No retries permitted until 2025-10-14 13:33:06.752146548 +0000 UTC m=+1615.996305777 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/71a2ab80-19dc-4e7c-9257-ff70d1c5b898-cert") pod "openstack-operator-controller-manager-6566ff98d5-wbc89" (UID: "71a2ab80-19dc-4e7c-9257-ff70d1c5b898") : secret "webhook-server-cert" not found Oct 14 13:33:06.276977 master-2 kubenswrapper[4762]: I1014 13:33:06.276895 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kkqd\" (UniqueName: \"kubernetes.io/projected/71a2ab80-19dc-4e7c-9257-ff70d1c5b898-kube-api-access-9kkqd\") pod \"openstack-operator-controller-manager-6566ff98d5-wbc89\" (UID: \"71a2ab80-19dc-4e7c-9257-ff70d1c5b898\") " pod="openstack-operators/openstack-operator-controller-manager-6566ff98d5-wbc89" Oct 14 13:33:06.310461 master-2 kubenswrapper[4762]: I1014 13:33:06.310414 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-7c95684bcc-qn2dm"] Oct 14 13:33:06.318244 master-2 kubenswrapper[4762]: W1014 13:33:06.318182 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5b17b54_b33c_4bed_bbd7_33e42a901d01.slice/crio-e6520b4463b009ed7f3f9b1b4051c1293075697d1983079c726180c9186a053b WatchSource:0}: Error finding container e6520b4463b009ed7f3f9b1b4051c1293075697d1983079c726180c9186a053b: Status 404 returned error can't find the container with id e6520b4463b009ed7f3f9b1b4051c1293075697d1983079c726180c9186a053b Oct 14 13:33:06.352819 master-2 kubenswrapper[4762]: I1014 13:33:06.352750 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1c7f086-7d85-4a69-b495-13a1d0f55342-cert\") pod \"openstack-baremetal-operator-controller-manager-69958697d76f9td\" (UID: \"e1c7f086-7d85-4a69-b495-13a1d0f55342\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-69958697d76f9td" Oct 14 13:33:06.353364 master-2 kubenswrapper[4762]: E1014 13:33:06.353002 4762 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 14 13:33:06.353364 master-2 kubenswrapper[4762]: E1014 13:33:06.353058 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1c7f086-7d85-4a69-b495-13a1d0f55342-cert podName:e1c7f086-7d85-4a69-b495-13a1d0f55342 nodeName:}" failed. No retries permitted until 2025-10-14 13:33:07.353042596 +0000 UTC m=+1616.597201755 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e1c7f086-7d85-4a69-b495-13a1d0f55342-cert") pod "openstack-baremetal-operator-controller-manager-69958697d76f9td" (UID: "e1c7f086-7d85-4a69-b495-13a1d0f55342") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 14 13:33:06.383202 master-2 kubenswrapper[4762]: I1014 13:33:06.381775 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5zr4r" Oct 14 13:33:06.556395 master-2 kubenswrapper[4762]: I1014 13:33:06.555938 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25dcb722-c38a-4865-b817-c7f4b4711198-utilities\") pod \"25dcb722-c38a-4865-b817-c7f4b4711198\" (UID: \"25dcb722-c38a-4865-b817-c7f4b4711198\") " Oct 14 13:33:06.556395 master-2 kubenswrapper[4762]: I1014 13:33:06.556010 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25dcb722-c38a-4865-b817-c7f4b4711198-catalog-content\") pod \"25dcb722-c38a-4865-b817-c7f4b4711198\" (UID: \"25dcb722-c38a-4865-b817-c7f4b4711198\") " Oct 14 13:33:06.556395 master-2 kubenswrapper[4762]: I1014 13:33:06.556066 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dklwq\" (UniqueName: \"kubernetes.io/projected/25dcb722-c38a-4865-b817-c7f4b4711198-kube-api-access-dklwq\") pod \"25dcb722-c38a-4865-b817-c7f4b4711198\" (UID: \"25dcb722-c38a-4865-b817-c7f4b4711198\") " Oct 14 13:33:06.558660 master-2 kubenswrapper[4762]: I1014 13:33:06.558575 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25dcb722-c38a-4865-b817-c7f4b4711198-utilities" (OuterVolumeSpecName: "utilities") pod "25dcb722-c38a-4865-b817-c7f4b4711198" (UID: "25dcb722-c38a-4865-b817-c7f4b4711198"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:33:06.560453 master-2 kubenswrapper[4762]: I1014 13:33:06.560384 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25dcb722-c38a-4865-b817-c7f4b4711198-kube-api-access-dklwq" (OuterVolumeSpecName: "kube-api-access-dklwq") pod "25dcb722-c38a-4865-b817-c7f4b4711198" (UID: "25dcb722-c38a-4865-b817-c7f4b4711198"). InnerVolumeSpecName "kube-api-access-dklwq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:33:06.621270 master-2 kubenswrapper[4762]: I1014 13:33:06.619710 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-565dfd7bb9-c6fnn"] Oct 14 13:33:06.632970 master-2 kubenswrapper[4762]: I1014 13:33:06.630855 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d78f57554-t6sj6" event={"ID":"71265a42-d499-4805-b432-826285d2adca","Type":"ContainerStarted","Data":"66ff0ea37606330abc0faa621ff9e8b446332598127cdba27daa442fdb3d080f"} Oct 14 13:33:06.632970 master-2 kubenswrapper[4762]: I1014 13:33:06.632875 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25dcb722-c38a-4865-b817-c7f4b4711198-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "25dcb722-c38a-4865-b817-c7f4b4711198" (UID: "25dcb722-c38a-4865-b817-c7f4b4711198"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:33:06.636684 master-2 kubenswrapper[4762]: I1014 13:33:06.636478 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-7c95684bcc-qn2dm" event={"ID":"a5b17b54-b33c-4bed-bbd7-33e42a901d01","Type":"ContainerStarted","Data":"e6520b4463b009ed7f3f9b1b4051c1293075697d1983079c726180c9186a053b"} Oct 14 13:33:06.642449 master-2 kubenswrapper[4762]: I1014 13:33:06.642377 4762 generic.go:334] "Generic (PLEG): container finished" podID="25dcb722-c38a-4865-b817-c7f4b4711198" containerID="db70f0d28de399fc7a7ae793a6f25bcd3f9bf84e8cbf09db4211489032944162" exitCode=0 Oct 14 13:33:06.642576 master-2 kubenswrapper[4762]: I1014 13:33:06.642483 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5zr4r" event={"ID":"25dcb722-c38a-4865-b817-c7f4b4711198","Type":"ContainerDied","Data":"db70f0d28de399fc7a7ae793a6f25bcd3f9bf84e8cbf09db4211489032944162"} Oct 14 13:33:06.642576 master-2 kubenswrapper[4762]: I1014 13:33:06.642538 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5zr4r" event={"ID":"25dcb722-c38a-4865-b817-c7f4b4711198","Type":"ContainerDied","Data":"41a48d1e334a4774564cf394ed62e260e02188e7984f8a9044efe670fe3e27d0"} Oct 14 13:33:06.642576 master-2 kubenswrapper[4762]: I1014 13:33:06.642557 4762 scope.go:117] "RemoveContainer" containerID="db70f0d28de399fc7a7ae793a6f25bcd3f9bf84e8cbf09db4211489032944162" Oct 14 13:33:06.642796 master-2 kubenswrapper[4762]: I1014 13:33:06.642725 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5zr4r" Oct 14 13:33:06.654088 master-2 kubenswrapper[4762]: I1014 13:33:06.653961 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6b498574d4-tcqkg" event={"ID":"70c2f56d-0890-4306-acda-44d6bba8a4b6","Type":"ContainerStarted","Data":"7d860742f69feb98e39b73ef04c11017a7fad34777f3cfdf6863e8c30f5aaa05"} Oct 14 13:33:06.657654 master-2 kubenswrapper[4762]: I1014 13:33:06.657600 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25dcb722-c38a-4865-b817-c7f4b4711198-utilities\") on node \"master-2\" DevicePath \"\"" Oct 14 13:33:06.657654 master-2 kubenswrapper[4762]: I1014 13:33:06.657643 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25dcb722-c38a-4865-b817-c7f4b4711198-catalog-content\") on node \"master-2\" DevicePath \"\"" Oct 14 13:33:06.657654 master-2 kubenswrapper[4762]: I1014 13:33:06.657658 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dklwq\" (UniqueName: \"kubernetes.io/projected/25dcb722-c38a-4865-b817-c7f4b4711198-kube-api-access-dklwq\") on node \"master-2\" DevicePath \"\"" Oct 14 13:33:06.673962 master-2 kubenswrapper[4762]: I1014 13:33:06.673788 4762 scope.go:117] "RemoveContainer" containerID="cef60c01d1296839ec43004b55465ce26cd274669e83f09323613ab5162e5b13" Oct 14 13:33:06.689239 master-2 kubenswrapper[4762]: I1014 13:33:06.689120 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5zr4r"] Oct 14 13:33:06.694175 master-2 kubenswrapper[4762]: I1014 13:33:06.694135 4762 scope.go:117] "RemoveContainer" containerID="7390cb709ef9964bad2d97e82a5da8874ae248f1736449abf67b18f982bba754" Oct 14 
13:33:06.713596 master-2 kubenswrapper[4762]: I1014 13:33:06.713539 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5zr4r"] Oct 14 13:33:06.737701 master-2 kubenswrapper[4762]: I1014 13:33:06.737655 4762 scope.go:117] "RemoveContainer" containerID="db70f0d28de399fc7a7ae793a6f25bcd3f9bf84e8cbf09db4211489032944162" Oct 14 13:33:06.738237 master-2 kubenswrapper[4762]: E1014 13:33:06.738143 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db70f0d28de399fc7a7ae793a6f25bcd3f9bf84e8cbf09db4211489032944162\": container with ID starting with db70f0d28de399fc7a7ae793a6f25bcd3f9bf84e8cbf09db4211489032944162 not found: ID does not exist" containerID="db70f0d28de399fc7a7ae793a6f25bcd3f9bf84e8cbf09db4211489032944162" Oct 14 13:33:06.738314 master-2 kubenswrapper[4762]: I1014 13:33:06.738238 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db70f0d28de399fc7a7ae793a6f25bcd3f9bf84e8cbf09db4211489032944162"} err="failed to get container status \"db70f0d28de399fc7a7ae793a6f25bcd3f9bf84e8cbf09db4211489032944162\": rpc error: code = NotFound desc = could not find container \"db70f0d28de399fc7a7ae793a6f25bcd3f9bf84e8cbf09db4211489032944162\": container with ID starting with db70f0d28de399fc7a7ae793a6f25bcd3f9bf84e8cbf09db4211489032944162 not found: ID does not exist" Oct 14 13:33:06.738314 master-2 kubenswrapper[4762]: I1014 13:33:06.738265 4762 scope.go:117] "RemoveContainer" containerID="cef60c01d1296839ec43004b55465ce26cd274669e83f09323613ab5162e5b13" Oct 14 13:33:06.738679 master-2 kubenswrapper[4762]: E1014 13:33:06.738632 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cef60c01d1296839ec43004b55465ce26cd274669e83f09323613ab5162e5b13\": container with ID starting with cef60c01d1296839ec43004b55465ce26cd274669e83f09323613ab5162e5b13 not found: ID does not exist" containerID="cef60c01d1296839ec43004b55465ce26cd274669e83f09323613ab5162e5b13" Oct 14 13:33:06.738736 master-2 kubenswrapper[4762]: I1014 13:33:06.738683 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cef60c01d1296839ec43004b55465ce26cd274669e83f09323613ab5162e5b13"} err="failed to get container status \"cef60c01d1296839ec43004b55465ce26cd274669e83f09323613ab5162e5b13\": rpc error: code = NotFound desc = could not find container \"cef60c01d1296839ec43004b55465ce26cd274669e83f09323613ab5162e5b13\": container with ID starting with cef60c01d1296839ec43004b55465ce26cd274669e83f09323613ab5162e5b13 not found: ID does not exist" Oct 14 13:33:06.738736 master-2 kubenswrapper[4762]: I1014 13:33:06.738717 4762 scope.go:117] "RemoveContainer" containerID="7390cb709ef9964bad2d97e82a5da8874ae248f1736449abf67b18f982bba754" Oct 14 13:33:06.739217 master-2 kubenswrapper[4762]: E1014 13:33:06.739180 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7390cb709ef9964bad2d97e82a5da8874ae248f1736449abf67b18f982bba754\": container with ID starting with 7390cb709ef9964bad2d97e82a5da8874ae248f1736449abf67b18f982bba754 not found: ID does not exist" containerID="7390cb709ef9964bad2d97e82a5da8874ae248f1736449abf67b18f982bba754" Oct 14 13:33:06.739282 master-2 kubenswrapper[4762]: I1014 13:33:06.739213 4762 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7390cb709ef9964bad2d97e82a5da8874ae248f1736449abf67b18f982bba754"} err="failed to get container status \"7390cb709ef9964bad2d97e82a5da8874ae248f1736449abf67b18f982bba754\": rpc error: code = NotFound desc = could not find container \"7390cb709ef9964bad2d97e82a5da8874ae248f1736449abf67b18f982bba754\": container with ID starting with 7390cb709ef9964bad2d97e82a5da8874ae248f1736449abf67b18f982bba754 not found: ID does not exist" Oct 14 13:33:06.759582 master-2 kubenswrapper[4762]: I1014 13:33:06.759531 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/71a2ab80-19dc-4e7c-9257-ff70d1c5b898-cert\") pod \"openstack-operator-controller-manager-6566ff98d5-wbc89\" (UID: \"71a2ab80-19dc-4e7c-9257-ff70d1c5b898\") " pod="openstack-operators/openstack-operator-controller-manager-6566ff98d5-wbc89" Oct 14 13:33:06.759950 master-2 kubenswrapper[4762]: E1014 13:33:06.759785 4762 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 14 13:33:06.759950 master-2 kubenswrapper[4762]: E1014 13:33:06.759921 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71a2ab80-19dc-4e7c-9257-ff70d1c5b898-cert podName:71a2ab80-19dc-4e7c-9257-ff70d1c5b898 nodeName:}" failed. No retries permitted until 2025-10-14 13:33:07.759898362 +0000 UTC m=+1617.004057531 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/71a2ab80-19dc-4e7c-9257-ff70d1c5b898-cert") pod "openstack-operator-controller-manager-6566ff98d5-wbc89" (UID: "71a2ab80-19dc-4e7c-9257-ff70d1c5b898") : secret "webhook-server-cert" not found Oct 14 13:33:07.366693 master-2 kubenswrapper[4762]: I1014 13:33:07.366618 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1c7f086-7d85-4a69-b495-13a1d0f55342-cert\") pod \"openstack-baremetal-operator-controller-manager-69958697d76f9td\" (UID: \"e1c7f086-7d85-4a69-b495-13a1d0f55342\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-69958697d76f9td" Oct 14 13:33:07.371502 master-2 kubenswrapper[4762]: I1014 13:33:07.371457 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1c7f086-7d85-4a69-b495-13a1d0f55342-cert\") pod \"openstack-baremetal-operator-controller-manager-69958697d76f9td\" (UID: \"e1c7f086-7d85-4a69-b495-13a1d0f55342\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-69958697d76f9td" Oct 14 13:33:07.452935 master-2 kubenswrapper[4762]: I1014 13:33:07.452870 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-69958697d76f9td" Oct 14 13:33:07.561614 master-2 kubenswrapper[4762]: I1014 13:33:07.561561 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25dcb722-c38a-4865-b817-c7f4b4711198" path="/var/lib/kubelet/pods/25dcb722-c38a-4865-b817-c7f4b4711198/volumes" Oct 14 13:33:07.661102 master-2 kubenswrapper[4762]: I1014 13:33:07.661037 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-565dfd7bb9-c6fnn" event={"ID":"7ecf4d52-9004-4c21-96f0-f50233a7aad2","Type":"ContainerStarted","Data":"6a68c95be1c066ad376198d46256bc4cacff4dee63cf1aec57151290a5530992"} Oct 14 13:33:07.778107 master-2 kubenswrapper[4762]: I1014 13:33:07.777990 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/71a2ab80-19dc-4e7c-9257-ff70d1c5b898-cert\") pod \"openstack-operator-controller-manager-6566ff98d5-wbc89\" (UID: \"71a2ab80-19dc-4e7c-9257-ff70d1c5b898\") " pod="openstack-operators/openstack-operator-controller-manager-6566ff98d5-wbc89" Oct 14 13:33:07.781972 master-2 kubenswrapper[4762]: I1014 13:33:07.781950 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/71a2ab80-19dc-4e7c-9257-ff70d1c5b898-cert\") pod \"openstack-operator-controller-manager-6566ff98d5-wbc89\" (UID: \"71a2ab80-19dc-4e7c-9257-ff70d1c5b898\") " pod="openstack-operators/openstack-operator-controller-manager-6566ff98d5-wbc89" Oct 14 13:33:07.875061 master-2 kubenswrapper[4762]: I1014 13:33:07.874987 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-69958697d76f9td"] Oct 14 13:33:07.943255 master-2 kubenswrapper[4762]: I1014 13:33:07.942847 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6566ff98d5-wbc89" Oct 14 13:33:08.959210 master-2 kubenswrapper[4762]: W1014 13:33:08.956994 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1c7f086_7d85_4a69_b495_13a1d0f55342.slice/crio-72763208dccc953f25bf789c3bf826412bb3a4faa53010bfafb912395e62ca6c WatchSource:0}: Error finding container 72763208dccc953f25bf789c3bf826412bb3a4faa53010bfafb912395e62ca6c: Status 404 returned error can't find the container with id 72763208dccc953f25bf789c3bf826412bb3a4faa53010bfafb912395e62ca6c Oct 14 13:33:09.680199 master-2 kubenswrapper[4762]: I1014 13:33:09.679239 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-69958697d76f9td" event={"ID":"e1c7f086-7d85-4a69-b495-13a1d0f55342","Type":"ContainerStarted","Data":"72763208dccc953f25bf789c3bf826412bb3a4faa53010bfafb912395e62ca6c"} Oct 14 13:33:10.644326 master-2 kubenswrapper[4762]: I1014 13:33:10.644249 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6566ff98d5-wbc89"] Oct 14 13:33:10.679714 master-2 kubenswrapper[4762]: W1014 13:33:10.679658 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71a2ab80_19dc_4e7c_9257_ff70d1c5b898.slice/crio-65a58adc928d21057d1a81229a93e4a8c4bf0f7a0f3cbdde1ab4a40eacf79462 WatchSource:0}: Error finding container 65a58adc928d21057d1a81229a93e4a8c4bf0f7a0f3cbdde1ab4a40eacf79462: Status 404 returned error can't find the container with id 65a58adc928d21057d1a81229a93e4a8c4bf0f7a0f3cbdde1ab4a40eacf79462 Oct 14 13:33:11.699327 master-2 kubenswrapper[4762]: I1014 13:33:11.699220 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6566ff98d5-wbc89" event={"ID":"71a2ab80-19dc-4e7c-9257-ff70d1c5b898","Type":"ContainerStarted","Data":"65a58adc928d21057d1a81229a93e4a8c4bf0f7a0f3cbdde1ab4a40eacf79462"} Oct 14 13:33:11.708487 master-2 kubenswrapper[4762]: I1014 13:33:11.700827 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d78f57554-t6sj6" event={"ID":"71265a42-d499-4805-b432-826285d2adca","Type":"ContainerStarted","Data":"3dc2af35bd447da5fefcd00d245468b500fce0573c0c2bb645a7d18fcffd363e"} Oct 14 13:33:11.708487 master-2 kubenswrapper[4762]: I1014 13:33:11.702660 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6b498574d4-tcqkg" event={"ID":"70c2f56d-0890-4306-acda-44d6bba8a4b6","Type":"ContainerStarted","Data":"f28b9e007ae462e4b89338e5cef4f82d3f942ca56bd49c98dfb418700c8a7381"} Oct 14 13:33:11.708487 master-2 kubenswrapper[4762]: I1014 13:33:11.704227 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-7c95684bcc-qn2dm" event={"ID":"a5b17b54-b33c-4bed-bbd7-33e42a901d01","Type":"ContainerStarted","Data":"66e34e73976d184b9e34d2f48df5e94687129d83f6ee51b322a64a720839ca04"} Oct 14 13:33:12.713329 master-2 kubenswrapper[4762]: I1014 13:33:12.713266 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-565dfd7bb9-c6fnn" 
event={"ID":"7ecf4d52-9004-4c21-96f0-f50233a7aad2","Type":"ContainerStarted","Data":"37d98b7a8e51083cb1b3ab58fe859b186543507b0fdc05ec5aa779d0de6774bf"} Oct 14 13:33:12.716238 master-2 kubenswrapper[4762]: I1014 13:33:12.716173 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-69958697d76f9td" event={"ID":"e1c7f086-7d85-4a69-b495-13a1d0f55342","Type":"ContainerStarted","Data":"4cca729d6186c62de641f169cb7a99637e014fb9f050fcfdff79db8f950f9d85"} Oct 14 13:33:15.741214 master-2 kubenswrapper[4762]: I1014 13:33:15.741116 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6566ff98d5-wbc89" event={"ID":"71a2ab80-19dc-4e7c-9257-ff70d1c5b898","Type":"ContainerStarted","Data":"ce6b6651c3a478ead40ffb27d9289c96f136453cd50a5d42b5920637c06a1adf"} Oct 14 13:33:15.741214 master-2 kubenswrapper[4762]: I1014 13:33:15.741219 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6566ff98d5-wbc89" event={"ID":"71a2ab80-19dc-4e7c-9257-ff70d1c5b898","Type":"ContainerStarted","Data":"4a07fa402e4e95cb56feced7d80228060101d15649112d8d09ad480226f630bd"} Oct 14 13:33:15.741869 master-2 kubenswrapper[4762]: I1014 13:33:15.741323 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6566ff98d5-wbc89" Oct 14 13:33:15.743989 master-2 kubenswrapper[4762]: I1014 13:33:15.743893 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d78f57554-t6sj6" event={"ID":"71265a42-d499-4805-b432-826285d2adca","Type":"ContainerStarted","Data":"5bd74721a3f4023a802b4b08c1d9441330cf52d71f1ba349dae188bd1ef934e7"} Oct 14 13:33:15.744080 master-2 kubenswrapper[4762]: I1014 13:33:15.743999 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-6d78f57554-t6sj6" Oct 14 13:33:15.746464 master-2 kubenswrapper[4762]: I1014 13:33:15.746426 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6b498574d4-tcqkg" event={"ID":"70c2f56d-0890-4306-acda-44d6bba8a4b6","Type":"ContainerStarted","Data":"1c70210918addbab4e685ea35090142cdadb8dbb7b2e04b3d6cfa66ea4d8e23e"} Oct 14 13:33:15.746529 master-2 kubenswrapper[4762]: I1014 13:33:15.746485 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6b498574d4-tcqkg" Oct 14 13:33:15.747608 master-2 kubenswrapper[4762]: I1014 13:33:15.746910 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-6d78f57554-t6sj6" Oct 14 13:33:15.748854 master-2 kubenswrapper[4762]: I1014 13:33:15.748827 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6b498574d4-tcqkg" Oct 14 13:33:15.749467 master-2 kubenswrapper[4762]: I1014 13:33:15.749432 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-7c95684bcc-qn2dm" event={"ID":"a5b17b54-b33c-4bed-bbd7-33e42a901d01","Type":"ContainerStarted","Data":"fcb25ee542613e4cf388d1096cbefd1cd1559af2cd6e07b3f0c90c4ce6ea8283"} Oct 14 13:33:15.750584 master-2 kubenswrapper[4762]: I1014 13:33:15.750530 4762 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-7c95684bcc-qn2dm" Oct 14 13:33:15.752098 master-2 kubenswrapper[4762]: I1014 13:33:15.752058 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-7c95684bcc-qn2dm" Oct 14 13:33:15.752978 master-2 kubenswrapper[4762]: I1014 13:33:15.752954 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-565dfd7bb9-c6fnn" event={"ID":"7ecf4d52-9004-4c21-96f0-f50233a7aad2","Type":"ContainerStarted","Data":"0a013cad955fbaffb3caa4dd983763beb92edfaa7909912625f417062b7ab28e"} Oct 14 13:33:15.753095 master-2 kubenswrapper[4762]: I1014 13:33:15.753076 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-565dfd7bb9-c6fnn" Oct 14 13:33:15.756460 master-2 kubenswrapper[4762]: I1014 13:33:15.756396 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-69958697d76f9td" event={"ID":"e1c7f086-7d85-4a69-b495-13a1d0f55342","Type":"ContainerStarted","Data":"73888247360a5a8c621cb2927168e5025a061c49e13dd08f6c32e31972e544d8"} Oct 14 13:33:15.756603 master-2 kubenswrapper[4762]: I1014 13:33:15.756575 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-69958697d76f9td" Oct 14 13:33:15.789024 master-2 kubenswrapper[4762]: I1014 13:33:15.788908 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6566ff98d5-wbc89" podStartSLOduration=5.917120565 podStartE2EDuration="9.788870472s" podCreationTimestamp="2025-10-14 13:33:06 +0000 UTC" firstStartedPulling="2025-10-14 13:33:10.688639486 +0000 UTC m=+1619.932798645" lastFinishedPulling="2025-10-14 13:33:14.560389393 +0000 UTC m=+1623.804548552" observedRunningTime="2025-10-14 13:33:15.777292264 +0000 UTC m=+1625.021451513" watchObservedRunningTime="2025-10-14 13:33:15.788870472 +0000 UTC m=+1625.033029681" Oct 14 13:33:16.007733 master-2 kubenswrapper[4762]: I1014 13:33:16.007502 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-69958697d76f9td" podStartSLOduration=5.40634497 podStartE2EDuration="11.007480211s" podCreationTimestamp="2025-10-14 13:33:05 +0000 UTC" firstStartedPulling="2025-10-14 13:33:08.959598693 +0000 UTC m=+1618.203757852" lastFinishedPulling="2025-10-14 13:33:14.560733934 +0000 UTC m=+1623.804893093" observedRunningTime="2025-10-14 13:33:15.818240132 +0000 UTC m=+1625.062399301" watchObservedRunningTime="2025-10-14 13:33:16.007480211 +0000 UTC m=+1625.251639390" Oct 14 13:33:16.010223 master-2 kubenswrapper[4762]: I1014 13:33:16.010147 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6b498574d4-tcqkg" podStartSLOduration=2.660156856 podStartE2EDuration="11.010138335s" podCreationTimestamp="2025-10-14 13:33:05 +0000 UTC" firstStartedPulling="2025-10-14 13:33:06.133950772 +0000 UTC m=+1615.378109931" lastFinishedPulling="2025-10-14 13:33:14.483932251 +0000 UTC m=+1623.728091410" observedRunningTime="2025-10-14 13:33:15.999078754 +0000 UTC m=+1625.243237973" watchObservedRunningTime="2025-10-14 13:33:16.010138335 +0000 UTC 
m=+1625.254297504" Oct 14 13:33:16.139431 master-2 kubenswrapper[4762]: I1014 13:33:16.139316 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-565dfd7bb9-c6fnn" Oct 14 13:33:16.663536 master-2 kubenswrapper[4762]: I1014 13:33:16.663435 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-7c95684bcc-qn2dm" podStartSLOduration=3.4266341000000002 podStartE2EDuration="11.66340891s" podCreationTimestamp="2025-10-14 13:33:05 +0000 UTC" firstStartedPulling="2025-10-14 13:33:06.325318198 +0000 UTC m=+1615.569477367" lastFinishedPulling="2025-10-14 13:33:14.562093018 +0000 UTC m=+1623.806252177" observedRunningTime="2025-10-14 13:33:16.469668789 +0000 UTC m=+1625.713827958" watchObservedRunningTime="2025-10-14 13:33:16.66340891 +0000 UTC m=+1625.907568079" Oct 14 13:33:16.776266 master-2 kubenswrapper[4762]: I1014 13:33:16.775428 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-565dfd7bb9-c6fnn" podStartSLOduration=3.817094267 podStartE2EDuration="11.77540338s" podCreationTimestamp="2025-10-14 13:33:05 +0000 UTC" firstStartedPulling="2025-10-14 13:33:06.630260243 +0000 UTC m=+1615.874419402" lastFinishedPulling="2025-10-14 13:33:14.588569366 +0000 UTC m=+1623.832728515" observedRunningTime="2025-10-14 13:33:16.772736215 +0000 UTC m=+1626.016895384" watchObservedRunningTime="2025-10-14 13:33:16.77540338 +0000 UTC m=+1626.019562559" Oct 14 13:33:16.776870 master-2 kubenswrapper[4762]: I1014 13:33:16.776540 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-69958697d76f9td" Oct 14 13:33:16.822792 master-2 kubenswrapper[4762]: I1014 13:33:16.822699 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-6d78f57554-t6sj6" podStartSLOduration=3.509184657 podStartE2EDuration="11.822664348s" podCreationTimestamp="2025-10-14 13:33:05 +0000 UTC" firstStartedPulling="2025-10-14 13:33:06.241510031 +0000 UTC m=+1615.485669190" lastFinishedPulling="2025-10-14 13:33:14.554989722 +0000 UTC m=+1623.799148881" observedRunningTime="2025-10-14 13:33:16.816664647 +0000 UTC m=+1626.060823846" watchObservedRunningTime="2025-10-14 13:33:16.822664348 +0000 UTC m=+1626.066823507" Oct 14 13:33:27.953228 master-2 kubenswrapper[4762]: I1014 13:33:27.953083 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6566ff98d5-wbc89" Oct 14 13:33:52.409046 master-2 kubenswrapper[4762]: I1014 13:33:52.408979 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-6-master-2"] Oct 14 13:33:52.409812 master-2 kubenswrapper[4762]: E1014 13:33:52.409283 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25dcb722-c38a-4865-b817-c7f4b4711198" containerName="extract-content" Oct 14 13:33:52.409812 master-2 kubenswrapper[4762]: I1014 13:33:52.409301 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="25dcb722-c38a-4865-b817-c7f4b4711198" containerName="extract-content" Oct 14 13:33:52.409812 master-2 kubenswrapper[4762]: E1014 13:33:52.409319 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25dcb722-c38a-4865-b817-c7f4b4711198" containerName="registry-server" 
Oct 14 13:33:52.409812 master-2 kubenswrapper[4762]: I1014 13:33:52.409327 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="25dcb722-c38a-4865-b817-c7f4b4711198" containerName="registry-server" Oct 14 13:33:52.409812 master-2 kubenswrapper[4762]: E1014 13:33:52.409343 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25dcb722-c38a-4865-b817-c7f4b4711198" containerName="extract-utilities" Oct 14 13:33:52.409812 master-2 kubenswrapper[4762]: I1014 13:33:52.409354 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="25dcb722-c38a-4865-b817-c7f4b4711198" containerName="extract-utilities" Oct 14 13:33:52.409812 master-2 kubenswrapper[4762]: I1014 13:33:52.409493 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="25dcb722-c38a-4865-b817-c7f4b4711198" containerName="registry-server" Oct 14 13:33:52.410147 master-2 kubenswrapper[4762]: I1014 13:33:52.410116 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-6-master-2" Oct 14 13:33:52.412516 master-2 kubenswrapper[4762]: I1014 13:33:52.412452 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-p7d8w" Oct 14 13:33:52.420486 master-2 kubenswrapper[4762]: I1014 13:33:52.420424 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-6-master-2"] Oct 14 13:33:52.555494 master-2 kubenswrapper[4762]: I1014 13:33:52.555309 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cfa61453-60eb-4581-a003-4d6d71f244cb-kubelet-dir\") pod \"revision-pruner-6-master-2\" (UID: \"cfa61453-60eb-4581-a003-4d6d71f244cb\") " pod="openshift-kube-apiserver/revision-pruner-6-master-2" Oct 14 13:33:52.555494 master-2 kubenswrapper[4762]: I1014 13:33:52.555366 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cfa61453-60eb-4581-a003-4d6d71f244cb-kube-api-access\") pod \"revision-pruner-6-master-2\" (UID: \"cfa61453-60eb-4581-a003-4d6d71f244cb\") " pod="openshift-kube-apiserver/revision-pruner-6-master-2" Oct 14 13:33:52.657263 master-2 kubenswrapper[4762]: I1014 13:33:52.657034 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cfa61453-60eb-4581-a003-4d6d71f244cb-kube-api-access\") pod \"revision-pruner-6-master-2\" (UID: \"cfa61453-60eb-4581-a003-4d6d71f244cb\") " pod="openshift-kube-apiserver/revision-pruner-6-master-2" Oct 14 13:33:52.657529 master-2 kubenswrapper[4762]: I1014 13:33:52.657284 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cfa61453-60eb-4581-a003-4d6d71f244cb-kubelet-dir\") pod \"revision-pruner-6-master-2\" (UID: \"cfa61453-60eb-4581-a003-4d6d71f244cb\") " pod="openshift-kube-apiserver/revision-pruner-6-master-2" Oct 14 13:33:52.657529 master-2 kubenswrapper[4762]: I1014 13:33:52.657384 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cfa61453-60eb-4581-a003-4d6d71f244cb-kubelet-dir\") pod \"revision-pruner-6-master-2\" (UID: \"cfa61453-60eb-4581-a003-4d6d71f244cb\") " pod="openshift-kube-apiserver/revision-pruner-6-master-2" Oct 14 13:33:52.682614 master-2 
kubenswrapper[4762]: I1014 13:33:52.682566 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cfa61453-60eb-4581-a003-4d6d71f244cb-kube-api-access\") pod \"revision-pruner-6-master-2\" (UID: \"cfa61453-60eb-4581-a003-4d6d71f244cb\") " pod="openshift-kube-apiserver/revision-pruner-6-master-2" Oct 14 13:33:52.728585 master-2 kubenswrapper[4762]: I1014 13:33:52.728542 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-6-master-2" Oct 14 13:33:53.220215 master-2 kubenswrapper[4762]: I1014 13:33:53.220135 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-6-master-2"] Oct 14 13:33:53.871746 master-2 kubenswrapper[4762]: I1014 13:33:53.871643 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-1-master-2"] Oct 14 13:33:53.880086 master-2 kubenswrapper[4762]: I1014 13:33:53.880029 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/installer-1-master-2"] Oct 14 13:33:54.046782 master-2 kubenswrapper[4762]: I1014 13:33:54.046685 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-6-master-2" event={"ID":"cfa61453-60eb-4581-a003-4d6d71f244cb","Type":"ContainerStarted","Data":"7c11d2ad76267e3b3a019d2a97cfb786d22b3faf12195361059cf0b7b4ee2341"} Oct 14 13:33:54.046782 master-2 kubenswrapper[4762]: I1014 13:33:54.046781 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-6-master-2" event={"ID":"cfa61453-60eb-4581-a003-4d6d71f244cb","Type":"ContainerStarted","Data":"325a93be2ba960a30cf029cf544d50927652a2eb24ba09a917fd61c0bb7c80d9"} Oct 14 13:33:54.067487 master-2 kubenswrapper[4762]: I1014 13:33:54.067391 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-6-master-2" podStartSLOduration=2.067373242 podStartE2EDuration="2.067373242s" podCreationTimestamp="2025-10-14 13:33:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:33:54.067016971 +0000 UTC m=+1663.311176130" watchObservedRunningTime="2025-10-14 13:33:54.067373242 +0000 UTC m=+1663.311532411" Oct 14 13:33:55.058082 master-2 kubenswrapper[4762]: I1014 13:33:55.057983 4762 generic.go:334] "Generic (PLEG): container finished" podID="cfa61453-60eb-4581-a003-4d6d71f244cb" containerID="7c11d2ad76267e3b3a019d2a97cfb786d22b3faf12195361059cf0b7b4ee2341" exitCode=0 Oct 14 13:33:55.058082 master-2 kubenswrapper[4762]: I1014 13:33:55.058066 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-6-master-2" event={"ID":"cfa61453-60eb-4581-a003-4d6d71f244cb","Type":"ContainerDied","Data":"7c11d2ad76267e3b3a019d2a97cfb786d22b3faf12195361059cf0b7b4ee2341"} Oct 14 13:33:55.557849 master-2 kubenswrapper[4762]: I1014 13:33:55.557782 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d1479cd-b121-44d6-af25-3bc9b573c89f" path="/var/lib/kubelet/pods/8d1479cd-b121-44d6-af25-3bc9b573c89f/volumes" Oct 14 13:33:56.433956 master-2 kubenswrapper[4762]: I1014 13:33:56.433898 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-6-master-2" Oct 14 13:33:56.616867 master-2 kubenswrapper[4762]: I1014 13:33:56.616739 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cfa61453-60eb-4581-a003-4d6d71f244cb-kube-api-access\") pod \"cfa61453-60eb-4581-a003-4d6d71f244cb\" (UID: \"cfa61453-60eb-4581-a003-4d6d71f244cb\") " Oct 14 13:33:56.617335 master-2 kubenswrapper[4762]: I1014 13:33:56.616899 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cfa61453-60eb-4581-a003-4d6d71f244cb-kubelet-dir\") pod \"cfa61453-60eb-4581-a003-4d6d71f244cb\" (UID: \"cfa61453-60eb-4581-a003-4d6d71f244cb\") " Oct 14 13:33:56.617335 master-2 kubenswrapper[4762]: I1014 13:33:56.617094 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cfa61453-60eb-4581-a003-4d6d71f244cb-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "cfa61453-60eb-4581-a003-4d6d71f244cb" (UID: "cfa61453-60eb-4581-a003-4d6d71f244cb"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:33:56.617528 master-2 kubenswrapper[4762]: I1014 13:33:56.617487 4762 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cfa61453-60eb-4581-a003-4d6d71f244cb-kubelet-dir\") on node \"master-2\" DevicePath \"\"" Oct 14 13:33:56.621450 master-2 kubenswrapper[4762]: I1014 13:33:56.621289 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfa61453-60eb-4581-a003-4d6d71f244cb-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "cfa61453-60eb-4581-a003-4d6d71f244cb" (UID: "cfa61453-60eb-4581-a003-4d6d71f244cb"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:33:56.718455 master-2 kubenswrapper[4762]: I1014 13:33:56.718388 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cfa61453-60eb-4581-a003-4d6d71f244cb-kube-api-access\") on node \"master-2\" DevicePath \"\"" Oct 14 13:33:57.076593 master-2 kubenswrapper[4762]: I1014 13:33:57.076509 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-6-master-2" event={"ID":"cfa61453-60eb-4581-a003-4d6d71f244cb","Type":"ContainerDied","Data":"325a93be2ba960a30cf029cf544d50927652a2eb24ba09a917fd61c0bb7c80d9"} Oct 14 13:33:57.076593 master-2 kubenswrapper[4762]: I1014 13:33:57.076568 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="325a93be2ba960a30cf029cf544d50927652a2eb24ba09a917fd61c0bb7c80d9" Oct 14 13:33:57.076593 master-2 kubenswrapper[4762]: I1014 13:33:57.076593 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-6-master-2" Oct 14 13:34:09.137761 master-2 kubenswrapper[4762]: I1014 13:34:09.132354 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-66d4475947-8hmrn"] Oct 14 13:34:09.137761 master-2 kubenswrapper[4762]: E1014 13:34:09.132679 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfa61453-60eb-4581-a003-4d6d71f244cb" containerName="pruner" Oct 14 13:34:09.137761 master-2 kubenswrapper[4762]: I1014 13:34:09.132695 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfa61453-60eb-4581-a003-4d6d71f244cb" containerName="pruner" Oct 14 13:34:09.137761 master-2 kubenswrapper[4762]: I1014 13:34:09.132807 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfa61453-60eb-4581-a003-4d6d71f244cb" containerName="pruner" Oct 14 13:34:09.137761 master-2 kubenswrapper[4762]: I1014 13:34:09.133740 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66d4475947-8hmrn" Oct 14 13:34:09.139980 master-2 kubenswrapper[4762]: I1014 13:34:09.139945 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 14 13:34:09.140051 master-2 kubenswrapper[4762]: I1014 13:34:09.139914 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 14 13:34:09.140085 master-2 kubenswrapper[4762]: I1014 13:34:09.140040 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 14 13:34:09.140215 master-2 kubenswrapper[4762]: I1014 13:34:09.139955 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 14 13:34:09.153618 master-2 kubenswrapper[4762]: I1014 13:34:09.153573 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66d4475947-8hmrn"] Oct 14 13:34:09.205948 master-2 kubenswrapper[4762]: I1014 13:34:09.205874 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6qq5\" (UniqueName: \"kubernetes.io/projected/deb1782f-23bf-4ded-a9c7-d6a3993ec24f-kube-api-access-c6qq5\") pod \"dnsmasq-dns-66d4475947-8hmrn\" (UID: \"deb1782f-23bf-4ded-a9c7-d6a3993ec24f\") " pod="openstack/dnsmasq-dns-66d4475947-8hmrn" Oct 14 13:34:09.205948 master-2 kubenswrapper[4762]: I1014 13:34:09.205930 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/deb1782f-23bf-4ded-a9c7-d6a3993ec24f-dns-svc\") pod \"dnsmasq-dns-66d4475947-8hmrn\" (UID: \"deb1782f-23bf-4ded-a9c7-d6a3993ec24f\") " pod="openstack/dnsmasq-dns-66d4475947-8hmrn" Oct 14 13:34:09.206213 master-2 kubenswrapper[4762]: I1014 13:34:09.206015 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/deb1782f-23bf-4ded-a9c7-d6a3993ec24f-config\") pod \"dnsmasq-dns-66d4475947-8hmrn\" (UID: \"deb1782f-23bf-4ded-a9c7-d6a3993ec24f\") " pod="openstack/dnsmasq-dns-66d4475947-8hmrn" Oct 14 13:34:09.307563 master-2 kubenswrapper[4762]: I1014 13:34:09.307398 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6qq5\" (UniqueName: \"kubernetes.io/projected/deb1782f-23bf-4ded-a9c7-d6a3993ec24f-kube-api-access-c6qq5\") pod \"dnsmasq-dns-66d4475947-8hmrn\" (UID: \"deb1782f-23bf-4ded-a9c7-d6a3993ec24f\") " 
pod="openstack/dnsmasq-dns-66d4475947-8hmrn" Oct 14 13:34:09.307563 master-2 kubenswrapper[4762]: I1014 13:34:09.307492 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/deb1782f-23bf-4ded-a9c7-d6a3993ec24f-dns-svc\") pod \"dnsmasq-dns-66d4475947-8hmrn\" (UID: \"deb1782f-23bf-4ded-a9c7-d6a3993ec24f\") " pod="openstack/dnsmasq-dns-66d4475947-8hmrn" Oct 14 13:34:09.307805 master-2 kubenswrapper[4762]: I1014 13:34:09.307569 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/deb1782f-23bf-4ded-a9c7-d6a3993ec24f-config\") pod \"dnsmasq-dns-66d4475947-8hmrn\" (UID: \"deb1782f-23bf-4ded-a9c7-d6a3993ec24f\") " pod="openstack/dnsmasq-dns-66d4475947-8hmrn" Oct 14 13:34:09.309043 master-2 kubenswrapper[4762]: I1014 13:34:09.308984 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/deb1782f-23bf-4ded-a9c7-d6a3993ec24f-dns-svc\") pod \"dnsmasq-dns-66d4475947-8hmrn\" (UID: \"deb1782f-23bf-4ded-a9c7-d6a3993ec24f\") " pod="openstack/dnsmasq-dns-66d4475947-8hmrn" Oct 14 13:34:09.309614 master-2 kubenswrapper[4762]: I1014 13:34:09.309568 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/deb1782f-23bf-4ded-a9c7-d6a3993ec24f-config\") pod \"dnsmasq-dns-66d4475947-8hmrn\" (UID: \"deb1782f-23bf-4ded-a9c7-d6a3993ec24f\") " pod="openstack/dnsmasq-dns-66d4475947-8hmrn" Oct 14 13:34:09.338493 master-2 kubenswrapper[4762]: I1014 13:34:09.338442 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6qq5\" (UniqueName: \"kubernetes.io/projected/deb1782f-23bf-4ded-a9c7-d6a3993ec24f-kube-api-access-c6qq5\") pod \"dnsmasq-dns-66d4475947-8hmrn\" (UID: \"deb1782f-23bf-4ded-a9c7-d6a3993ec24f\") " pod="openstack/dnsmasq-dns-66d4475947-8hmrn" Oct 14 13:34:09.339929 master-2 kubenswrapper[4762]: I1014 13:34:09.339853 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jc2hh"] Oct 14 13:34:09.344933 master-2 kubenswrapper[4762]: I1014 13:34:09.344887 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jc2hh" Oct 14 13:34:09.358058 master-2 kubenswrapper[4762]: I1014 13:34:09.357991 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jc2hh"] Oct 14 13:34:09.465888 master-2 kubenswrapper[4762]: I1014 13:34:09.465846 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-66d4475947-8hmrn" Oct 14 13:34:09.512091 master-2 kubenswrapper[4762]: I1014 13:34:09.511899 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9khts\" (UniqueName: \"kubernetes.io/projected/4e95b136-ddd5-429b-a1b1-585d64385ae0-kube-api-access-9khts\") pod \"redhat-operators-jc2hh\" (UID: \"4e95b136-ddd5-429b-a1b1-585d64385ae0\") " pod="openshift-marketplace/redhat-operators-jc2hh" Oct 14 13:34:09.512091 master-2 kubenswrapper[4762]: I1014 13:34:09.511970 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e95b136-ddd5-429b-a1b1-585d64385ae0-utilities\") pod \"redhat-operators-jc2hh\" (UID: \"4e95b136-ddd5-429b-a1b1-585d64385ae0\") " pod="openshift-marketplace/redhat-operators-jc2hh" Oct 14 13:34:09.512091 master-2 kubenswrapper[4762]: I1014 13:34:09.512004 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e95b136-ddd5-429b-a1b1-585d64385ae0-catalog-content\") pod \"redhat-operators-jc2hh\" (UID: \"4e95b136-ddd5-429b-a1b1-585d64385ae0\") " pod="openshift-marketplace/redhat-operators-jc2hh" Oct 14 13:34:09.614346 master-2 kubenswrapper[4762]: I1014 13:34:09.613365 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e95b136-ddd5-429b-a1b1-585d64385ae0-catalog-content\") pod \"redhat-operators-jc2hh\" (UID: \"4e95b136-ddd5-429b-a1b1-585d64385ae0\") " pod="openshift-marketplace/redhat-operators-jc2hh" Oct 14 13:34:09.614346 master-2 kubenswrapper[4762]: I1014 13:34:09.613499 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9khts\" (UniqueName: \"kubernetes.io/projected/4e95b136-ddd5-429b-a1b1-585d64385ae0-kube-api-access-9khts\") pod \"redhat-operators-jc2hh\" (UID: \"4e95b136-ddd5-429b-a1b1-585d64385ae0\") " pod="openshift-marketplace/redhat-operators-jc2hh" Oct 14 13:34:09.614346 master-2 kubenswrapper[4762]: I1014 13:34:09.613543 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e95b136-ddd5-429b-a1b1-585d64385ae0-utilities\") pod \"redhat-operators-jc2hh\" (UID: \"4e95b136-ddd5-429b-a1b1-585d64385ae0\") " pod="openshift-marketplace/redhat-operators-jc2hh" Oct 14 13:34:09.614346 master-2 kubenswrapper[4762]: I1014 13:34:09.614137 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e95b136-ddd5-429b-a1b1-585d64385ae0-catalog-content\") pod \"redhat-operators-jc2hh\" (UID: \"4e95b136-ddd5-429b-a1b1-585d64385ae0\") " pod="openshift-marketplace/redhat-operators-jc2hh" Oct 14 13:34:09.614346 master-2 kubenswrapper[4762]: I1014 13:34:09.614242 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e95b136-ddd5-429b-a1b1-585d64385ae0-utilities\") pod \"redhat-operators-jc2hh\" (UID: \"4e95b136-ddd5-429b-a1b1-585d64385ae0\") " pod="openshift-marketplace/redhat-operators-jc2hh" Oct 14 13:34:09.638772 master-2 kubenswrapper[4762]: I1014 13:34:09.638708 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9khts\" (UniqueName: 
\"kubernetes.io/projected/4e95b136-ddd5-429b-a1b1-585d64385ae0-kube-api-access-9khts\") pod \"redhat-operators-jc2hh\" (UID: \"4e95b136-ddd5-429b-a1b1-585d64385ae0\") " pod="openshift-marketplace/redhat-operators-jc2hh" Oct 14 13:34:09.697509 master-2 kubenswrapper[4762]: I1014 13:34:09.697378 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jc2hh" Oct 14 13:34:09.907886 master-2 kubenswrapper[4762]: W1014 13:34:09.907818 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddeb1782f_23bf_4ded_a9c7_d6a3993ec24f.slice/crio-38d9c237c3abc31b44e0a119e713c5d927c4d7048916148c132bc0b3337f9d2c WatchSource:0}: Error finding container 38d9c237c3abc31b44e0a119e713c5d927c4d7048916148c132bc0b3337f9d2c: Status 404 returned error can't find the container with id 38d9c237c3abc31b44e0a119e713c5d927c4d7048916148c132bc0b3337f9d2c Oct 14 13:34:09.909541 master-2 kubenswrapper[4762]: I1014 13:34:09.908861 4762 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 14 13:34:10.015529 master-2 kubenswrapper[4762]: I1014 13:34:10.015444 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66d4475947-8hmrn"] Oct 14 13:34:10.126615 master-2 kubenswrapper[4762]: I1014 13:34:10.126556 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jc2hh"] Oct 14 13:34:10.127869 master-2 kubenswrapper[4762]: W1014 13:34:10.127792 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e95b136_ddd5_429b_a1b1_585d64385ae0.slice/crio-317f77e039ed406f05e66bd4350d53e76248cc3447e6237bba447515cc44739e WatchSource:0}: Error finding container 317f77e039ed406f05e66bd4350d53e76248cc3447e6237bba447515cc44739e: Status 404 returned error can't find the container with id 317f77e039ed406f05e66bd4350d53e76248cc3447e6237bba447515cc44739e Oct 14 13:34:10.180878 master-2 kubenswrapper[4762]: I1014 13:34:10.180690 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66d4475947-8hmrn" event={"ID":"deb1782f-23bf-4ded-a9c7-d6a3993ec24f","Type":"ContainerStarted","Data":"38d9c237c3abc31b44e0a119e713c5d927c4d7048916148c132bc0b3337f9d2c"} Oct 14 13:34:10.183784 master-2 kubenswrapper[4762]: I1014 13:34:10.183078 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jc2hh" event={"ID":"4e95b136-ddd5-429b-a1b1-585d64385ae0","Type":"ContainerStarted","Data":"317f77e039ed406f05e66bd4350d53e76248cc3447e6237bba447515cc44739e"} Oct 14 13:34:11.231244 master-2 kubenswrapper[4762]: I1014 13:34:11.230681 4762 generic.go:334] "Generic (PLEG): container finished" podID="4e95b136-ddd5-429b-a1b1-585d64385ae0" containerID="7779af00cd3dba96282ee49ba8d4a2df1669b3b8682c05866ceb4d2b7ef1991e" exitCode=0 Oct 14 13:34:11.231244 master-2 kubenswrapper[4762]: I1014 13:34:11.230741 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jc2hh" event={"ID":"4e95b136-ddd5-429b-a1b1-585d64385ae0","Type":"ContainerDied","Data":"7779af00cd3dba96282ee49ba8d4a2df1669b3b8682c05866ceb4d2b7ef1991e"} Oct 14 13:34:12.242256 master-2 kubenswrapper[4762]: I1014 13:34:12.242185 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jc2hh" 
event={"ID":"4e95b136-ddd5-429b-a1b1-585d64385ae0","Type":"ContainerStarted","Data":"ff9fa5e8c5e6baf2ebf596c95e8a821ff03f6032f7df38f2832b079bd17efc1a"} Oct 14 13:34:13.252615 master-2 kubenswrapper[4762]: I1014 13:34:13.252556 4762 generic.go:334] "Generic (PLEG): container finished" podID="4e95b136-ddd5-429b-a1b1-585d64385ae0" containerID="ff9fa5e8c5e6baf2ebf596c95e8a821ff03f6032f7df38f2832b079bd17efc1a" exitCode=0 Oct 14 13:34:13.252615 master-2 kubenswrapper[4762]: I1014 13:34:13.252601 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jc2hh" event={"ID":"4e95b136-ddd5-429b-a1b1-585d64385ae0","Type":"ContainerDied","Data":"ff9fa5e8c5e6baf2ebf596c95e8a821ff03f6032f7df38f2832b079bd17efc1a"} Oct 14 13:34:13.544715 master-2 kubenswrapper[4762]: I1014 13:34:13.544534 4762 scope.go:117] "RemoveContainer" containerID="97643b9390c296873cc51961bb7ec70a80fba35c5e5c0df557ef033fe557b704" Oct 14 13:34:21.723765 master-2 kubenswrapper[4762]: I1014 13:34:21.723688 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-1"] Oct 14 13:34:21.726486 master-2 kubenswrapper[4762]: I1014 13:34:21.726448 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-1" Oct 14 13:34:21.729403 master-2 kubenswrapper[4762]: I1014 13:34:21.729349 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Oct 14 13:34:21.729480 master-2 kubenswrapper[4762]: I1014 13:34:21.729439 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Oct 14 13:34:21.730366 master-2 kubenswrapper[4762]: I1014 13:34:21.730324 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Oct 14 13:34:21.917073 master-2 kubenswrapper[4762]: I1014 13:34:21.916977 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnxkd\" (UniqueName: \"kubernetes.io/projected/d4485526-8eb4-41e6-a1bc-68754346b4f1-kube-api-access-fnxkd\") pod \"alertmanager-metric-storage-1\" (UID: \"d4485526-8eb4-41e6-a1bc-68754346b4f1\") " pod="openstack/alertmanager-metric-storage-1" Oct 14 13:34:21.917390 master-2 kubenswrapper[4762]: I1014 13:34:21.917286 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d4485526-8eb4-41e6-a1bc-68754346b4f1-web-config\") pod \"alertmanager-metric-storage-1\" (UID: \"d4485526-8eb4-41e6-a1bc-68754346b4f1\") " pod="openstack/alertmanager-metric-storage-1" Oct 14 13:34:21.917539 master-2 kubenswrapper[4762]: I1014 13:34:21.917503 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d4485526-8eb4-41e6-a1bc-68754346b4f1-config-out\") pod \"alertmanager-metric-storage-1\" (UID: \"d4485526-8eb4-41e6-a1bc-68754346b4f1\") " pod="openstack/alertmanager-metric-storage-1" Oct 14 13:34:21.917638 master-2 kubenswrapper[4762]: I1014 13:34:21.917605 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/d4485526-8eb4-41e6-a1bc-68754346b4f1-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-1\" (UID: 
\"d4485526-8eb4-41e6-a1bc-68754346b4f1\") " pod="openstack/alertmanager-metric-storage-1" Oct 14 13:34:21.917716 master-2 kubenswrapper[4762]: I1014 13:34:21.917669 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d4485526-8eb4-41e6-a1bc-68754346b4f1-config-volume\") pod \"alertmanager-metric-storage-1\" (UID: \"d4485526-8eb4-41e6-a1bc-68754346b4f1\") " pod="openstack/alertmanager-metric-storage-1" Oct 14 13:34:21.917769 master-2 kubenswrapper[4762]: I1014 13:34:21.917748 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d4485526-8eb4-41e6-a1bc-68754346b4f1-tls-assets\") pod \"alertmanager-metric-storage-1\" (UID: \"d4485526-8eb4-41e6-a1bc-68754346b4f1\") " pod="openstack/alertmanager-metric-storage-1" Oct 14 13:34:22.019353 master-2 kubenswrapper[4762]: I1014 13:34:22.019145 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d4485526-8eb4-41e6-a1bc-68754346b4f1-web-config\") pod \"alertmanager-metric-storage-1\" (UID: \"d4485526-8eb4-41e6-a1bc-68754346b4f1\") " pod="openstack/alertmanager-metric-storage-1" Oct 14 13:34:22.019353 master-2 kubenswrapper[4762]: I1014 13:34:22.019315 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d4485526-8eb4-41e6-a1bc-68754346b4f1-config-out\") pod \"alertmanager-metric-storage-1\" (UID: \"d4485526-8eb4-41e6-a1bc-68754346b4f1\") " pod="openstack/alertmanager-metric-storage-1" Oct 14 13:34:22.019717 master-2 kubenswrapper[4762]: I1014 13:34:22.019366 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/d4485526-8eb4-41e6-a1bc-68754346b4f1-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-1\" (UID: \"d4485526-8eb4-41e6-a1bc-68754346b4f1\") " pod="openstack/alertmanager-metric-storage-1" Oct 14 13:34:22.019717 master-2 kubenswrapper[4762]: I1014 13:34:22.019404 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d4485526-8eb4-41e6-a1bc-68754346b4f1-config-volume\") pod \"alertmanager-metric-storage-1\" (UID: \"d4485526-8eb4-41e6-a1bc-68754346b4f1\") " pod="openstack/alertmanager-metric-storage-1" Oct 14 13:34:22.019717 master-2 kubenswrapper[4762]: I1014 13:34:22.019449 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d4485526-8eb4-41e6-a1bc-68754346b4f1-tls-assets\") pod \"alertmanager-metric-storage-1\" (UID: \"d4485526-8eb4-41e6-a1bc-68754346b4f1\") " pod="openstack/alertmanager-metric-storage-1" Oct 14 13:34:22.019717 master-2 kubenswrapper[4762]: I1014 13:34:22.019508 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnxkd\" (UniqueName: \"kubernetes.io/projected/d4485526-8eb4-41e6-a1bc-68754346b4f1-kube-api-access-fnxkd\") pod \"alertmanager-metric-storage-1\" (UID: \"d4485526-8eb4-41e6-a1bc-68754346b4f1\") " pod="openstack/alertmanager-metric-storage-1" Oct 14 13:34:22.020501 master-2 kubenswrapper[4762]: I1014 13:34:22.020419 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: 
\"kubernetes.io/empty-dir/d4485526-8eb4-41e6-a1bc-68754346b4f1-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-1\" (UID: \"d4485526-8eb4-41e6-a1bc-68754346b4f1\") " pod="openstack/alertmanager-metric-storage-1" Oct 14 13:34:22.023528 master-2 kubenswrapper[4762]: I1014 13:34:22.023465 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d4485526-8eb4-41e6-a1bc-68754346b4f1-web-config\") pod \"alertmanager-metric-storage-1\" (UID: \"d4485526-8eb4-41e6-a1bc-68754346b4f1\") " pod="openstack/alertmanager-metric-storage-1" Oct 14 13:34:22.024350 master-2 kubenswrapper[4762]: I1014 13:34:22.024279 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d4485526-8eb4-41e6-a1bc-68754346b4f1-config-out\") pod \"alertmanager-metric-storage-1\" (UID: \"d4485526-8eb4-41e6-a1bc-68754346b4f1\") " pod="openstack/alertmanager-metric-storage-1" Oct 14 13:34:22.026667 master-2 kubenswrapper[4762]: I1014 13:34:22.026579 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d4485526-8eb4-41e6-a1bc-68754346b4f1-tls-assets\") pod \"alertmanager-metric-storage-1\" (UID: \"d4485526-8eb4-41e6-a1bc-68754346b4f1\") " pod="openstack/alertmanager-metric-storage-1" Oct 14 13:34:22.027087 master-2 kubenswrapper[4762]: I1014 13:34:22.026745 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d4485526-8eb4-41e6-a1bc-68754346b4f1-config-volume\") pod \"alertmanager-metric-storage-1\" (UID: \"d4485526-8eb4-41e6-a1bc-68754346b4f1\") " pod="openstack/alertmanager-metric-storage-1" Oct 14 13:34:22.044910 master-2 kubenswrapper[4762]: I1014 13:34:22.044801 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnxkd\" (UniqueName: \"kubernetes.io/projected/d4485526-8eb4-41e6-a1bc-68754346b4f1-kube-api-access-fnxkd\") pod \"alertmanager-metric-storage-1\" (UID: \"d4485526-8eb4-41e6-a1bc-68754346b4f1\") " pod="openstack/alertmanager-metric-storage-1" Oct 14 13:34:22.054650 master-2 kubenswrapper[4762]: I1014 13:34:22.054550 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-1" Oct 14 13:34:22.063363 master-2 kubenswrapper[4762]: I1014 13:34:22.063119 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-1"] Oct 14 13:34:22.950094 master-2 kubenswrapper[4762]: I1014 13:34:22.949989 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-1"] Oct 14 13:34:22.952029 master-2 kubenswrapper[4762]: I1014 13:34:22.951980 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-1" Oct 14 13:34:22.956756 master-2 kubenswrapper[4762]: I1014 13:34:22.956025 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 14 13:34:22.956756 master-2 kubenswrapper[4762]: I1014 13:34:22.956356 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 14 13:34:22.956756 master-2 kubenswrapper[4762]: I1014 13:34:22.956687 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 14 13:34:22.957198 master-2 kubenswrapper[4762]: I1014 13:34:22.957126 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 14 13:34:22.957589 master-2 kubenswrapper[4762]: I1014 13:34:22.957546 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 14 13:34:22.958939 master-2 kubenswrapper[4762]: I1014 13:34:22.957752 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 14 13:34:23.135119 master-2 kubenswrapper[4762]: I1014 13:34:23.134940 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/65011ee0-036e-4a85-9ca7-182d37e3345c-pod-info\") pod \"rabbitmq-server-1\" (UID: \"65011ee0-036e-4a85-9ca7-182d37e3345c\") " pod="openstack/rabbitmq-server-1" Oct 14 13:34:23.135119 master-2 kubenswrapper[4762]: I1014 13:34:23.135028 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/65011ee0-036e-4a85-9ca7-182d37e3345c-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"65011ee0-036e-4a85-9ca7-182d37e3345c\") " pod="openstack/rabbitmq-server-1" Oct 14 13:34:23.135736 master-2 kubenswrapper[4762]: I1014 13:34:23.135283 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9g4z\" (UniqueName: \"kubernetes.io/projected/65011ee0-036e-4a85-9ca7-182d37e3345c-kube-api-access-c9g4z\") pod \"rabbitmq-server-1\" (UID: \"65011ee0-036e-4a85-9ca7-182d37e3345c\") " pod="openstack/rabbitmq-server-1" Oct 14 13:34:23.135736 master-2 kubenswrapper[4762]: I1014 13:34:23.135444 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/65011ee0-036e-4a85-9ca7-182d37e3345c-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"65011ee0-036e-4a85-9ca7-182d37e3345c\") " pod="openstack/rabbitmq-server-1" Oct 14 13:34:23.135878 master-2 kubenswrapper[4762]: I1014 13:34:23.135768 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/65011ee0-036e-4a85-9ca7-182d37e3345c-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"65011ee0-036e-4a85-9ca7-182d37e3345c\") " pod="openstack/rabbitmq-server-1" Oct 14 13:34:23.135878 master-2 kubenswrapper[4762]: I1014 13:34:23.135813 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/65011ee0-036e-4a85-9ca7-182d37e3345c-server-conf\") pod \"rabbitmq-server-1\" (UID: \"65011ee0-036e-4a85-9ca7-182d37e3345c\") " pod="openstack/rabbitmq-server-1" Oct 14 13:34:23.136020 
master-2 kubenswrapper[4762]: I1014 13:34:23.135888 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/65011ee0-036e-4a85-9ca7-182d37e3345c-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"65011ee0-036e-4a85-9ca7-182d37e3345c\") " pod="openstack/rabbitmq-server-1" Oct 14 13:34:23.136088 master-2 kubenswrapper[4762]: I1014 13:34:23.136026 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1f8aafb7-ddfd-40fe-8f59-341aeb92a7ad\" (UniqueName: \"kubernetes.io/csi/topolvm.io^0bae2f13-c5c1-468a-bc5e-fa3e08565df5\") pod \"rabbitmq-server-1\" (UID: \"65011ee0-036e-4a85-9ca7-182d37e3345c\") " pod="openstack/rabbitmq-server-1" Oct 14 13:34:23.136182 master-2 kubenswrapper[4762]: I1014 13:34:23.136083 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/65011ee0-036e-4a85-9ca7-182d37e3345c-config-data\") pod \"rabbitmq-server-1\" (UID: \"65011ee0-036e-4a85-9ca7-182d37e3345c\") " pod="openstack/rabbitmq-server-1" Oct 14 13:34:23.136276 master-2 kubenswrapper[4762]: I1014 13:34:23.136227 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/65011ee0-036e-4a85-9ca7-182d37e3345c-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"65011ee0-036e-4a85-9ca7-182d37e3345c\") " pod="openstack/rabbitmq-server-1" Oct 14 13:34:23.136353 master-2 kubenswrapper[4762]: I1014 13:34:23.136321 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/65011ee0-036e-4a85-9ca7-182d37e3345c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"65011ee0-036e-4a85-9ca7-182d37e3345c\") " pod="openstack/rabbitmq-server-1" Oct 14 13:34:23.238259 master-2 kubenswrapper[4762]: I1014 13:34:23.238130 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9g4z\" (UniqueName: \"kubernetes.io/projected/65011ee0-036e-4a85-9ca7-182d37e3345c-kube-api-access-c9g4z\") pod \"rabbitmq-server-1\" (UID: \"65011ee0-036e-4a85-9ca7-182d37e3345c\") " pod="openstack/rabbitmq-server-1" Oct 14 13:34:23.238259 master-2 kubenswrapper[4762]: I1014 13:34:23.238261 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/65011ee0-036e-4a85-9ca7-182d37e3345c-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"65011ee0-036e-4a85-9ca7-182d37e3345c\") " pod="openstack/rabbitmq-server-1" Oct 14 13:34:23.238852 master-2 kubenswrapper[4762]: I1014 13:34:23.238332 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/65011ee0-036e-4a85-9ca7-182d37e3345c-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"65011ee0-036e-4a85-9ca7-182d37e3345c\") " pod="openstack/rabbitmq-server-1" Oct 14 13:34:23.238852 master-2 kubenswrapper[4762]: I1014 13:34:23.238366 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/65011ee0-036e-4a85-9ca7-182d37e3345c-server-conf\") pod \"rabbitmq-server-1\" (UID: \"65011ee0-036e-4a85-9ca7-182d37e3345c\") " pod="openstack/rabbitmq-server-1" Oct 14 13:34:23.238852 master-2 
kubenswrapper[4762]: I1014 13:34:23.238410 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/65011ee0-036e-4a85-9ca7-182d37e3345c-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"65011ee0-036e-4a85-9ca7-182d37e3345c\") " pod="openstack/rabbitmq-server-1" Oct 14 13:34:23.238852 master-2 kubenswrapper[4762]: I1014 13:34:23.238457 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-1f8aafb7-ddfd-40fe-8f59-341aeb92a7ad\" (UniqueName: \"kubernetes.io/csi/topolvm.io^0bae2f13-c5c1-468a-bc5e-fa3e08565df5\") pod \"rabbitmq-server-1\" (UID: \"65011ee0-036e-4a85-9ca7-182d37e3345c\") " pod="openstack/rabbitmq-server-1" Oct 14 13:34:23.238852 master-2 kubenswrapper[4762]: I1014 13:34:23.238489 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/65011ee0-036e-4a85-9ca7-182d37e3345c-config-data\") pod \"rabbitmq-server-1\" (UID: \"65011ee0-036e-4a85-9ca7-182d37e3345c\") " pod="openstack/rabbitmq-server-1" Oct 14 13:34:23.238852 master-2 kubenswrapper[4762]: I1014 13:34:23.238544 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/65011ee0-036e-4a85-9ca7-182d37e3345c-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"65011ee0-036e-4a85-9ca7-182d37e3345c\") " pod="openstack/rabbitmq-server-1" Oct 14 13:34:23.238852 master-2 kubenswrapper[4762]: I1014 13:34:23.238597 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/65011ee0-036e-4a85-9ca7-182d37e3345c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"65011ee0-036e-4a85-9ca7-182d37e3345c\") " pod="openstack/rabbitmq-server-1" Oct 14 13:34:23.238852 master-2 kubenswrapper[4762]: I1014 13:34:23.238630 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/65011ee0-036e-4a85-9ca7-182d37e3345c-pod-info\") pod \"rabbitmq-server-1\" (UID: \"65011ee0-036e-4a85-9ca7-182d37e3345c\") " pod="openstack/rabbitmq-server-1" Oct 14 13:34:23.238852 master-2 kubenswrapper[4762]: I1014 13:34:23.238672 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/65011ee0-036e-4a85-9ca7-182d37e3345c-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"65011ee0-036e-4a85-9ca7-182d37e3345c\") " pod="openstack/rabbitmq-server-1" Oct 14 13:34:23.240728 master-2 kubenswrapper[4762]: I1014 13:34:23.240657 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/65011ee0-036e-4a85-9ca7-182d37e3345c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"65011ee0-036e-4a85-9ca7-182d37e3345c\") " pod="openstack/rabbitmq-server-1" Oct 14 13:34:23.240924 master-2 kubenswrapper[4762]: I1014 13:34:23.240775 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/65011ee0-036e-4a85-9ca7-182d37e3345c-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"65011ee0-036e-4a85-9ca7-182d37e3345c\") " pod="openstack/rabbitmq-server-1" Oct 14 13:34:23.242308 master-2 kubenswrapper[4762]: I1014 13:34:23.242263 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/65011ee0-036e-4a85-9ca7-182d37e3345c-config-data\") pod \"rabbitmq-server-1\" (UID: \"65011ee0-036e-4a85-9ca7-182d37e3345c\") " pod="openstack/rabbitmq-server-1" Oct 14 13:34:23.242519 master-2 kubenswrapper[4762]: I1014 13:34:23.242393 4762 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 14 13:34:23.242682 master-2 kubenswrapper[4762]: I1014 13:34:23.242630 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/65011ee0-036e-4a85-9ca7-182d37e3345c-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"65011ee0-036e-4a85-9ca7-182d37e3345c\") " pod="openstack/rabbitmq-server-1" Oct 14 13:34:23.242811 master-2 kubenswrapper[4762]: I1014 13:34:23.242778 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1f8aafb7-ddfd-40fe-8f59-341aeb92a7ad\" (UniqueName: \"kubernetes.io/csi/topolvm.io^0bae2f13-c5c1-468a-bc5e-fa3e08565df5\") pod \"rabbitmq-server-1\" (UID: \"65011ee0-036e-4a85-9ca7-182d37e3345c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/f1325f37bc9a951d3365bc909b990b6d5d9daaa0733bd501af8f64eb45f0e779/globalmount\"" pod="openstack/rabbitmq-server-1" Oct 14 13:34:23.243092 master-2 kubenswrapper[4762]: I1014 13:34:23.242853 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/65011ee0-036e-4a85-9ca7-182d37e3345c-server-conf\") pod \"rabbitmq-server-1\" (UID: \"65011ee0-036e-4a85-9ca7-182d37e3345c\") " pod="openstack/rabbitmq-server-1" Oct 14 13:34:23.243397 master-2 kubenswrapper[4762]: I1014 13:34:23.243363 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/65011ee0-036e-4a85-9ca7-182d37e3345c-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"65011ee0-036e-4a85-9ca7-182d37e3345c\") " pod="openstack/rabbitmq-server-1" Oct 14 13:34:23.243583 master-2 kubenswrapper[4762]: I1014 13:34:23.243373 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/65011ee0-036e-4a85-9ca7-182d37e3345c-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"65011ee0-036e-4a85-9ca7-182d37e3345c\") " pod="openstack/rabbitmq-server-1" Oct 14 13:34:23.245742 master-2 kubenswrapper[4762]: I1014 13:34:23.245670 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/65011ee0-036e-4a85-9ca7-182d37e3345c-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"65011ee0-036e-4a85-9ca7-182d37e3345c\") " pod="openstack/rabbitmq-server-1" Oct 14 13:34:23.247633 master-2 kubenswrapper[4762]: I1014 13:34:23.247570 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/65011ee0-036e-4a85-9ca7-182d37e3345c-pod-info\") pod \"rabbitmq-server-1\" (UID: \"65011ee0-036e-4a85-9ca7-182d37e3345c\") " pod="openstack/rabbitmq-server-1" Oct 14 13:34:23.265178 master-2 kubenswrapper[4762]: I1014 13:34:23.265080 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9g4z\" (UniqueName: \"kubernetes.io/projected/65011ee0-036e-4a85-9ca7-182d37e3345c-kube-api-access-c9g4z\") pod \"rabbitmq-server-1\" (UID: \"65011ee0-036e-4a85-9ca7-182d37e3345c\") " 
pod="openstack/rabbitmq-server-1" Oct 14 13:34:23.371312 master-2 kubenswrapper[4762]: I1014 13:34:23.371207 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-1"] Oct 14 13:34:24.688662 master-2 kubenswrapper[4762]: I1014 13:34:24.688534 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-1"] Oct 14 13:34:24.732119 master-2 kubenswrapper[4762]: I1014 13:34:24.732042 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1f8aafb7-ddfd-40fe-8f59-341aeb92a7ad\" (UniqueName: \"kubernetes.io/csi/topolvm.io^0bae2f13-c5c1-468a-bc5e-fa3e08565df5\") pod \"rabbitmq-server-1\" (UID: \"65011ee0-036e-4a85-9ca7-182d37e3345c\") " pod="openstack/rabbitmq-server-1" Oct 14 13:34:25.341243 master-2 kubenswrapper[4762]: I1014 13:34:25.341128 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-1" event={"ID":"d4485526-8eb4-41e6-a1bc-68754346b4f1","Type":"ContainerStarted","Data":"bda45eadae1023b25e9b3c2c795e8230bc86229b0d2cb712f373b8a3fd209c43"} Oct 14 13:34:25.409683 master-2 kubenswrapper[4762]: I1014 13:34:25.409607 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-1" Oct 14 13:34:28.369441 master-2 kubenswrapper[4762]: I1014 13:34:28.369387 4762 generic.go:334] "Generic (PLEG): container finished" podID="deb1782f-23bf-4ded-a9c7-d6a3993ec24f" containerID="569ac4a1fbc0cbb11c98554206d23ee3a7f6d6efc1925bd457e101f8e2b592ba" exitCode=0 Oct 14 13:34:28.370472 master-2 kubenswrapper[4762]: I1014 13:34:28.369474 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66d4475947-8hmrn" event={"ID":"deb1782f-23bf-4ded-a9c7-d6a3993ec24f","Type":"ContainerDied","Data":"569ac4a1fbc0cbb11c98554206d23ee3a7f6d6efc1925bd457e101f8e2b592ba"} Oct 14 13:34:28.372904 master-2 kubenswrapper[4762]: I1014 13:34:28.372857 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jc2hh" event={"ID":"4e95b136-ddd5-429b-a1b1-585d64385ae0","Type":"ContainerStarted","Data":"44c3742b553cd9ea420f04010288cc7500aabad443c65b80592599b5f59c1b5a"} Oct 14 13:34:28.625220 master-2 kubenswrapper[4762]: E1014 13:34:28.625104 4762 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Oct 14 13:34:28.625220 master-2 kubenswrapper[4762]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/deb1782f-23bf-4ded-a9c7-d6a3993ec24f/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 14 13:34:28.625220 master-2 kubenswrapper[4762]: > podSandboxID="38d9c237c3abc31b44e0a119e713c5d927c4d7048916148c132bc0b3337f9d2c" Oct 14 13:34:28.625619 master-2 kubenswrapper[4762]: E1014 13:34:28.625457 4762 kuberuntime_manager.go:1274] "Unhandled Error" err=< Oct 14 13:34:28.625619 master-2 kubenswrapper[4762]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:673685cea6ea2dbd78bcb555955c1b9f05ea26018f79ee34494256a5f2d7b74a,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n58fh579h64dh56ch657h674h656h9fh547h5hf7hc6h557hfdh566h66fh69h5cdhfh59fh58ch678h587h68ch675h6ch559h5f4h549h5f7h56fh586q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c6qq5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000790000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-66d4475947-8hmrn_openstack(deb1782f-23bf-4ded-a9c7-d6a3993ec24f): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/deb1782f-23bf-4ded-a9c7-d6a3993ec24f/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 14 13:34:28.625619 master-2 kubenswrapper[4762]: > logger="UnhandledError" Oct 14 13:34:28.626997 master-2 kubenswrapper[4762]: E1014 13:34:28.626935 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/deb1782f-23bf-4ded-a9c7-d6a3993ec24f/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-66d4475947-8hmrn" podUID="deb1782f-23bf-4ded-a9c7-d6a3993ec24f" Oct 14 13:34:28.899141 master-2 kubenswrapper[4762]: I1014 13:34:28.898890 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-1"] Oct 14 13:34:28.971644 master-2 kubenswrapper[4762]: I1014 13:34:28.971570 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jc2hh" podStartSLOduration=4.261017567 podStartE2EDuration="19.971545422s" podCreationTimestamp="2025-10-14 13:34:09 +0000 
UTC" firstStartedPulling="2025-10-14 13:34:11.232666656 +0000 UTC m=+1680.476825815" lastFinishedPulling="2025-10-14 13:34:26.943194511 +0000 UTC m=+1696.187353670" observedRunningTime="2025-10-14 13:34:28.918141059 +0000 UTC m=+1698.162300228" watchObservedRunningTime="2025-10-14 13:34:28.971545422 +0000 UTC m=+1698.215704581" Oct 14 13:34:29.381435 master-2 kubenswrapper[4762]: I1014 13:34:29.381374 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"65011ee0-036e-4a85-9ca7-182d37e3345c","Type":"ContainerStarted","Data":"f33b2cc31dba872e572ee09a231dc36513b2a32be73a60694318aa62c9a6570e"} Oct 14 13:34:29.697889 master-2 kubenswrapper[4762]: I1014 13:34:29.697809 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jc2hh" Oct 14 13:34:29.698184 master-2 kubenswrapper[4762]: I1014 13:34:29.698109 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jc2hh" Oct 14 13:34:29.796542 master-2 kubenswrapper[4762]: I1014 13:34:29.796001 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-6w2cz"] Oct 14 13:34:29.797241 master-2 kubenswrapper[4762]: I1014 13:34:29.797217 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6w2cz" Oct 14 13:34:29.800932 master-2 kubenswrapper[4762]: I1014 13:34:29.800898 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 14 13:34:29.801041 master-2 kubenswrapper[4762]: I1014 13:34:29.800926 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Oct 14 13:34:29.810915 master-2 kubenswrapper[4762]: I1014 13:34:29.810825 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 14 13:34:29.820616 master-2 kubenswrapper[4762]: I1014 13:34:29.820568 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6w2cz"] Oct 14 13:34:29.848352 master-2 kubenswrapper[4762]: I1014 13:34:29.848304 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-rxkr2"] Oct 14 13:34:29.852750 master-2 kubenswrapper[4762]: I1014 13:34:29.850315 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-rxkr2" Oct 14 13:34:29.893528 master-2 kubenswrapper[4762]: I1014 13:34:29.893470 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-rxkr2"] Oct 14 13:34:29.985430 master-2 kubenswrapper[4762]: I1014 13:34:29.985385 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2s2pv\" (UniqueName: \"kubernetes.io/projected/a7994a21-3973-432f-aedd-48a87c96530e-kube-api-access-2s2pv\") pod \"ovn-controller-ovs-rxkr2\" (UID: \"a7994a21-3973-432f-aedd-48a87c96530e\") " pod="openstack/ovn-controller-ovs-rxkr2" Oct 14 13:34:29.986134 master-2 kubenswrapper[4762]: I1014 13:34:29.985437 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xlj2\" (UniqueName: \"kubernetes.io/projected/4ab0fa0c-8873-41ab-b534-7c4c71350245-kube-api-access-5xlj2\") pod \"ovn-controller-6w2cz\" (UID: \"4ab0fa0c-8873-41ab-b534-7c4c71350245\") " pod="openstack/ovn-controller-6w2cz" Oct 14 13:34:29.986134 master-2 kubenswrapper[4762]: I1014 13:34:29.985662 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a7994a21-3973-432f-aedd-48a87c96530e-var-log\") pod \"ovn-controller-ovs-rxkr2\" (UID: \"a7994a21-3973-432f-aedd-48a87c96530e\") " pod="openstack/ovn-controller-ovs-rxkr2" Oct 14 13:34:29.986134 master-2 kubenswrapper[4762]: I1014 13:34:29.985696 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ab0fa0c-8873-41ab-b534-7c4c71350245-ovn-controller-tls-certs\") pod \"ovn-controller-6w2cz\" (UID: \"4ab0fa0c-8873-41ab-b534-7c4c71350245\") " pod="openstack/ovn-controller-6w2cz" Oct 14 13:34:29.986134 master-2 kubenswrapper[4762]: I1014 13:34:29.985714 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/a7994a21-3973-432f-aedd-48a87c96530e-etc-ovs\") pod \"ovn-controller-ovs-rxkr2\" (UID: \"a7994a21-3973-432f-aedd-48a87c96530e\") " pod="openstack/ovn-controller-ovs-rxkr2" Oct 14 13:34:29.986134 master-2 kubenswrapper[4762]: I1014 13:34:29.985730 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4ab0fa0c-8873-41ab-b534-7c4c71350245-var-run-ovn\") pod \"ovn-controller-6w2cz\" (UID: \"4ab0fa0c-8873-41ab-b534-7c4c71350245\") " pod="openstack/ovn-controller-6w2cz" Oct 14 13:34:29.986134 master-2 kubenswrapper[4762]: I1014 13:34:29.985748 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/a7994a21-3973-432f-aedd-48a87c96530e-var-lib\") pod \"ovn-controller-ovs-rxkr2\" (UID: \"a7994a21-3973-432f-aedd-48a87c96530e\") " pod="openstack/ovn-controller-ovs-rxkr2" Oct 14 13:34:29.986134 master-2 kubenswrapper[4762]: I1014 13:34:29.985762 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4ab0fa0c-8873-41ab-b534-7c4c71350245-var-log-ovn\") pod \"ovn-controller-6w2cz\" (UID: \"4ab0fa0c-8873-41ab-b534-7c4c71350245\") " pod="openstack/ovn-controller-6w2cz" Oct 14 13:34:29.986134 master-2 kubenswrapper[4762]: I1014 
13:34:29.985777 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4ab0fa0c-8873-41ab-b534-7c4c71350245-scripts\") pod \"ovn-controller-6w2cz\" (UID: \"4ab0fa0c-8873-41ab-b534-7c4c71350245\") " pod="openstack/ovn-controller-6w2cz" Oct 14 13:34:29.986134 master-2 kubenswrapper[4762]: I1014 13:34:29.985805 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a7994a21-3973-432f-aedd-48a87c96530e-var-run\") pod \"ovn-controller-ovs-rxkr2\" (UID: \"a7994a21-3973-432f-aedd-48a87c96530e\") " pod="openstack/ovn-controller-ovs-rxkr2" Oct 14 13:34:29.986134 master-2 kubenswrapper[4762]: I1014 13:34:29.985848 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a7994a21-3973-432f-aedd-48a87c96530e-scripts\") pod \"ovn-controller-ovs-rxkr2\" (UID: \"a7994a21-3973-432f-aedd-48a87c96530e\") " pod="openstack/ovn-controller-ovs-rxkr2" Oct 14 13:34:29.986134 master-2 kubenswrapper[4762]: I1014 13:34:29.985875 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ab0fa0c-8873-41ab-b534-7c4c71350245-combined-ca-bundle\") pod \"ovn-controller-6w2cz\" (UID: \"4ab0fa0c-8873-41ab-b534-7c4c71350245\") " pod="openstack/ovn-controller-6w2cz" Oct 14 13:34:29.986134 master-2 kubenswrapper[4762]: I1014 13:34:29.985917 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4ab0fa0c-8873-41ab-b534-7c4c71350245-var-run\") pod \"ovn-controller-6w2cz\" (UID: \"4ab0fa0c-8873-41ab-b534-7c4c71350245\") " pod="openstack/ovn-controller-6w2cz" Oct 14 13:34:30.089597 master-2 kubenswrapper[4762]: I1014 13:34:30.089330 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4ab0fa0c-8873-41ab-b534-7c4c71350245-var-run\") pod \"ovn-controller-6w2cz\" (UID: \"4ab0fa0c-8873-41ab-b534-7c4c71350245\") " pod="openstack/ovn-controller-6w2cz" Oct 14 13:34:30.089597 master-2 kubenswrapper[4762]: I1014 13:34:30.089496 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2s2pv\" (UniqueName: \"kubernetes.io/projected/a7994a21-3973-432f-aedd-48a87c96530e-kube-api-access-2s2pv\") pod \"ovn-controller-ovs-rxkr2\" (UID: \"a7994a21-3973-432f-aedd-48a87c96530e\") " pod="openstack/ovn-controller-ovs-rxkr2" Oct 14 13:34:30.089597 master-2 kubenswrapper[4762]: I1014 13:34:30.089534 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xlj2\" (UniqueName: \"kubernetes.io/projected/4ab0fa0c-8873-41ab-b534-7c4c71350245-kube-api-access-5xlj2\") pod \"ovn-controller-6w2cz\" (UID: \"4ab0fa0c-8873-41ab-b534-7c4c71350245\") " pod="openstack/ovn-controller-6w2cz" Oct 14 13:34:30.089597 master-2 kubenswrapper[4762]: I1014 13:34:30.089601 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a7994a21-3973-432f-aedd-48a87c96530e-var-log\") pod \"ovn-controller-ovs-rxkr2\" (UID: \"a7994a21-3973-432f-aedd-48a87c96530e\") " pod="openstack/ovn-controller-ovs-rxkr2" Oct 14 13:34:30.091773 master-2 kubenswrapper[4762]: I1014 
13:34:30.089627 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ab0fa0c-8873-41ab-b534-7c4c71350245-ovn-controller-tls-certs\") pod \"ovn-controller-6w2cz\" (UID: \"4ab0fa0c-8873-41ab-b534-7c4c71350245\") " pod="openstack/ovn-controller-6w2cz" Oct 14 13:34:30.091773 master-2 kubenswrapper[4762]: I1014 13:34:30.089652 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/a7994a21-3973-432f-aedd-48a87c96530e-etc-ovs\") pod \"ovn-controller-ovs-rxkr2\" (UID: \"a7994a21-3973-432f-aedd-48a87c96530e\") " pod="openstack/ovn-controller-ovs-rxkr2" Oct 14 13:34:30.091773 master-2 kubenswrapper[4762]: I1014 13:34:30.089675 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4ab0fa0c-8873-41ab-b534-7c4c71350245-var-run-ovn\") pod \"ovn-controller-6w2cz\" (UID: \"4ab0fa0c-8873-41ab-b534-7c4c71350245\") " pod="openstack/ovn-controller-6w2cz" Oct 14 13:34:30.091773 master-2 kubenswrapper[4762]: I1014 13:34:30.089699 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/a7994a21-3973-432f-aedd-48a87c96530e-var-lib\") pod \"ovn-controller-ovs-rxkr2\" (UID: \"a7994a21-3973-432f-aedd-48a87c96530e\") " pod="openstack/ovn-controller-ovs-rxkr2" Oct 14 13:34:30.091773 master-2 kubenswrapper[4762]: I1014 13:34:30.089721 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4ab0fa0c-8873-41ab-b534-7c4c71350245-var-log-ovn\") pod \"ovn-controller-6w2cz\" (UID: \"4ab0fa0c-8873-41ab-b534-7c4c71350245\") " pod="openstack/ovn-controller-6w2cz" Oct 14 13:34:30.091773 master-2 kubenswrapper[4762]: I1014 13:34:30.089739 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4ab0fa0c-8873-41ab-b534-7c4c71350245-scripts\") pod \"ovn-controller-6w2cz\" (UID: \"4ab0fa0c-8873-41ab-b534-7c4c71350245\") " pod="openstack/ovn-controller-6w2cz" Oct 14 13:34:30.091773 master-2 kubenswrapper[4762]: I1014 13:34:30.089782 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a7994a21-3973-432f-aedd-48a87c96530e-var-run\") pod \"ovn-controller-ovs-rxkr2\" (UID: \"a7994a21-3973-432f-aedd-48a87c96530e\") " pod="openstack/ovn-controller-ovs-rxkr2" Oct 14 13:34:30.091773 master-2 kubenswrapper[4762]: I1014 13:34:30.089854 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a7994a21-3973-432f-aedd-48a87c96530e-scripts\") pod \"ovn-controller-ovs-rxkr2\" (UID: \"a7994a21-3973-432f-aedd-48a87c96530e\") " pod="openstack/ovn-controller-ovs-rxkr2" Oct 14 13:34:30.091773 master-2 kubenswrapper[4762]: I1014 13:34:30.089885 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ab0fa0c-8873-41ab-b534-7c4c71350245-combined-ca-bundle\") pod \"ovn-controller-6w2cz\" (UID: \"4ab0fa0c-8873-41ab-b534-7c4c71350245\") " pod="openstack/ovn-controller-6w2cz" Oct 14 13:34:30.091773 master-2 kubenswrapper[4762]: I1014 13:34:30.089962 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/4ab0fa0c-8873-41ab-b534-7c4c71350245-var-run\") pod \"ovn-controller-6w2cz\" (UID: \"4ab0fa0c-8873-41ab-b534-7c4c71350245\") " pod="openstack/ovn-controller-6w2cz" Oct 14 13:34:30.091773 master-2 kubenswrapper[4762]: I1014 13:34:30.090074 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a7994a21-3973-432f-aedd-48a87c96530e-var-log\") pod \"ovn-controller-ovs-rxkr2\" (UID: \"a7994a21-3973-432f-aedd-48a87c96530e\") " pod="openstack/ovn-controller-ovs-rxkr2" Oct 14 13:34:30.091773 master-2 kubenswrapper[4762]: I1014 13:34:30.090093 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/a7994a21-3973-432f-aedd-48a87c96530e-etc-ovs\") pod \"ovn-controller-ovs-rxkr2\" (UID: \"a7994a21-3973-432f-aedd-48a87c96530e\") " pod="openstack/ovn-controller-ovs-rxkr2" Oct 14 13:34:30.091773 master-2 kubenswrapper[4762]: I1014 13:34:30.090112 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4ab0fa0c-8873-41ab-b534-7c4c71350245-var-log-ovn\") pod \"ovn-controller-6w2cz\" (UID: \"4ab0fa0c-8873-41ab-b534-7c4c71350245\") " pod="openstack/ovn-controller-6w2cz" Oct 14 13:34:30.091773 master-2 kubenswrapper[4762]: I1014 13:34:30.090203 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4ab0fa0c-8873-41ab-b534-7c4c71350245-var-run-ovn\") pod \"ovn-controller-6w2cz\" (UID: \"4ab0fa0c-8873-41ab-b534-7c4c71350245\") " pod="openstack/ovn-controller-6w2cz" Oct 14 13:34:30.091773 master-2 kubenswrapper[4762]: I1014 13:34:30.090268 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a7994a21-3973-432f-aedd-48a87c96530e-var-run\") pod \"ovn-controller-ovs-rxkr2\" (UID: \"a7994a21-3973-432f-aedd-48a87c96530e\") " pod="openstack/ovn-controller-ovs-rxkr2" Oct 14 13:34:30.091773 master-2 kubenswrapper[4762]: I1014 13:34:30.090337 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/a7994a21-3973-432f-aedd-48a87c96530e-var-lib\") pod \"ovn-controller-ovs-rxkr2\" (UID: \"a7994a21-3973-432f-aedd-48a87c96530e\") " pod="openstack/ovn-controller-ovs-rxkr2" Oct 14 13:34:30.092705 master-2 kubenswrapper[4762]: I1014 13:34:30.092674 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a7994a21-3973-432f-aedd-48a87c96530e-scripts\") pod \"ovn-controller-ovs-rxkr2\" (UID: \"a7994a21-3973-432f-aedd-48a87c96530e\") " pod="openstack/ovn-controller-ovs-rxkr2" Oct 14 13:34:30.096840 master-2 kubenswrapper[4762]: I1014 13:34:30.095290 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4ab0fa0c-8873-41ab-b534-7c4c71350245-scripts\") pod \"ovn-controller-6w2cz\" (UID: \"4ab0fa0c-8873-41ab-b534-7c4c71350245\") " pod="openstack/ovn-controller-6w2cz" Oct 14 13:34:30.096840 master-2 kubenswrapper[4762]: I1014 13:34:30.095321 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ab0fa0c-8873-41ab-b534-7c4c71350245-ovn-controller-tls-certs\") pod \"ovn-controller-6w2cz\" (UID: \"4ab0fa0c-8873-41ab-b534-7c4c71350245\") " pod="openstack/ovn-controller-6w2cz" Oct 14 
13:34:30.096840 master-2 kubenswrapper[4762]: I1014 13:34:30.096044 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ab0fa0c-8873-41ab-b534-7c4c71350245-combined-ca-bundle\") pod \"ovn-controller-6w2cz\" (UID: \"4ab0fa0c-8873-41ab-b534-7c4c71350245\") " pod="openstack/ovn-controller-6w2cz" Oct 14 13:34:30.193810 master-2 kubenswrapper[4762]: I1014 13:34:30.193755 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xlj2\" (UniqueName: \"kubernetes.io/projected/4ab0fa0c-8873-41ab-b534-7c4c71350245-kube-api-access-5xlj2\") pod \"ovn-controller-6w2cz\" (UID: \"4ab0fa0c-8873-41ab-b534-7c4c71350245\") " pod="openstack/ovn-controller-6w2cz" Oct 14 13:34:30.198224 master-2 kubenswrapper[4762]: I1014 13:34:30.198187 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2s2pv\" (UniqueName: \"kubernetes.io/projected/a7994a21-3973-432f-aedd-48a87c96530e-kube-api-access-2s2pv\") pod \"ovn-controller-ovs-rxkr2\" (UID: \"a7994a21-3973-432f-aedd-48a87c96530e\") " pod="openstack/ovn-controller-ovs-rxkr2" Oct 14 13:34:30.395378 master-2 kubenswrapper[4762]: I1014 13:34:30.394079 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66d4475947-8hmrn" event={"ID":"deb1782f-23bf-4ded-a9c7-d6a3993ec24f","Type":"ContainerStarted","Data":"51174e895781f8bc8dadb64db2e2c511ded172b768a6bfc879192d0cc3e94725"} Oct 14 13:34:30.395378 master-2 kubenswrapper[4762]: I1014 13:34:30.394890 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-66d4475947-8hmrn" Oct 14 13:34:30.412771 master-2 kubenswrapper[4762]: I1014 13:34:30.412638 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6w2cz" Oct 14 13:34:30.482254 master-2 kubenswrapper[4762]: I1014 13:34:30.477257 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-rxkr2" Oct 14 13:34:30.646264 master-2 kubenswrapper[4762]: I1014 13:34:30.646112 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-66d4475947-8hmrn" podStartSLOduration=4.612286458 podStartE2EDuration="21.646092635s" podCreationTimestamp="2025-10-14 13:34:09 +0000 UTC" firstStartedPulling="2025-10-14 13:34:09.908771655 +0000 UTC m=+1679.152930814" lastFinishedPulling="2025-10-14 13:34:26.942577832 +0000 UTC m=+1696.186736991" observedRunningTime="2025-10-14 13:34:30.619194869 +0000 UTC m=+1699.863354058" watchObservedRunningTime="2025-10-14 13:34:30.646092635 +0000 UTC m=+1699.890251794" Oct 14 13:34:30.751475 master-2 kubenswrapper[4762]: I1014 13:34:30.751403 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jc2hh" podUID="4e95b136-ddd5-429b-a1b1-585d64385ae0" containerName="registry-server" probeResult="failure" output=< Oct 14 13:34:30.751475 master-2 kubenswrapper[4762]: timeout: failed to connect service ":50051" within 1s Oct 14 13:34:30.751475 master-2 kubenswrapper[4762]: > Oct 14 13:34:31.029108 master-2 kubenswrapper[4762]: I1014 13:34:31.029059 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6w2cz"] Oct 14 13:34:32.409241 master-2 kubenswrapper[4762]: I1014 13:34:32.409076 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6w2cz" event={"ID":"4ab0fa0c-8873-41ab-b534-7c4c71350245","Type":"ContainerStarted","Data":"24719bc27b27f7eaa3dfbbe19643e04fba83336ae23563db2d67f2c66fa5dee7"} Oct 14 13:34:32.449902 master-2 kubenswrapper[4762]: I1014 13:34:32.449845 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66d4475947-8hmrn"] Oct 14 13:34:32.450289 master-2 kubenswrapper[4762]: I1014 13:34:32.450255 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-66d4475947-8hmrn" podUID="deb1782f-23bf-4ded-a9c7-d6a3993ec24f" containerName="dnsmasq-dns" containerID="cri-o://51174e895781f8bc8dadb64db2e2c511ded172b768a6bfc879192d0cc3e94725" gracePeriod=10 Oct 14 13:34:32.571230 master-2 kubenswrapper[4762]: I1014 13:34:32.571140 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-8hkq7"] Oct 14 13:34:32.572961 master-2 kubenswrapper[4762]: I1014 13:34:32.572932 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-8hkq7" Oct 14 13:34:32.575382 master-2 kubenswrapper[4762]: I1014 13:34:32.575347 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Oct 14 13:34:32.576630 master-2 kubenswrapper[4762]: I1014 13:34:32.575628 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 14 13:34:32.633286 master-2 kubenswrapper[4762]: I1014 13:34:32.632844 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-8hkq7"] Oct 14 13:34:32.733212 master-2 kubenswrapper[4762]: I1014 13:34:32.731443 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p488h\" (UniqueName: \"kubernetes.io/projected/e2c09d4c-6a39-497d-9c99-fb7fdcf450fe-kube-api-access-p488h\") pod \"ovn-controller-metrics-8hkq7\" (UID: \"e2c09d4c-6a39-497d-9c99-fb7fdcf450fe\") " pod="openstack/ovn-controller-metrics-8hkq7" Oct 14 13:34:32.733212 master-2 kubenswrapper[4762]: I1014 13:34:32.731553 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/e2c09d4c-6a39-497d-9c99-fb7fdcf450fe-ovs-rundir\") pod \"ovn-controller-metrics-8hkq7\" (UID: \"e2c09d4c-6a39-497d-9c99-fb7fdcf450fe\") " pod="openstack/ovn-controller-metrics-8hkq7" Oct 14 13:34:32.733212 master-2 kubenswrapper[4762]: I1014 13:34:32.731645 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2c09d4c-6a39-497d-9c99-fb7fdcf450fe-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-8hkq7\" (UID: \"e2c09d4c-6a39-497d-9c99-fb7fdcf450fe\") " pod="openstack/ovn-controller-metrics-8hkq7" Oct 14 13:34:32.733212 master-2 kubenswrapper[4762]: I1014 13:34:32.731737 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2c09d4c-6a39-497d-9c99-fb7fdcf450fe-combined-ca-bundle\") pod \"ovn-controller-metrics-8hkq7\" (UID: \"e2c09d4c-6a39-497d-9c99-fb7fdcf450fe\") " pod="openstack/ovn-controller-metrics-8hkq7" Oct 14 13:34:32.733212 master-2 kubenswrapper[4762]: I1014 13:34:32.732858 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2c09d4c-6a39-497d-9c99-fb7fdcf450fe-config\") pod \"ovn-controller-metrics-8hkq7\" (UID: \"e2c09d4c-6a39-497d-9c99-fb7fdcf450fe\") " pod="openstack/ovn-controller-metrics-8hkq7" Oct 14 13:34:32.733212 master-2 kubenswrapper[4762]: I1014 13:34:32.732988 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/e2c09d4c-6a39-497d-9c99-fb7fdcf450fe-ovn-rundir\") pod \"ovn-controller-metrics-8hkq7\" (UID: \"e2c09d4c-6a39-497d-9c99-fb7fdcf450fe\") " pod="openstack/ovn-controller-metrics-8hkq7" Oct 14 13:34:32.835706 master-2 kubenswrapper[4762]: I1014 13:34:32.835643 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2c09d4c-6a39-497d-9c99-fb7fdcf450fe-config\") pod \"ovn-controller-metrics-8hkq7\" (UID: \"e2c09d4c-6a39-497d-9c99-fb7fdcf450fe\") " pod="openstack/ovn-controller-metrics-8hkq7" Oct 14 13:34:32.836117 master-2 
kubenswrapper[4762]: I1014 13:34:32.835745 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/e2c09d4c-6a39-497d-9c99-fb7fdcf450fe-ovn-rundir\") pod \"ovn-controller-metrics-8hkq7\" (UID: \"e2c09d4c-6a39-497d-9c99-fb7fdcf450fe\") " pod="openstack/ovn-controller-metrics-8hkq7" Oct 14 13:34:32.836117 master-2 kubenswrapper[4762]: I1014 13:34:32.835790 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p488h\" (UniqueName: \"kubernetes.io/projected/e2c09d4c-6a39-497d-9c99-fb7fdcf450fe-kube-api-access-p488h\") pod \"ovn-controller-metrics-8hkq7\" (UID: \"e2c09d4c-6a39-497d-9c99-fb7fdcf450fe\") " pod="openstack/ovn-controller-metrics-8hkq7" Oct 14 13:34:32.836117 master-2 kubenswrapper[4762]: I1014 13:34:32.835828 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/e2c09d4c-6a39-497d-9c99-fb7fdcf450fe-ovs-rundir\") pod \"ovn-controller-metrics-8hkq7\" (UID: \"e2c09d4c-6a39-497d-9c99-fb7fdcf450fe\") " pod="openstack/ovn-controller-metrics-8hkq7" Oct 14 13:34:32.836117 master-2 kubenswrapper[4762]: I1014 13:34:32.835877 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2c09d4c-6a39-497d-9c99-fb7fdcf450fe-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-8hkq7\" (UID: \"e2c09d4c-6a39-497d-9c99-fb7fdcf450fe\") " pod="openstack/ovn-controller-metrics-8hkq7" Oct 14 13:34:32.836117 master-2 kubenswrapper[4762]: I1014 13:34:32.835916 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2c09d4c-6a39-497d-9c99-fb7fdcf450fe-combined-ca-bundle\") pod \"ovn-controller-metrics-8hkq7\" (UID: \"e2c09d4c-6a39-497d-9c99-fb7fdcf450fe\") " pod="openstack/ovn-controller-metrics-8hkq7" Oct 14 13:34:32.836117 master-2 kubenswrapper[4762]: I1014 13:34:32.835996 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/e2c09d4c-6a39-497d-9c99-fb7fdcf450fe-ovs-rundir\") pod \"ovn-controller-metrics-8hkq7\" (UID: \"e2c09d4c-6a39-497d-9c99-fb7fdcf450fe\") " pod="openstack/ovn-controller-metrics-8hkq7" Oct 14 13:34:32.836117 master-2 kubenswrapper[4762]: I1014 13:34:32.835976 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/e2c09d4c-6a39-497d-9c99-fb7fdcf450fe-ovn-rundir\") pod \"ovn-controller-metrics-8hkq7\" (UID: \"e2c09d4c-6a39-497d-9c99-fb7fdcf450fe\") " pod="openstack/ovn-controller-metrics-8hkq7" Oct 14 13:34:32.836604 master-2 kubenswrapper[4762]: I1014 13:34:32.836565 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2c09d4c-6a39-497d-9c99-fb7fdcf450fe-config\") pod \"ovn-controller-metrics-8hkq7\" (UID: \"e2c09d4c-6a39-497d-9c99-fb7fdcf450fe\") " pod="openstack/ovn-controller-metrics-8hkq7" Oct 14 13:34:32.839082 master-2 kubenswrapper[4762]: I1014 13:34:32.839043 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2c09d4c-6a39-497d-9c99-fb7fdcf450fe-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-8hkq7\" (UID: \"e2c09d4c-6a39-497d-9c99-fb7fdcf450fe\") " pod="openstack/ovn-controller-metrics-8hkq7" Oct 14 
13:34:32.839310 master-2 kubenswrapper[4762]: I1014 13:34:32.839270 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2c09d4c-6a39-497d-9c99-fb7fdcf450fe-combined-ca-bundle\") pod \"ovn-controller-metrics-8hkq7\" (UID: \"e2c09d4c-6a39-497d-9c99-fb7fdcf450fe\") " pod="openstack/ovn-controller-metrics-8hkq7" Oct 14 13:34:32.856688 master-2 kubenswrapper[4762]: I1014 13:34:32.856622 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p488h\" (UniqueName: \"kubernetes.io/projected/e2c09d4c-6a39-497d-9c99-fb7fdcf450fe-kube-api-access-p488h\") pod \"ovn-controller-metrics-8hkq7\" (UID: \"e2c09d4c-6a39-497d-9c99-fb7fdcf450fe\") " pod="openstack/ovn-controller-metrics-8hkq7" Oct 14 13:34:32.890538 master-2 kubenswrapper[4762]: I1014 13:34:32.890480 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-8hkq7" Oct 14 13:34:33.102904 master-2 kubenswrapper[4762]: I1014 13:34:33.097261 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-1"] Oct 14 13:34:33.102904 master-2 kubenswrapper[4762]: I1014 13:34:33.098764 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-1" Oct 14 13:34:33.102904 master-2 kubenswrapper[4762]: I1014 13:34:33.101668 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 14 13:34:33.102904 master-2 kubenswrapper[4762]: I1014 13:34:33.101740 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 14 13:34:33.102904 master-2 kubenswrapper[4762]: I1014 13:34:33.101748 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 14 13:34:33.104743 master-2 kubenswrapper[4762]: I1014 13:34:33.104438 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 14 13:34:33.104743 master-2 kubenswrapper[4762]: I1014 13:34:33.104503 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 14 13:34:33.105026 master-2 kubenswrapper[4762]: I1014 13:34:33.104978 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 14 13:34:33.112546 master-2 kubenswrapper[4762]: I1014 13:34:33.112474 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-1"] Oct 14 13:34:33.242276 master-2 kubenswrapper[4762]: I1014 13:34:33.242136 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/94b8bfba-acf4-46d4-ad15-183dafcb7bd0-server-conf\") pod \"rabbitmq-cell1-server-1\" (UID: \"94b8bfba-acf4-46d4-ad15-183dafcb7bd0\") " pod="openstack/rabbitmq-cell1-server-1" Oct 14 13:34:33.242533 master-2 kubenswrapper[4762]: I1014 13:34:33.242330 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/94b8bfba-acf4-46d4-ad15-183dafcb7bd0-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-1\" (UID: \"94b8bfba-acf4-46d4-ad15-183dafcb7bd0\") " pod="openstack/rabbitmq-cell1-server-1" Oct 14 13:34:33.242533 master-2 kubenswrapper[4762]: I1014 13:34:33.242377 4762 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-86e49310-1c02-4771-8ec7-301d582c781e\" (UniqueName: \"kubernetes.io/csi/topolvm.io^e81362a3-4692-4eda-996e-808a6c5052c6\") pod \"rabbitmq-cell1-server-1\" (UID: \"94b8bfba-acf4-46d4-ad15-183dafcb7bd0\") " pod="openstack/rabbitmq-cell1-server-1" Oct 14 13:34:33.242533 master-2 kubenswrapper[4762]: I1014 13:34:33.242409 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/94b8bfba-acf4-46d4-ad15-183dafcb7bd0-rabbitmq-confd\") pod \"rabbitmq-cell1-server-1\" (UID: \"94b8bfba-acf4-46d4-ad15-183dafcb7bd0\") " pod="openstack/rabbitmq-cell1-server-1" Oct 14 13:34:33.242533 master-2 kubenswrapper[4762]: I1014 13:34:33.242477 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/94b8bfba-acf4-46d4-ad15-183dafcb7bd0-plugins-conf\") pod \"rabbitmq-cell1-server-1\" (UID: \"94b8bfba-acf4-46d4-ad15-183dafcb7bd0\") " pod="openstack/rabbitmq-cell1-server-1" Oct 14 13:34:33.242756 master-2 kubenswrapper[4762]: I1014 13:34:33.242682 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/94b8bfba-acf4-46d4-ad15-183dafcb7bd0-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-1\" (UID: \"94b8bfba-acf4-46d4-ad15-183dafcb7bd0\") " pod="openstack/rabbitmq-cell1-server-1" Oct 14 13:34:33.242756 master-2 kubenswrapper[4762]: I1014 13:34:33.242745 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/94b8bfba-acf4-46d4-ad15-183dafcb7bd0-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-1\" (UID: \"94b8bfba-acf4-46d4-ad15-183dafcb7bd0\") " pod="openstack/rabbitmq-cell1-server-1" Oct 14 13:34:33.242870 master-2 kubenswrapper[4762]: I1014 13:34:33.242793 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/94b8bfba-acf4-46d4-ad15-183dafcb7bd0-pod-info\") pod \"rabbitmq-cell1-server-1\" (UID: \"94b8bfba-acf4-46d4-ad15-183dafcb7bd0\") " pod="openstack/rabbitmq-cell1-server-1" Oct 14 13:34:33.242870 master-2 kubenswrapper[4762]: I1014 13:34:33.242855 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/94b8bfba-acf4-46d4-ad15-183dafcb7bd0-rabbitmq-tls\") pod \"rabbitmq-cell1-server-1\" (UID: \"94b8bfba-acf4-46d4-ad15-183dafcb7bd0\") " pod="openstack/rabbitmq-cell1-server-1" Oct 14 13:34:33.243043 master-2 kubenswrapper[4762]: I1014 13:34:33.243012 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/94b8bfba-acf4-46d4-ad15-183dafcb7bd0-config-data\") pod \"rabbitmq-cell1-server-1\" (UID: \"94b8bfba-acf4-46d4-ad15-183dafcb7bd0\") " pod="openstack/rabbitmq-cell1-server-1" Oct 14 13:34:33.243202 master-2 kubenswrapper[4762]: I1014 13:34:33.243177 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28rjj\" (UniqueName: \"kubernetes.io/projected/94b8bfba-acf4-46d4-ad15-183dafcb7bd0-kube-api-access-28rjj\") pod \"rabbitmq-cell1-server-1\" (UID: 
\"94b8bfba-acf4-46d4-ad15-183dafcb7bd0\") " pod="openstack/rabbitmq-cell1-server-1" Oct 14 13:34:33.346456 master-2 kubenswrapper[4762]: I1014 13:34:33.344676 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/94b8bfba-acf4-46d4-ad15-183dafcb7bd0-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-1\" (UID: \"94b8bfba-acf4-46d4-ad15-183dafcb7bd0\") " pod="openstack/rabbitmq-cell1-server-1" Oct 14 13:34:33.346456 master-2 kubenswrapper[4762]: I1014 13:34:33.344735 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/94b8bfba-acf4-46d4-ad15-183dafcb7bd0-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-1\" (UID: \"94b8bfba-acf4-46d4-ad15-183dafcb7bd0\") " pod="openstack/rabbitmq-cell1-server-1" Oct 14 13:34:33.346456 master-2 kubenswrapper[4762]: I1014 13:34:33.344759 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/94b8bfba-acf4-46d4-ad15-183dafcb7bd0-pod-info\") pod \"rabbitmq-cell1-server-1\" (UID: \"94b8bfba-acf4-46d4-ad15-183dafcb7bd0\") " pod="openstack/rabbitmq-cell1-server-1" Oct 14 13:34:33.346456 master-2 kubenswrapper[4762]: I1014 13:34:33.344791 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/94b8bfba-acf4-46d4-ad15-183dafcb7bd0-rabbitmq-tls\") pod \"rabbitmq-cell1-server-1\" (UID: \"94b8bfba-acf4-46d4-ad15-183dafcb7bd0\") " pod="openstack/rabbitmq-cell1-server-1" Oct 14 13:34:33.346456 master-2 kubenswrapper[4762]: I1014 13:34:33.344838 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/94b8bfba-acf4-46d4-ad15-183dafcb7bd0-config-data\") pod \"rabbitmq-cell1-server-1\" (UID: \"94b8bfba-acf4-46d4-ad15-183dafcb7bd0\") " pod="openstack/rabbitmq-cell1-server-1" Oct 14 13:34:33.346456 master-2 kubenswrapper[4762]: I1014 13:34:33.344881 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28rjj\" (UniqueName: \"kubernetes.io/projected/94b8bfba-acf4-46d4-ad15-183dafcb7bd0-kube-api-access-28rjj\") pod \"rabbitmq-cell1-server-1\" (UID: \"94b8bfba-acf4-46d4-ad15-183dafcb7bd0\") " pod="openstack/rabbitmq-cell1-server-1" Oct 14 13:34:33.346456 master-2 kubenswrapper[4762]: I1014 13:34:33.344925 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/94b8bfba-acf4-46d4-ad15-183dafcb7bd0-server-conf\") pod \"rabbitmq-cell1-server-1\" (UID: \"94b8bfba-acf4-46d4-ad15-183dafcb7bd0\") " pod="openstack/rabbitmq-cell1-server-1" Oct 14 13:34:33.346456 master-2 kubenswrapper[4762]: I1014 13:34:33.344954 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/94b8bfba-acf4-46d4-ad15-183dafcb7bd0-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-1\" (UID: \"94b8bfba-acf4-46d4-ad15-183dafcb7bd0\") " pod="openstack/rabbitmq-cell1-server-1" Oct 14 13:34:33.346456 master-2 kubenswrapper[4762]: I1014 13:34:33.344980 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-86e49310-1c02-4771-8ec7-301d582c781e\" (UniqueName: \"kubernetes.io/csi/topolvm.io^e81362a3-4692-4eda-996e-808a6c5052c6\") pod \"rabbitmq-cell1-server-1\" (UID: 
\"94b8bfba-acf4-46d4-ad15-183dafcb7bd0\") " pod="openstack/rabbitmq-cell1-server-1" Oct 14 13:34:33.346456 master-2 kubenswrapper[4762]: I1014 13:34:33.345003 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/94b8bfba-acf4-46d4-ad15-183dafcb7bd0-rabbitmq-confd\") pod \"rabbitmq-cell1-server-1\" (UID: \"94b8bfba-acf4-46d4-ad15-183dafcb7bd0\") " pod="openstack/rabbitmq-cell1-server-1" Oct 14 13:34:33.346456 master-2 kubenswrapper[4762]: I1014 13:34:33.345036 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/94b8bfba-acf4-46d4-ad15-183dafcb7bd0-plugins-conf\") pod \"rabbitmq-cell1-server-1\" (UID: \"94b8bfba-acf4-46d4-ad15-183dafcb7bd0\") " pod="openstack/rabbitmq-cell1-server-1" Oct 14 13:34:33.348670 master-2 kubenswrapper[4762]: I1014 13:34:33.346104 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/94b8bfba-acf4-46d4-ad15-183dafcb7bd0-plugins-conf\") pod \"rabbitmq-cell1-server-1\" (UID: \"94b8bfba-acf4-46d4-ad15-183dafcb7bd0\") " pod="openstack/rabbitmq-cell1-server-1" Oct 14 13:34:33.348670 master-2 kubenswrapper[4762]: I1014 13:34:33.348520 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/94b8bfba-acf4-46d4-ad15-183dafcb7bd0-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-1\" (UID: \"94b8bfba-acf4-46d4-ad15-183dafcb7bd0\") " pod="openstack/rabbitmq-cell1-server-1" Oct 14 13:34:33.349318 master-2 kubenswrapper[4762]: I1014 13:34:33.348859 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/94b8bfba-acf4-46d4-ad15-183dafcb7bd0-config-data\") pod \"rabbitmq-cell1-server-1\" (UID: \"94b8bfba-acf4-46d4-ad15-183dafcb7bd0\") " pod="openstack/rabbitmq-cell1-server-1" Oct 14 13:34:33.349781 master-2 kubenswrapper[4762]: I1014 13:34:33.349655 4762 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 14 13:34:33.349781 master-2 kubenswrapper[4762]: I1014 13:34:33.349700 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-86e49310-1c02-4771-8ec7-301d582c781e\" (UniqueName: \"kubernetes.io/csi/topolvm.io^e81362a3-4692-4eda-996e-808a6c5052c6\") pod \"rabbitmq-cell1-server-1\" (UID: \"94b8bfba-acf4-46d4-ad15-183dafcb7bd0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/8d031d5cfd60e75c683041784ca973b362557fc57ab0adc155ed169494ef182e/globalmount\"" pod="openstack/rabbitmq-cell1-server-1" Oct 14 13:34:33.349781 master-2 kubenswrapper[4762]: I1014 13:34:33.349733 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/94b8bfba-acf4-46d4-ad15-183dafcb7bd0-server-conf\") pod \"rabbitmq-cell1-server-1\" (UID: \"94b8bfba-acf4-46d4-ad15-183dafcb7bd0\") " pod="openstack/rabbitmq-cell1-server-1" Oct 14 13:34:33.350083 master-2 kubenswrapper[4762]: I1014 13:34:33.350042 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/94b8bfba-acf4-46d4-ad15-183dafcb7bd0-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-1\" (UID: \"94b8bfba-acf4-46d4-ad15-183dafcb7bd0\") " pod="openstack/rabbitmq-cell1-server-1" Oct 14 13:34:33.350558 master-2 kubenswrapper[4762]: I1014 13:34:33.350515 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/94b8bfba-acf4-46d4-ad15-183dafcb7bd0-rabbitmq-tls\") pod \"rabbitmq-cell1-server-1\" (UID: \"94b8bfba-acf4-46d4-ad15-183dafcb7bd0\") " pod="openstack/rabbitmq-cell1-server-1" Oct 14 13:34:33.353300 master-2 kubenswrapper[4762]: I1014 13:34:33.353238 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/94b8bfba-acf4-46d4-ad15-183dafcb7bd0-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-1\" (UID: \"94b8bfba-acf4-46d4-ad15-183dafcb7bd0\") " pod="openstack/rabbitmq-cell1-server-1" Oct 14 13:34:33.353450 master-2 kubenswrapper[4762]: I1014 13:34:33.353262 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/94b8bfba-acf4-46d4-ad15-183dafcb7bd0-pod-info\") pod \"rabbitmq-cell1-server-1\" (UID: \"94b8bfba-acf4-46d4-ad15-183dafcb7bd0\") " pod="openstack/rabbitmq-cell1-server-1" Oct 14 13:34:33.356953 master-2 kubenswrapper[4762]: I1014 13:34:33.356906 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/94b8bfba-acf4-46d4-ad15-183dafcb7bd0-rabbitmq-confd\") pod \"rabbitmq-cell1-server-1\" (UID: \"94b8bfba-acf4-46d4-ad15-183dafcb7bd0\") " pod="openstack/rabbitmq-cell1-server-1" Oct 14 13:34:33.366313 master-2 kubenswrapper[4762]: I1014 13:34:33.366255 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28rjj\" (UniqueName: \"kubernetes.io/projected/94b8bfba-acf4-46d4-ad15-183dafcb7bd0-kube-api-access-28rjj\") pod \"rabbitmq-cell1-server-1\" (UID: \"94b8bfba-acf4-46d4-ad15-183dafcb7bd0\") " pod="openstack/rabbitmq-cell1-server-1" Oct 14 13:34:33.417442 master-2 kubenswrapper[4762]: I1014 13:34:33.417381 4762 generic.go:334] "Generic (PLEG): container finished" podID="deb1782f-23bf-4ded-a9c7-d6a3993ec24f" containerID="51174e895781f8bc8dadb64db2e2c511ded172b768a6bfc879192d0cc3e94725" exitCode=0 Oct 14 13:34:33.417442 
master-2 kubenswrapper[4762]: I1014 13:34:33.417441 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66d4475947-8hmrn" event={"ID":"deb1782f-23bf-4ded-a9c7-d6a3993ec24f","Type":"ContainerDied","Data":"51174e895781f8bc8dadb64db2e2c511ded172b768a6bfc879192d0cc3e94725"} Oct 14 13:34:34.467668 master-2 kubenswrapper[4762]: I1014 13:34:34.467546 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-66d4475947-8hmrn" podUID="deb1782f-23bf-4ded-a9c7-d6a3993ec24f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.129.0.107:5353: connect: connection refused" Oct 14 13:34:34.826909 master-2 kubenswrapper[4762]: I1014 13:34:34.826811 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-86e49310-1c02-4771-8ec7-301d582c781e\" (UniqueName: \"kubernetes.io/csi/topolvm.io^e81362a3-4692-4eda-996e-808a6c5052c6\") pod \"rabbitmq-cell1-server-1\" (UID: \"94b8bfba-acf4-46d4-ad15-183dafcb7bd0\") " pod="openstack/rabbitmq-cell1-server-1" Oct 14 13:34:34.921672 master-2 kubenswrapper[4762]: I1014 13:34:34.921413 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-1" Oct 14 13:34:35.085316 master-2 kubenswrapper[4762]: I1014 13:34:35.085270 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66d4475947-8hmrn" Oct 14 13:34:35.197030 master-2 kubenswrapper[4762]: I1014 13:34:35.196973 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/deb1782f-23bf-4ded-a9c7-d6a3993ec24f-dns-svc\") pod \"deb1782f-23bf-4ded-a9c7-d6a3993ec24f\" (UID: \"deb1782f-23bf-4ded-a9c7-d6a3993ec24f\") " Oct 14 13:34:35.197891 master-2 kubenswrapper[4762]: I1014 13:34:35.197067 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/deb1782f-23bf-4ded-a9c7-d6a3993ec24f-config\") pod \"deb1782f-23bf-4ded-a9c7-d6a3993ec24f\" (UID: \"deb1782f-23bf-4ded-a9c7-d6a3993ec24f\") " Oct 14 13:34:35.197891 master-2 kubenswrapper[4762]: I1014 13:34:35.197137 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6qq5\" (UniqueName: \"kubernetes.io/projected/deb1782f-23bf-4ded-a9c7-d6a3993ec24f-kube-api-access-c6qq5\") pod \"deb1782f-23bf-4ded-a9c7-d6a3993ec24f\" (UID: \"deb1782f-23bf-4ded-a9c7-d6a3993ec24f\") " Oct 14 13:34:35.201015 master-2 kubenswrapper[4762]: I1014 13:34:35.200393 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/deb1782f-23bf-4ded-a9c7-d6a3993ec24f-kube-api-access-c6qq5" (OuterVolumeSpecName: "kube-api-access-c6qq5") pod "deb1782f-23bf-4ded-a9c7-d6a3993ec24f" (UID: "deb1782f-23bf-4ded-a9c7-d6a3993ec24f"). InnerVolumeSpecName "kube-api-access-c6qq5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:34:35.226042 master-2 kubenswrapper[4762]: I1014 13:34:35.225982 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/deb1782f-23bf-4ded-a9c7-d6a3993ec24f-config" (OuterVolumeSpecName: "config") pod "deb1782f-23bf-4ded-a9c7-d6a3993ec24f" (UID: "deb1782f-23bf-4ded-a9c7-d6a3993ec24f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:34:35.255042 master-2 kubenswrapper[4762]: I1014 13:34:35.251819 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/deb1782f-23bf-4ded-a9c7-d6a3993ec24f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "deb1782f-23bf-4ded-a9c7-d6a3993ec24f" (UID: "deb1782f-23bf-4ded-a9c7-d6a3993ec24f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:34:35.301237 master-2 kubenswrapper[4762]: I1014 13:34:35.299395 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6qq5\" (UniqueName: \"kubernetes.io/projected/deb1782f-23bf-4ded-a9c7-d6a3993ec24f-kube-api-access-c6qq5\") on node \"master-2\" DevicePath \"\"" Oct 14 13:34:35.301237 master-2 kubenswrapper[4762]: I1014 13:34:35.299441 4762 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/deb1782f-23bf-4ded-a9c7-d6a3993ec24f-dns-svc\") on node \"master-2\" DevicePath \"\"" Oct 14 13:34:35.301237 master-2 kubenswrapper[4762]: I1014 13:34:35.299454 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/deb1782f-23bf-4ded-a9c7-d6a3993ec24f-config\") on node \"master-2\" DevicePath \"\"" Oct 14 13:34:35.380533 master-2 kubenswrapper[4762]: I1014 13:34:35.380493 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-rxkr2"] Oct 14 13:34:35.385115 master-2 kubenswrapper[4762]: W1014 13:34:35.385080 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7994a21_3973_432f_aedd_48a87c96530e.slice/crio-e1e02a59ba3ff9a5389d36264a6fef8665383a4658d6cdd9827da7076bdb0512 WatchSource:0}: Error finding container e1e02a59ba3ff9a5389d36264a6fef8665383a4658d6cdd9827da7076bdb0512: Status 404 returned error can't find the container with id e1e02a59ba3ff9a5389d36264a6fef8665383a4658d6cdd9827da7076bdb0512 Oct 14 13:34:35.439270 master-2 kubenswrapper[4762]: I1014 13:34:35.439207 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66d4475947-8hmrn" event={"ID":"deb1782f-23bf-4ded-a9c7-d6a3993ec24f","Type":"ContainerDied","Data":"38d9c237c3abc31b44e0a119e713c5d927c4d7048916148c132bc0b3337f9d2c"} Oct 14 13:34:35.439270 master-2 kubenswrapper[4762]: I1014 13:34:35.439280 4762 scope.go:117] "RemoveContainer" containerID="51174e895781f8bc8dadb64db2e2c511ded172b768a6bfc879192d0cc3e94725" Oct 14 13:34:35.439545 master-2 kubenswrapper[4762]: I1014 13:34:35.439410 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-66d4475947-8hmrn" Oct 14 13:34:35.454681 master-2 kubenswrapper[4762]: I1014 13:34:35.454617 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-rxkr2" event={"ID":"a7994a21-3973-432f-aedd-48a87c96530e","Type":"ContainerStarted","Data":"e1e02a59ba3ff9a5389d36264a6fef8665383a4658d6cdd9827da7076bdb0512"} Oct 14 13:34:35.470052 master-2 kubenswrapper[4762]: I1014 13:34:35.469983 4762 scope.go:117] "RemoveContainer" containerID="569ac4a1fbc0cbb11c98554206d23ee3a7f6d6efc1925bd457e101f8e2b592ba" Oct 14 13:34:35.496332 master-2 kubenswrapper[4762]: I1014 13:34:35.495014 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66d4475947-8hmrn"] Oct 14 13:34:35.496332 master-2 kubenswrapper[4762]: I1014 13:34:35.495774 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-66d4475947-8hmrn"] Oct 14 13:34:35.557633 master-2 kubenswrapper[4762]: I1014 13:34:35.557525 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="deb1782f-23bf-4ded-a9c7-d6a3993ec24f" path="/var/lib/kubelet/pods/deb1782f-23bf-4ded-a9c7-d6a3993ec24f/volumes" Oct 14 13:34:35.588573 master-2 kubenswrapper[4762]: I1014 13:34:35.588523 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-8hkq7"] Oct 14 13:34:35.775385 master-2 kubenswrapper[4762]: I1014 13:34:35.775328 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-1"] Oct 14 13:34:35.876875 master-2 kubenswrapper[4762]: W1014 13:34:35.876745 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94b8bfba_acf4_46d4_ad15_183dafcb7bd0.slice/crio-0f274c701bef10657475c0d06c94359f5a06cb3c0f7f78856a187ae5fa2dafa5 WatchSource:0}: Error finding container 0f274c701bef10657475c0d06c94359f5a06cb3c0f7f78856a187ae5fa2dafa5: Status 404 returned error can't find the container with id 0f274c701bef10657475c0d06c94359f5a06cb3c0f7f78856a187ae5fa2dafa5 Oct 14 13:34:36.463733 master-2 kubenswrapper[4762]: I1014 13:34:36.463683 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-1" event={"ID":"94b8bfba-acf4-46d4-ad15-183dafcb7bd0","Type":"ContainerStarted","Data":"0f274c701bef10657475c0d06c94359f5a06cb3c0f7f78856a187ae5fa2dafa5"} Oct 14 13:34:36.465778 master-2 kubenswrapper[4762]: I1014 13:34:36.465735 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"65011ee0-036e-4a85-9ca7-182d37e3345c","Type":"ContainerStarted","Data":"4a75a2a2b6b819dcff060b183202fa9cf17cb3a010b483415f435e581c9870a1"} Oct 14 13:34:36.469824 master-2 kubenswrapper[4762]: I1014 13:34:36.469791 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-8hkq7" event={"ID":"e2c09d4c-6a39-497d-9c99-fb7fdcf450fe","Type":"ContainerStarted","Data":"8ca2ee659a205d9abfaa5154b84191707e1fd0f108ba4450f7e3e1f156a86568"} Oct 14 13:34:37.478310 master-2 kubenswrapper[4762]: I1014 13:34:37.478122 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-1" event={"ID":"94b8bfba-acf4-46d4-ad15-183dafcb7bd0","Type":"ContainerStarted","Data":"69d787333de2bb70ad9086fc935dad769b039a6fd2cc173356499236881b2cbe"} Oct 14 13:34:37.483422 master-2 kubenswrapper[4762]: I1014 13:34:37.483356 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/alertmanager-metric-storage-1" event={"ID":"d4485526-8eb4-41e6-a1bc-68754346b4f1","Type":"ContainerStarted","Data":"7b8c8548b478d80039a280943988aa4196e77e730c107ebf544ac81ecab85440"} Oct 14 13:34:39.404082 master-2 kubenswrapper[4762]: I1014 13:34:39.404028 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-1"] Oct 14 13:34:39.404576 master-2 kubenswrapper[4762]: E1014 13:34:39.404342 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deb1782f-23bf-4ded-a9c7-d6a3993ec24f" containerName="init" Oct 14 13:34:39.404576 master-2 kubenswrapper[4762]: I1014 13:34:39.404359 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="deb1782f-23bf-4ded-a9c7-d6a3993ec24f" containerName="init" Oct 14 13:34:39.404576 master-2 kubenswrapper[4762]: E1014 13:34:39.404369 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deb1782f-23bf-4ded-a9c7-d6a3993ec24f" containerName="dnsmasq-dns" Oct 14 13:34:39.404576 master-2 kubenswrapper[4762]: I1014 13:34:39.404375 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="deb1782f-23bf-4ded-a9c7-d6a3993ec24f" containerName="dnsmasq-dns" Oct 14 13:34:39.404576 master-2 kubenswrapper[4762]: I1014 13:34:39.404513 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="deb1782f-23bf-4ded-a9c7-d6a3993ec24f" containerName="dnsmasq-dns" Oct 14 13:34:39.405592 master-2 kubenswrapper[4762]: I1014 13:34:39.405562 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-1" Oct 14 13:34:39.408836 master-2 kubenswrapper[4762]: I1014 13:34:39.408803 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 14 13:34:39.409694 master-2 kubenswrapper[4762]: I1014 13:34:39.409458 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 14 13:34:39.409694 master-2 kubenswrapper[4762]: I1014 13:34:39.409575 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 14 13:34:39.409966 master-2 kubenswrapper[4762]: I1014 13:34:39.409873 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 14 13:34:39.422807 master-2 kubenswrapper[4762]: I1014 13:34:39.422743 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-1"] Oct 14 13:34:39.599528 master-2 kubenswrapper[4762]: I1014 13:34:39.599400 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-989ed34a-8795-4c67-8ff7-e9e690400b3e\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d21c64e9-1a3a-44d8-9d18-5962fe42eb7e\") pod \"openstack-galera-1\" (UID: \"b91e22ed-012b-43db-bd24-7634ae6d22a9\") " pod="openstack/openstack-galera-1" Oct 14 13:34:39.599528 master-2 kubenswrapper[4762]: I1014 13:34:39.599503 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b91e22ed-012b-43db-bd24-7634ae6d22a9-config-data-generated\") pod \"openstack-galera-1\" (UID: \"b91e22ed-012b-43db-bd24-7634ae6d22a9\") " pod="openstack/openstack-galera-1" Oct 14 13:34:39.599754 master-2 kubenswrapper[4762]: I1014 13:34:39.599600 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b91e22ed-012b-43db-bd24-7634ae6d22a9-combined-ca-bundle\") pod \"openstack-galera-1\" (UID: \"b91e22ed-012b-43db-bd24-7634ae6d22a9\") " pod="openstack/openstack-galera-1" Oct 14 13:34:39.599754 master-2 kubenswrapper[4762]: I1014 13:34:39.599650 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b91e22ed-012b-43db-bd24-7634ae6d22a9-operator-scripts\") pod \"openstack-galera-1\" (UID: \"b91e22ed-012b-43db-bd24-7634ae6d22a9\") " pod="openstack/openstack-galera-1" Oct 14 13:34:39.599754 master-2 kubenswrapper[4762]: I1014 13:34:39.599677 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b91e22ed-012b-43db-bd24-7634ae6d22a9-kolla-config\") pod \"openstack-galera-1\" (UID: \"b91e22ed-012b-43db-bd24-7634ae6d22a9\") " pod="openstack/openstack-galera-1" Oct 14 13:34:39.599754 master-2 kubenswrapper[4762]: I1014 13:34:39.599711 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8qg7\" (UniqueName: \"kubernetes.io/projected/b91e22ed-012b-43db-bd24-7634ae6d22a9-kube-api-access-s8qg7\") pod \"openstack-galera-1\" (UID: \"b91e22ed-012b-43db-bd24-7634ae6d22a9\") " pod="openstack/openstack-galera-1" Oct 14 13:34:39.599754 master-2 kubenswrapper[4762]: I1014 13:34:39.599738 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b91e22ed-012b-43db-bd24-7634ae6d22a9-config-data-default\") pod \"openstack-galera-1\" (UID: \"b91e22ed-012b-43db-bd24-7634ae6d22a9\") " pod="openstack/openstack-galera-1" Oct 14 13:34:39.599934 master-2 kubenswrapper[4762]: I1014 13:34:39.599774 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b91e22ed-012b-43db-bd24-7634ae6d22a9-galera-tls-certs\") pod \"openstack-galera-1\" (UID: \"b91e22ed-012b-43db-bd24-7634ae6d22a9\") " pod="openstack/openstack-galera-1" Oct 14 13:34:39.601207 master-2 kubenswrapper[4762]: I1014 13:34:39.600190 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/b91e22ed-012b-43db-bd24-7634ae6d22a9-secrets\") pod \"openstack-galera-1\" (UID: \"b91e22ed-012b-43db-bd24-7634ae6d22a9\") " pod="openstack/openstack-galera-1" Oct 14 13:34:39.702605 master-2 kubenswrapper[4762]: I1014 13:34:39.702541 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b91e22ed-012b-43db-bd24-7634ae6d22a9-kolla-config\") pod \"openstack-galera-1\" (UID: \"b91e22ed-012b-43db-bd24-7634ae6d22a9\") " pod="openstack/openstack-galera-1" Oct 14 13:34:39.702605 master-2 kubenswrapper[4762]: I1014 13:34:39.702598 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8qg7\" (UniqueName: \"kubernetes.io/projected/b91e22ed-012b-43db-bd24-7634ae6d22a9-kube-api-access-s8qg7\") pod \"openstack-galera-1\" (UID: \"b91e22ed-012b-43db-bd24-7634ae6d22a9\") " pod="openstack/openstack-galera-1" Oct 14 13:34:39.702861 master-2 kubenswrapper[4762]: I1014 13:34:39.702640 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" 
(UniqueName: \"kubernetes.io/configmap/b91e22ed-012b-43db-bd24-7634ae6d22a9-config-data-default\") pod \"openstack-galera-1\" (UID: \"b91e22ed-012b-43db-bd24-7634ae6d22a9\") " pod="openstack/openstack-galera-1" Oct 14 13:34:39.702861 master-2 kubenswrapper[4762]: I1014 13:34:39.702673 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b91e22ed-012b-43db-bd24-7634ae6d22a9-galera-tls-certs\") pod \"openstack-galera-1\" (UID: \"b91e22ed-012b-43db-bd24-7634ae6d22a9\") " pod="openstack/openstack-galera-1" Oct 14 13:34:39.702861 master-2 kubenswrapper[4762]: I1014 13:34:39.702703 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/b91e22ed-012b-43db-bd24-7634ae6d22a9-secrets\") pod \"openstack-galera-1\" (UID: \"b91e22ed-012b-43db-bd24-7634ae6d22a9\") " pod="openstack/openstack-galera-1" Oct 14 13:34:39.702861 master-2 kubenswrapper[4762]: I1014 13:34:39.702747 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-989ed34a-8795-4c67-8ff7-e9e690400b3e\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d21c64e9-1a3a-44d8-9d18-5962fe42eb7e\") pod \"openstack-galera-1\" (UID: \"b91e22ed-012b-43db-bd24-7634ae6d22a9\") " pod="openstack/openstack-galera-1" Oct 14 13:34:39.702861 master-2 kubenswrapper[4762]: I1014 13:34:39.702767 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b91e22ed-012b-43db-bd24-7634ae6d22a9-config-data-generated\") pod \"openstack-galera-1\" (UID: \"b91e22ed-012b-43db-bd24-7634ae6d22a9\") " pod="openstack/openstack-galera-1" Oct 14 13:34:39.702861 master-2 kubenswrapper[4762]: I1014 13:34:39.702834 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b91e22ed-012b-43db-bd24-7634ae6d22a9-combined-ca-bundle\") pod \"openstack-galera-1\" (UID: \"b91e22ed-012b-43db-bd24-7634ae6d22a9\") " pod="openstack/openstack-galera-1" Oct 14 13:34:39.703180 master-2 kubenswrapper[4762]: I1014 13:34:39.702881 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b91e22ed-012b-43db-bd24-7634ae6d22a9-operator-scripts\") pod \"openstack-galera-1\" (UID: \"b91e22ed-012b-43db-bd24-7634ae6d22a9\") " pod="openstack/openstack-galera-1" Oct 14 13:34:39.703555 master-2 kubenswrapper[4762]: I1014 13:34:39.703488 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b91e22ed-012b-43db-bd24-7634ae6d22a9-config-data-generated\") pod \"openstack-galera-1\" (UID: \"b91e22ed-012b-43db-bd24-7634ae6d22a9\") " pod="openstack/openstack-galera-1" Oct 14 13:34:39.703874 master-2 kubenswrapper[4762]: I1014 13:34:39.703818 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b91e22ed-012b-43db-bd24-7634ae6d22a9-kolla-config\") pod \"openstack-galera-1\" (UID: \"b91e22ed-012b-43db-bd24-7634ae6d22a9\") " pod="openstack/openstack-galera-1" Oct 14 13:34:39.704465 master-2 kubenswrapper[4762]: I1014 13:34:39.704384 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b91e22ed-012b-43db-bd24-7634ae6d22a9-config-data-default\") pod 
\"openstack-galera-1\" (UID: \"b91e22ed-012b-43db-bd24-7634ae6d22a9\") " pod="openstack/openstack-galera-1" Oct 14 13:34:39.704869 master-2 kubenswrapper[4762]: I1014 13:34:39.704832 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b91e22ed-012b-43db-bd24-7634ae6d22a9-operator-scripts\") pod \"openstack-galera-1\" (UID: \"b91e22ed-012b-43db-bd24-7634ae6d22a9\") " pod="openstack/openstack-galera-1" Oct 14 13:34:39.706082 master-2 kubenswrapper[4762]: I1014 13:34:39.706039 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b91e22ed-012b-43db-bd24-7634ae6d22a9-galera-tls-certs\") pod \"openstack-galera-1\" (UID: \"b91e22ed-012b-43db-bd24-7634ae6d22a9\") " pod="openstack/openstack-galera-1" Oct 14 13:34:39.706431 master-2 kubenswrapper[4762]: I1014 13:34:39.706394 4762 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 14 13:34:39.706473 master-2 kubenswrapper[4762]: I1014 13:34:39.706444 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-989ed34a-8795-4c67-8ff7-e9e690400b3e\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d21c64e9-1a3a-44d8-9d18-5962fe42eb7e\") pod \"openstack-galera-1\" (UID: \"b91e22ed-012b-43db-bd24-7634ae6d22a9\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/922ff494bbab9f44201937d6b2a8aa1b5d5cb774d66de6fea78227bee1c6a33b/globalmount\"" pod="openstack/openstack-galera-1" Oct 14 13:34:39.707316 master-2 kubenswrapper[4762]: I1014 13:34:39.707275 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/b91e22ed-012b-43db-bd24-7634ae6d22a9-secrets\") pod \"openstack-galera-1\" (UID: \"b91e22ed-012b-43db-bd24-7634ae6d22a9\") " pod="openstack/openstack-galera-1" Oct 14 13:34:39.717595 master-2 kubenswrapper[4762]: I1014 13:34:39.717522 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b91e22ed-012b-43db-bd24-7634ae6d22a9-combined-ca-bundle\") pod \"openstack-galera-1\" (UID: \"b91e22ed-012b-43db-bd24-7634ae6d22a9\") " pod="openstack/openstack-galera-1" Oct 14 13:34:39.733605 master-2 kubenswrapper[4762]: I1014 13:34:39.733558 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8qg7\" (UniqueName: \"kubernetes.io/projected/b91e22ed-012b-43db-bd24-7634ae6d22a9-kube-api-access-s8qg7\") pod \"openstack-galera-1\" (UID: \"b91e22ed-012b-43db-bd24-7634ae6d22a9\") " pod="openstack/openstack-galera-1" Oct 14 13:34:39.744317 master-2 kubenswrapper[4762]: I1014 13:34:39.744244 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jc2hh" Oct 14 13:34:39.795251 master-2 kubenswrapper[4762]: I1014 13:34:39.795192 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jc2hh" Oct 14 13:34:39.999897 master-2 kubenswrapper[4762]: I1014 13:34:39.999826 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jc2hh"] Oct 14 13:34:40.508881 master-2 kubenswrapper[4762]: I1014 13:34:40.508807 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6w2cz" 
event={"ID":"4ab0fa0c-8873-41ab-b534-7c4c71350245","Type":"ContainerStarted","Data":"47599a821294d02bd8805e8c5cd7cd88204ab7c6521c6830e6a15f344d080c1f"} Oct 14 13:34:40.509453 master-2 kubenswrapper[4762]: I1014 13:34:40.508952 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-6w2cz" Oct 14 13:34:40.511612 master-2 kubenswrapper[4762]: I1014 13:34:40.511560 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-8hkq7" event={"ID":"e2c09d4c-6a39-497d-9c99-fb7fdcf450fe","Type":"ContainerStarted","Data":"7166cd09ccc686aa473d020427c85b73af0e74b8157bcaac3b76f5841e081952"} Oct 14 13:34:40.513494 master-2 kubenswrapper[4762]: I1014 13:34:40.513447 4762 generic.go:334] "Generic (PLEG): container finished" podID="a7994a21-3973-432f-aedd-48a87c96530e" containerID="b1bd11ba572952fd3b9e27926514aabb458db015067e8642ec2dd797866bb826" exitCode=0 Oct 14 13:34:40.513581 master-2 kubenswrapper[4762]: I1014 13:34:40.513550 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-rxkr2" event={"ID":"a7994a21-3973-432f-aedd-48a87c96530e","Type":"ContainerDied","Data":"b1bd11ba572952fd3b9e27926514aabb458db015067e8642ec2dd797866bb826"} Oct 14 13:34:40.587598 master-2 kubenswrapper[4762]: I1014 13:34:40.587507 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-8hkq7" podStartSLOduration=5.063857593 podStartE2EDuration="8.587482383s" podCreationTimestamp="2025-10-14 13:34:32 +0000 UTC" firstStartedPulling="2025-10-14 13:34:35.777450126 +0000 UTC m=+1705.021609285" lastFinishedPulling="2025-10-14 13:34:39.301074876 +0000 UTC m=+1708.545234075" observedRunningTime="2025-10-14 13:34:40.581049628 +0000 UTC m=+1709.825208827" watchObservedRunningTime="2025-10-14 13:34:40.587482383 +0000 UTC m=+1709.831641542" Oct 14 13:34:40.588449 master-2 kubenswrapper[4762]: I1014 13:34:40.588183 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-6w2cz" podStartSLOduration=4.334425078 podStartE2EDuration="11.588175464s" podCreationTimestamp="2025-10-14 13:34:29 +0000 UTC" firstStartedPulling="2025-10-14 13:34:32.036950151 +0000 UTC m=+1701.281109310" lastFinishedPulling="2025-10-14 13:34:39.290700537 +0000 UTC m=+1708.534859696" observedRunningTime="2025-10-14 13:34:40.551983314 +0000 UTC m=+1709.796142483" watchObservedRunningTime="2025-10-14 13:34:40.588175464 +0000 UTC m=+1709.832334623" Oct 14 13:34:41.011183 master-2 kubenswrapper[4762]: I1014 13:34:41.010862 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-989ed34a-8795-4c67-8ff7-e9e690400b3e\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d21c64e9-1a3a-44d8-9d18-5962fe42eb7e\") pod \"openstack-galera-1\" (UID: \"b91e22ed-012b-43db-bd24-7634ae6d22a9\") " pod="openstack/openstack-galera-1" Oct 14 13:34:41.083909 master-2 kubenswrapper[4762]: I1014 13:34:41.083864 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-1" Oct 14 13:34:41.497759 master-2 kubenswrapper[4762]: W1014 13:34:41.497670 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb91e22ed_012b_43db_bd24_7634ae6d22a9.slice/crio-3610c4ea0c3854812b394241f809bcd23f19b3cd7369de327292542fae348eba WatchSource:0}: Error finding container 3610c4ea0c3854812b394241f809bcd23f19b3cd7369de327292542fae348eba: Status 404 returned error can't find the container with id 3610c4ea0c3854812b394241f809bcd23f19b3cd7369de327292542fae348eba Oct 14 13:34:41.521072 master-2 kubenswrapper[4762]: I1014 13:34:41.521007 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-1"] Oct 14 13:34:41.523893 master-2 kubenswrapper[4762]: I1014 13:34:41.523840 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-rxkr2" event={"ID":"a7994a21-3973-432f-aedd-48a87c96530e","Type":"ContainerStarted","Data":"d2ee03092f99af43bf4a877a339ad1946fcda9c25cf33a7cfdf50f5af045b404"} Oct 14 13:34:41.524496 master-2 kubenswrapper[4762]: I1014 13:34:41.524462 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-rxkr2" event={"ID":"a7994a21-3973-432f-aedd-48a87c96530e","Type":"ContainerStarted","Data":"1398898522ef3c01dec0160618b03853a13d6ee8979a9919723a3c675d8bb668"} Oct 14 13:34:41.524560 master-2 kubenswrapper[4762]: I1014 13:34:41.524538 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-rxkr2" Oct 14 13:34:41.525291 master-2 kubenswrapper[4762]: I1014 13:34:41.525228 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-1" event={"ID":"b91e22ed-012b-43db-bd24-7634ae6d22a9","Type":"ContainerStarted","Data":"3610c4ea0c3854812b394241f809bcd23f19b3cd7369de327292542fae348eba"} Oct 14 13:34:41.525653 master-2 kubenswrapper[4762]: I1014 13:34:41.525575 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jc2hh" podUID="4e95b136-ddd5-429b-a1b1-585d64385ae0" containerName="registry-server" containerID="cri-o://44c3742b553cd9ea420f04010288cc7500aabad443c65b80592599b5f59c1b5a" gracePeriod=2 Oct 14 13:34:41.596890 master-2 kubenswrapper[4762]: I1014 13:34:41.596777 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-rxkr2" podStartSLOduration=8.736755535 podStartE2EDuration="12.596754989s" podCreationTimestamp="2025-10-14 13:34:29 +0000 UTC" firstStartedPulling="2025-10-14 13:34:35.387868771 +0000 UTC m=+1704.632027930" lastFinishedPulling="2025-10-14 13:34:39.247868225 +0000 UTC m=+1708.492027384" observedRunningTime="2025-10-14 13:34:41.593583328 +0000 UTC m=+1710.837742517" watchObservedRunningTime="2025-10-14 13:34:41.596754989 +0000 UTC m=+1710.840914148" Oct 14 13:34:41.662048 master-2 kubenswrapper[4762]: I1014 13:34:41.661993 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-2"] Oct 14 13:34:41.663107 master-2 kubenswrapper[4762]: I1014 13:34:41.663073 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-2" Oct 14 13:34:41.665594 master-2 kubenswrapper[4762]: I1014 13:34:41.665560 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 14 13:34:41.665750 master-2 kubenswrapper[4762]: I1014 13:34:41.665725 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 14 13:34:41.666043 master-2 kubenswrapper[4762]: I1014 13:34:41.666018 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 14 13:34:41.788497 master-2 kubenswrapper[4762]: I1014 13:34:41.788425 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-2"] Oct 14 13:34:41.861441 master-2 kubenswrapper[4762]: I1014 13:34:41.861379 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/d97acc34-a72a-49c2-83b5-379d3946c591-secrets\") pod \"openstack-cell1-galera-2\" (UID: \"d97acc34-a72a-49c2-83b5-379d3946c591\") " pod="openstack/openstack-cell1-galera-2" Oct 14 13:34:41.861441 master-2 kubenswrapper[4762]: I1014 13:34:41.861423 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d97acc34-a72a-49c2-83b5-379d3946c591-operator-scripts\") pod \"openstack-cell1-galera-2\" (UID: \"d97acc34-a72a-49c2-83b5-379d3946c591\") " pod="openstack/openstack-cell1-galera-2" Oct 14 13:34:41.861441 master-2 kubenswrapper[4762]: I1014 13:34:41.861444 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d97acc34-a72a-49c2-83b5-379d3946c591-galera-tls-certs\") pod \"openstack-cell1-galera-2\" (UID: \"d97acc34-a72a-49c2-83b5-379d3946c591\") " pod="openstack/openstack-cell1-galera-2" Oct 14 13:34:41.861719 master-2 kubenswrapper[4762]: I1014 13:34:41.861481 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d97acc34-a72a-49c2-83b5-379d3946c591-combined-ca-bundle\") pod \"openstack-cell1-galera-2\" (UID: \"d97acc34-a72a-49c2-83b5-379d3946c591\") " pod="openstack/openstack-cell1-galera-2" Oct 14 13:34:41.861719 master-2 kubenswrapper[4762]: I1014 13:34:41.861511 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d97acc34-a72a-49c2-83b5-379d3946c591-kolla-config\") pod \"openstack-cell1-galera-2\" (UID: \"d97acc34-a72a-49c2-83b5-379d3946c591\") " pod="openstack/openstack-cell1-galera-2" Oct 14 13:34:41.861719 master-2 kubenswrapper[4762]: I1014 13:34:41.861537 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d97acc34-a72a-49c2-83b5-379d3946c591-config-data-default\") pod \"openstack-cell1-galera-2\" (UID: \"d97acc34-a72a-49c2-83b5-379d3946c591\") " pod="openstack/openstack-cell1-galera-2" Oct 14 13:34:41.861719 master-2 kubenswrapper[4762]: I1014 13:34:41.861556 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/d97acc34-a72a-49c2-83b5-379d3946c591-config-data-generated\") pod \"openstack-cell1-galera-2\" (UID: \"d97acc34-a72a-49c2-83b5-379d3946c591\") " pod="openstack/openstack-cell1-galera-2" Oct 14 13:34:41.861719 master-2 kubenswrapper[4762]: I1014 13:34:41.861614 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jm242\" (UniqueName: \"kubernetes.io/projected/d97acc34-a72a-49c2-83b5-379d3946c591-kube-api-access-jm242\") pod \"openstack-cell1-galera-2\" (UID: \"d97acc34-a72a-49c2-83b5-379d3946c591\") " pod="openstack/openstack-cell1-galera-2" Oct 14 13:34:41.861719 master-2 kubenswrapper[4762]: I1014 13:34:41.861648 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3becd604-4a32-4caa-a1a5-bf33585edb2b\" (UniqueName: \"kubernetes.io/csi/topolvm.io^c07a4e0d-9dd8-457a-a3cc-e6d488aec92f\") pod \"openstack-cell1-galera-2\" (UID: \"d97acc34-a72a-49c2-83b5-379d3946c591\") " pod="openstack/openstack-cell1-galera-2" Oct 14 13:34:41.963425 master-2 kubenswrapper[4762]: I1014 13:34:41.963338 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d97acc34-a72a-49c2-83b5-379d3946c591-combined-ca-bundle\") pod \"openstack-cell1-galera-2\" (UID: \"d97acc34-a72a-49c2-83b5-379d3946c591\") " pod="openstack/openstack-cell1-galera-2" Oct 14 13:34:41.963609 master-2 kubenswrapper[4762]: I1014 13:34:41.963442 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d97acc34-a72a-49c2-83b5-379d3946c591-kolla-config\") pod \"openstack-cell1-galera-2\" (UID: \"d97acc34-a72a-49c2-83b5-379d3946c591\") " pod="openstack/openstack-cell1-galera-2" Oct 14 13:34:41.963609 master-2 kubenswrapper[4762]: I1014 13:34:41.963504 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d97acc34-a72a-49c2-83b5-379d3946c591-config-data-default\") pod \"openstack-cell1-galera-2\" (UID: \"d97acc34-a72a-49c2-83b5-379d3946c591\") " pod="openstack/openstack-cell1-galera-2" Oct 14 13:34:41.963609 master-2 kubenswrapper[4762]: I1014 13:34:41.963539 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d97acc34-a72a-49c2-83b5-379d3946c591-config-data-generated\") pod \"openstack-cell1-galera-2\" (UID: \"d97acc34-a72a-49c2-83b5-379d3946c591\") " pod="openstack/openstack-cell1-galera-2" Oct 14 13:34:41.963609 master-2 kubenswrapper[4762]: I1014 13:34:41.963593 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jm242\" (UniqueName: \"kubernetes.io/projected/d97acc34-a72a-49c2-83b5-379d3946c591-kube-api-access-jm242\") pod \"openstack-cell1-galera-2\" (UID: \"d97acc34-a72a-49c2-83b5-379d3946c591\") " pod="openstack/openstack-cell1-galera-2" Oct 14 13:34:41.963754 master-2 kubenswrapper[4762]: I1014 13:34:41.963672 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3becd604-4a32-4caa-a1a5-bf33585edb2b\" (UniqueName: \"kubernetes.io/csi/topolvm.io^c07a4e0d-9dd8-457a-a3cc-e6d488aec92f\") pod \"openstack-cell1-galera-2\" (UID: \"d97acc34-a72a-49c2-83b5-379d3946c591\") " pod="openstack/openstack-cell1-galera-2" Oct 14 13:34:41.963754 master-2 kubenswrapper[4762]: I1014 13:34:41.963737 
4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/d97acc34-a72a-49c2-83b5-379d3946c591-secrets\") pod \"openstack-cell1-galera-2\" (UID: \"d97acc34-a72a-49c2-83b5-379d3946c591\") " pod="openstack/openstack-cell1-galera-2" Oct 14 13:34:41.963819 master-2 kubenswrapper[4762]: I1014 13:34:41.963774 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d97acc34-a72a-49c2-83b5-379d3946c591-operator-scripts\") pod \"openstack-cell1-galera-2\" (UID: \"d97acc34-a72a-49c2-83b5-379d3946c591\") " pod="openstack/openstack-cell1-galera-2" Oct 14 13:34:41.963848 master-2 kubenswrapper[4762]: I1014 13:34:41.963812 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d97acc34-a72a-49c2-83b5-379d3946c591-galera-tls-certs\") pod \"openstack-cell1-galera-2\" (UID: \"d97acc34-a72a-49c2-83b5-379d3946c591\") " pod="openstack/openstack-cell1-galera-2" Oct 14 13:34:41.964492 master-2 kubenswrapper[4762]: I1014 13:34:41.964459 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d97acc34-a72a-49c2-83b5-379d3946c591-config-data-generated\") pod \"openstack-cell1-galera-2\" (UID: \"d97acc34-a72a-49c2-83b5-379d3946c591\") " pod="openstack/openstack-cell1-galera-2" Oct 14 13:34:41.964684 master-2 kubenswrapper[4762]: I1014 13:34:41.964636 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d97acc34-a72a-49c2-83b5-379d3946c591-config-data-default\") pod \"openstack-cell1-galera-2\" (UID: \"d97acc34-a72a-49c2-83b5-379d3946c591\") " pod="openstack/openstack-cell1-galera-2" Oct 14 13:34:41.964801 master-2 kubenswrapper[4762]: I1014 13:34:41.964741 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d97acc34-a72a-49c2-83b5-379d3946c591-kolla-config\") pod \"openstack-cell1-galera-2\" (UID: \"d97acc34-a72a-49c2-83b5-379d3946c591\") " pod="openstack/openstack-cell1-galera-2" Oct 14 13:34:41.966136 master-2 kubenswrapper[4762]: I1014 13:34:41.966103 4762 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 14 13:34:41.966712 master-2 kubenswrapper[4762]: I1014 13:34:41.966144 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3becd604-4a32-4caa-a1a5-bf33585edb2b\" (UniqueName: \"kubernetes.io/csi/topolvm.io^c07a4e0d-9dd8-457a-a3cc-e6d488aec92f\") pod \"openstack-cell1-galera-2\" (UID: \"d97acc34-a72a-49c2-83b5-379d3946c591\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/053050668e7aff2ab28638fd8e8e549acb0ac155620c5cf61217cd33ceb5176f/globalmount\"" pod="openstack/openstack-cell1-galera-2" Oct 14 13:34:41.966985 master-2 kubenswrapper[4762]: I1014 13:34:41.966946 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d97acc34-a72a-49c2-83b5-379d3946c591-operator-scripts\") pod \"openstack-cell1-galera-2\" (UID: \"d97acc34-a72a-49c2-83b5-379d3946c591\") " pod="openstack/openstack-cell1-galera-2" Oct 14 13:34:41.967453 master-2 kubenswrapper[4762]: I1014 13:34:41.967419 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d97acc34-a72a-49c2-83b5-379d3946c591-galera-tls-certs\") pod \"openstack-cell1-galera-2\" (UID: \"d97acc34-a72a-49c2-83b5-379d3946c591\") " pod="openstack/openstack-cell1-galera-2" Oct 14 13:34:41.967700 master-2 kubenswrapper[4762]: I1014 13:34:41.967655 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/d97acc34-a72a-49c2-83b5-379d3946c591-secrets\") pod \"openstack-cell1-galera-2\" (UID: \"d97acc34-a72a-49c2-83b5-379d3946c591\") " pod="openstack/openstack-cell1-galera-2" Oct 14 13:34:41.970305 master-2 kubenswrapper[4762]: I1014 13:34:41.970257 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d97acc34-a72a-49c2-83b5-379d3946c591-combined-ca-bundle\") pod \"openstack-cell1-galera-2\" (UID: \"d97acc34-a72a-49c2-83b5-379d3946c591\") " pod="openstack/openstack-cell1-galera-2" Oct 14 13:34:41.986046 master-2 kubenswrapper[4762]: I1014 13:34:41.985987 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jm242\" (UniqueName: \"kubernetes.io/projected/d97acc34-a72a-49c2-83b5-379d3946c591-kube-api-access-jm242\") pod \"openstack-cell1-galera-2\" (UID: \"d97acc34-a72a-49c2-83b5-379d3946c591\") " pod="openstack/openstack-cell1-galera-2" Oct 14 13:34:42.012314 master-2 kubenswrapper[4762]: I1014 13:34:42.012204 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jc2hh" Oct 14 13:34:42.166752 master-2 kubenswrapper[4762]: I1014 13:34:42.166695 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e95b136-ddd5-429b-a1b1-585d64385ae0-catalog-content\") pod \"4e95b136-ddd5-429b-a1b1-585d64385ae0\" (UID: \"4e95b136-ddd5-429b-a1b1-585d64385ae0\") " Oct 14 13:34:42.166939 master-2 kubenswrapper[4762]: I1014 13:34:42.166782 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e95b136-ddd5-429b-a1b1-585d64385ae0-utilities\") pod \"4e95b136-ddd5-429b-a1b1-585d64385ae0\" (UID: \"4e95b136-ddd5-429b-a1b1-585d64385ae0\") " Oct 14 13:34:42.166939 master-2 kubenswrapper[4762]: I1014 13:34:42.166820 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9khts\" (UniqueName: \"kubernetes.io/projected/4e95b136-ddd5-429b-a1b1-585d64385ae0-kube-api-access-9khts\") pod \"4e95b136-ddd5-429b-a1b1-585d64385ae0\" (UID: \"4e95b136-ddd5-429b-a1b1-585d64385ae0\") " Oct 14 13:34:42.167593 master-2 kubenswrapper[4762]: I1014 13:34:42.167561 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e95b136-ddd5-429b-a1b1-585d64385ae0-utilities" (OuterVolumeSpecName: "utilities") pod "4e95b136-ddd5-429b-a1b1-585d64385ae0" (UID: "4e95b136-ddd5-429b-a1b1-585d64385ae0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:34:42.170197 master-2 kubenswrapper[4762]: I1014 13:34:42.170114 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e95b136-ddd5-429b-a1b1-585d64385ae0-kube-api-access-9khts" (OuterVolumeSpecName: "kube-api-access-9khts") pod "4e95b136-ddd5-429b-a1b1-585d64385ae0" (UID: "4e95b136-ddd5-429b-a1b1-585d64385ae0"). InnerVolumeSpecName "kube-api-access-9khts". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:34:42.258126 master-2 kubenswrapper[4762]: I1014 13:34:42.258047 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e95b136-ddd5-429b-a1b1-585d64385ae0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4e95b136-ddd5-429b-a1b1-585d64385ae0" (UID: "4e95b136-ddd5-429b-a1b1-585d64385ae0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:34:42.268613 master-2 kubenswrapper[4762]: I1014 13:34:42.268527 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e95b136-ddd5-429b-a1b1-585d64385ae0-catalog-content\") on node \"master-2\" DevicePath \"\"" Oct 14 13:34:42.268613 master-2 kubenswrapper[4762]: I1014 13:34:42.268571 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e95b136-ddd5-429b-a1b1-585d64385ae0-utilities\") on node \"master-2\" DevicePath \"\"" Oct 14 13:34:42.268613 master-2 kubenswrapper[4762]: I1014 13:34:42.268585 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9khts\" (UniqueName: \"kubernetes.io/projected/4e95b136-ddd5-429b-a1b1-585d64385ae0-kube-api-access-9khts\") on node \"master-2\" DevicePath \"\"" Oct 14 13:34:42.539459 master-2 kubenswrapper[4762]: I1014 13:34:42.535083 4762 generic.go:334] "Generic (PLEG): container finished" podID="4e95b136-ddd5-429b-a1b1-585d64385ae0" containerID="44c3742b553cd9ea420f04010288cc7500aabad443c65b80592599b5f59c1b5a" exitCode=0 Oct 14 13:34:42.539459 master-2 kubenswrapper[4762]: I1014 13:34:42.535215 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jc2hh" Oct 14 13:34:42.539459 master-2 kubenswrapper[4762]: I1014 13:34:42.535213 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jc2hh" event={"ID":"4e95b136-ddd5-429b-a1b1-585d64385ae0","Type":"ContainerDied","Data":"44c3742b553cd9ea420f04010288cc7500aabad443c65b80592599b5f59c1b5a"} Oct 14 13:34:42.539459 master-2 kubenswrapper[4762]: I1014 13:34:42.535388 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jc2hh" event={"ID":"4e95b136-ddd5-429b-a1b1-585d64385ae0","Type":"ContainerDied","Data":"317f77e039ed406f05e66bd4350d53e76248cc3447e6237bba447515cc44739e"} Oct 14 13:34:42.539459 master-2 kubenswrapper[4762]: I1014 13:34:42.535446 4762 scope.go:117] "RemoveContainer" containerID="44c3742b553cd9ea420f04010288cc7500aabad443c65b80592599b5f59c1b5a" Oct 14 13:34:42.539459 master-2 kubenswrapper[4762]: I1014 13:34:42.536114 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-rxkr2" Oct 14 13:34:42.576355 master-2 kubenswrapper[4762]: I1014 13:34:42.575125 4762 scope.go:117] "RemoveContainer" containerID="ff9fa5e8c5e6baf2ebf596c95e8a821ff03f6032f7df38f2832b079bd17efc1a" Oct 14 13:34:42.590951 master-2 kubenswrapper[4762]: I1014 13:34:42.590826 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jc2hh"] Oct 14 13:34:42.602507 master-2 kubenswrapper[4762]: I1014 13:34:42.602269 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jc2hh"] Oct 14 13:34:42.608466 master-2 kubenswrapper[4762]: I1014 13:34:42.608425 4762 scope.go:117] "RemoveContainer" containerID="7779af00cd3dba96282ee49ba8d4a2df1669b3b8682c05866ceb4d2b7ef1991e" Oct 14 13:34:42.634029 master-2 kubenswrapper[4762]: I1014 13:34:42.633986 4762 scope.go:117] "RemoveContainer" containerID="44c3742b553cd9ea420f04010288cc7500aabad443c65b80592599b5f59c1b5a" Oct 14 13:34:42.634507 master-2 kubenswrapper[4762]: E1014 13:34:42.634468 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"44c3742b553cd9ea420f04010288cc7500aabad443c65b80592599b5f59c1b5a\": container with ID starting with 44c3742b553cd9ea420f04010288cc7500aabad443c65b80592599b5f59c1b5a not found: ID does not exist" containerID="44c3742b553cd9ea420f04010288cc7500aabad443c65b80592599b5f59c1b5a" Oct 14 13:34:42.634575 master-2 kubenswrapper[4762]: I1014 13:34:42.634519 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44c3742b553cd9ea420f04010288cc7500aabad443c65b80592599b5f59c1b5a"} err="failed to get container status \"44c3742b553cd9ea420f04010288cc7500aabad443c65b80592599b5f59c1b5a\": rpc error: code = NotFound desc = could not find container \"44c3742b553cd9ea420f04010288cc7500aabad443c65b80592599b5f59c1b5a\": container with ID starting with 44c3742b553cd9ea420f04010288cc7500aabad443c65b80592599b5f59c1b5a not found: ID does not exist" Oct 14 13:34:42.634575 master-2 kubenswrapper[4762]: I1014 13:34:42.634547 4762 scope.go:117] "RemoveContainer" containerID="ff9fa5e8c5e6baf2ebf596c95e8a821ff03f6032f7df38f2832b079bd17efc1a" Oct 14 13:34:42.635093 master-2 kubenswrapper[4762]: E1014 13:34:42.635041 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff9fa5e8c5e6baf2ebf596c95e8a821ff03f6032f7df38f2832b079bd17efc1a\": container with ID starting with ff9fa5e8c5e6baf2ebf596c95e8a821ff03f6032f7df38f2832b079bd17efc1a not found: ID does not exist" containerID="ff9fa5e8c5e6baf2ebf596c95e8a821ff03f6032f7df38f2832b079bd17efc1a" Oct 14 13:34:42.635172 master-2 kubenswrapper[4762]: I1014 13:34:42.635092 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff9fa5e8c5e6baf2ebf596c95e8a821ff03f6032f7df38f2832b079bd17efc1a"} err="failed to get container status \"ff9fa5e8c5e6baf2ebf596c95e8a821ff03f6032f7df38f2832b079bd17efc1a\": rpc error: code = NotFound desc = could not find container \"ff9fa5e8c5e6baf2ebf596c95e8a821ff03f6032f7df38f2832b079bd17efc1a\": container with ID starting with ff9fa5e8c5e6baf2ebf596c95e8a821ff03f6032f7df38f2832b079bd17efc1a not found: ID does not exist" Oct 14 13:34:42.635172 master-2 kubenswrapper[4762]: I1014 13:34:42.635127 4762 scope.go:117] "RemoveContainer" containerID="7779af00cd3dba96282ee49ba8d4a2df1669b3b8682c05866ceb4d2b7ef1991e" Oct 14 13:34:42.635557 master-2 kubenswrapper[4762]: E1014 13:34:42.635518 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7779af00cd3dba96282ee49ba8d4a2df1669b3b8682c05866ceb4d2b7ef1991e\": container with ID starting with 7779af00cd3dba96282ee49ba8d4a2df1669b3b8682c05866ceb4d2b7ef1991e not found: ID does not exist" containerID="7779af00cd3dba96282ee49ba8d4a2df1669b3b8682c05866ceb4d2b7ef1991e" Oct 14 13:34:42.635620 master-2 kubenswrapper[4762]: I1014 13:34:42.635551 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7779af00cd3dba96282ee49ba8d4a2df1669b3b8682c05866ceb4d2b7ef1991e"} err="failed to get container status \"7779af00cd3dba96282ee49ba8d4a2df1669b3b8682c05866ceb4d2b7ef1991e\": rpc error: code = NotFound desc = could not find container \"7779af00cd3dba96282ee49ba8d4a2df1669b3b8682c05866ceb4d2b7ef1991e\": container with ID starting with 7779af00cd3dba96282ee49ba8d4a2df1669b3b8682c05866ceb4d2b7ef1991e not found: ID does not exist" Oct 14 13:34:43.116103 master-2 kubenswrapper[4762]: I1014 13:34:43.116057 4762 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"pvc-3becd604-4a32-4caa-a1a5-bf33585edb2b\" (UniqueName: \"kubernetes.io/csi/topolvm.io^c07a4e0d-9dd8-457a-a3cc-e6d488aec92f\") pod \"openstack-cell1-galera-2\" (UID: \"d97acc34-a72a-49c2-83b5-379d3946c591\") " pod="openstack/openstack-cell1-galera-2" Oct 14 13:34:43.224738 master-2 kubenswrapper[4762]: I1014 13:34:43.224676 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-2" Oct 14 13:34:43.543359 master-2 kubenswrapper[4762]: I1014 13:34:43.543288 4762 generic.go:334] "Generic (PLEG): container finished" podID="d4485526-8eb4-41e6-a1bc-68754346b4f1" containerID="7b8c8548b478d80039a280943988aa4196e77e730c107ebf544ac81ecab85440" exitCode=0 Oct 14 13:34:43.543359 master-2 kubenswrapper[4762]: I1014 13:34:43.543362 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-1" event={"ID":"d4485526-8eb4-41e6-a1bc-68754346b4f1","Type":"ContainerDied","Data":"7b8c8548b478d80039a280943988aa4196e77e730c107ebf544ac81ecab85440"} Oct 14 13:34:43.558480 master-2 kubenswrapper[4762]: I1014 13:34:43.558421 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e95b136-ddd5-429b-a1b1-585d64385ae0" path="/var/lib/kubelet/pods/4e95b136-ddd5-429b-a1b1-585d64385ae0/volumes" Oct 14 13:34:43.646247 master-2 kubenswrapper[4762]: I1014 13:34:43.646202 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-2"] Oct 14 13:34:43.652793 master-2 kubenswrapper[4762]: W1014 13:34:43.652753 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd97acc34_a72a_49c2_83b5_379d3946c591.slice/crio-51b29f11d9116b37092fb1eaf2fe2c8bba6d980408000ec2c229ab83815fa25e WatchSource:0}: Error finding container 51b29f11d9116b37092fb1eaf2fe2c8bba6d980408000ec2c229ab83815fa25e: Status 404 returned error can't find the container with id 51b29f11d9116b37092fb1eaf2fe2c8bba6d980408000ec2c229ab83815fa25e Oct 14 13:34:44.558333 master-2 kubenswrapper[4762]: I1014 13:34:44.558272 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-2" event={"ID":"d97acc34-a72a-49c2-83b5-379d3946c591","Type":"ContainerStarted","Data":"51b29f11d9116b37092fb1eaf2fe2c8bba6d980408000ec2c229ab83815fa25e"} Oct 14 13:34:46.579339 master-2 kubenswrapper[4762]: I1014 13:34:46.579243 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-2" event={"ID":"d97acc34-a72a-49c2-83b5-379d3946c591","Type":"ContainerStarted","Data":"0cd5d05325a2a1e72a144105dd21080b3c047e91dd7ad3ba5e509d1915edb3a5"} Oct 14 13:34:46.582449 master-2 kubenswrapper[4762]: I1014 13:34:46.582355 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-1" event={"ID":"b91e22ed-012b-43db-bd24-7634ae6d22a9","Type":"ContainerStarted","Data":"c9e5d62e4a67c7ef1757082e81f38af58ab743539c0d8d6fb3c899fd11f2a56d"} Oct 14 13:34:47.598446 master-2 kubenswrapper[4762]: I1014 13:34:47.598252 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-1" event={"ID":"d4485526-8eb4-41e6-a1bc-68754346b4f1","Type":"ContainerStarted","Data":"75f4561f91789a004fa71f155d2b3b29bac1fd42f54712940162cc6599520232"} Oct 14 13:34:47.620784 master-2 kubenswrapper[4762]: I1014 13:34:47.619415 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"] Oct 14 13:34:47.620784 master-2 
kubenswrapper[4762]: E1014 13:34:47.619819 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e95b136-ddd5-429b-a1b1-585d64385ae0" containerName="extract-utilities" Oct 14 13:34:47.620784 master-2 kubenswrapper[4762]: I1014 13:34:47.619838 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e95b136-ddd5-429b-a1b1-585d64385ae0" containerName="extract-utilities" Oct 14 13:34:47.620784 master-2 kubenswrapper[4762]: E1014 13:34:47.619893 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e95b136-ddd5-429b-a1b1-585d64385ae0" containerName="extract-content" Oct 14 13:34:47.620784 master-2 kubenswrapper[4762]: I1014 13:34:47.619905 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e95b136-ddd5-429b-a1b1-585d64385ae0" containerName="extract-content" Oct 14 13:34:47.620784 master-2 kubenswrapper[4762]: E1014 13:34:47.619922 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e95b136-ddd5-429b-a1b1-585d64385ae0" containerName="registry-server" Oct 14 13:34:47.620784 master-2 kubenswrapper[4762]: I1014 13:34:47.619935 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e95b136-ddd5-429b-a1b1-585d64385ae0" containerName="registry-server" Oct 14 13:34:47.620784 master-2 kubenswrapper[4762]: I1014 13:34:47.620130 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e95b136-ddd5-429b-a1b1-585d64385ae0" containerName="registry-server" Oct 14 13:34:47.621474 master-2 kubenswrapper[4762]: I1014 13:34:47.621424 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Oct 14 13:34:47.624700 master-2 kubenswrapper[4762]: I1014 13:34:47.624580 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 14 13:34:47.624894 master-2 kubenswrapper[4762]: I1014 13:34:47.624858 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 14 13:34:47.628807 master-2 kubenswrapper[4762]: I1014 13:34:47.628745 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 14 13:34:47.792637 master-2 kubenswrapper[4762]: I1014 13:34:47.792577 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Oct 14 13:34:47.876837 master-2 kubenswrapper[4762]: I1014 13:34:47.876662 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-2"] Oct 14 13:34:47.877653 master-2 kubenswrapper[4762]: I1014 13:34:47.877612 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-2" Oct 14 13:34:47.880629 master-2 kubenswrapper[4762]: I1014 13:34:47.880586 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 14 13:34:47.881244 master-2 kubenswrapper[4762]: I1014 13:34:47.881195 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Oct 14 13:34:48.035189 master-2 kubenswrapper[4762]: I1014 13:34:48.035104 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-2"] Oct 14 13:34:48.578118 master-2 kubenswrapper[4762]: I1014 13:34:48.577995 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a1c38eed-da70-41be-a3af-0115798752a4-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"a1c38eed-da70-41be-a3af-0115798752a4\") " pod="openstack/ovsdbserver-nb-1" Oct 14 13:34:48.578118 master-2 kubenswrapper[4762]: I1014 13:34:48.578111 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fd5d49fa-daf0-4081-bff5-b449c199a5e2-kolla-config\") pod \"memcached-2\" (UID: \"fd5d49fa-daf0-4081-bff5-b449c199a5e2\") " pod="openstack/memcached-2" Oct 14 13:34:48.578118 master-2 kubenswrapper[4762]: I1014 13:34:48.578134 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd5d49fa-daf0-4081-bff5-b449c199a5e2-combined-ca-bundle\") pod \"memcached-2\" (UID: \"fd5d49fa-daf0-4081-bff5-b449c199a5e2\") " pod="openstack/memcached-2" Oct 14 13:34:48.578763 master-2 kubenswrapper[4762]: I1014 13:34:48.578315 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fd5d49fa-daf0-4081-bff5-b449c199a5e2-config-data\") pod \"memcached-2\" (UID: \"fd5d49fa-daf0-4081-bff5-b449c199a5e2\") " pod="openstack/memcached-2" Oct 14 13:34:48.578763 master-2 kubenswrapper[4762]: I1014 13:34:48.578450 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a1c38eed-da70-41be-a3af-0115798752a4-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"a1c38eed-da70-41be-a3af-0115798752a4\") " pod="openstack/ovsdbserver-nb-1" Oct 14 13:34:48.578763 master-2 kubenswrapper[4762]: I1014 13:34:48.578518 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44kzk\" (UniqueName: \"kubernetes.io/projected/a1c38eed-da70-41be-a3af-0115798752a4-kube-api-access-44kzk\") pod \"ovsdbserver-nb-1\" (UID: \"a1c38eed-da70-41be-a3af-0115798752a4\") " pod="openstack/ovsdbserver-nb-1" Oct 14 13:34:48.578763 master-2 kubenswrapper[4762]: I1014 13:34:48.578568 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8dm7\" (UniqueName: \"kubernetes.io/projected/fd5d49fa-daf0-4081-bff5-b449c199a5e2-kube-api-access-w8dm7\") pod \"memcached-2\" (UID: \"fd5d49fa-daf0-4081-bff5-b449c199a5e2\") " pod="openstack/memcached-2" Oct 14 13:34:48.578763 master-2 kubenswrapper[4762]: I1014 13:34:48.578593 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a1c38eed-da70-41be-a3af-0115798752a4-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"a1c38eed-da70-41be-a3af-0115798752a4\") " pod="openstack/ovsdbserver-nb-1" Oct 14 13:34:48.579280 master-2 kubenswrapper[4762]: I1014 13:34:48.578769 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1c38eed-da70-41be-a3af-0115798752a4-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"a1c38eed-da70-41be-a3af-0115798752a4\") " pod="openstack/ovsdbserver-nb-1" Oct 14 13:34:48.579280 master-2 kubenswrapper[4762]: I1014 13:34:48.578952 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c11bfad7-a392-4fab-ae83-fc0e0bbde680\" (UniqueName: \"kubernetes.io/csi/topolvm.io^9dab28f2-a5b1-49b7-a99d-cf1bc525d21a\") pod \"ovsdbserver-nb-1\" (UID: \"a1c38eed-da70-41be-a3af-0115798752a4\") " pod="openstack/ovsdbserver-nb-1" Oct 14 13:34:48.579280 master-2 kubenswrapper[4762]: I1014 13:34:48.579115 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1c38eed-da70-41be-a3af-0115798752a4-config\") pod \"ovsdbserver-nb-1\" (UID: \"a1c38eed-da70-41be-a3af-0115798752a4\") " pod="openstack/ovsdbserver-nb-1" Oct 14 13:34:48.579280 master-2 kubenswrapper[4762]: I1014 13:34:48.579265 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd5d49fa-daf0-4081-bff5-b449c199a5e2-memcached-tls-certs\") pod \"memcached-2\" (UID: \"fd5d49fa-daf0-4081-bff5-b449c199a5e2\") " pod="openstack/memcached-2" Oct 14 13:34:48.579649 master-2 kubenswrapper[4762]: I1014 13:34:48.579368 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1c38eed-da70-41be-a3af-0115798752a4-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"a1c38eed-da70-41be-a3af-0115798752a4\") " pod="openstack/ovsdbserver-nb-1" Oct 14 13:34:48.681412 master-2 kubenswrapper[4762]: I1014 13:34:48.681326 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1c38eed-da70-41be-a3af-0115798752a4-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"a1c38eed-da70-41be-a3af-0115798752a4\") " pod="openstack/ovsdbserver-nb-1" Oct 14 13:34:48.681412 master-2 kubenswrapper[4762]: I1014 13:34:48.681400 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8dm7\" (UniqueName: \"kubernetes.io/projected/fd5d49fa-daf0-4081-bff5-b449c199a5e2-kube-api-access-w8dm7\") pod \"memcached-2\" (UID: \"fd5d49fa-daf0-4081-bff5-b449c199a5e2\") " pod="openstack/memcached-2" Oct 14 13:34:48.682397 master-2 kubenswrapper[4762]: I1014 13:34:48.681456 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1c38eed-da70-41be-a3af-0115798752a4-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"a1c38eed-da70-41be-a3af-0115798752a4\") " pod="openstack/ovsdbserver-nb-1" Oct 14 13:34:48.682397 master-2 kubenswrapper[4762]: I1014 13:34:48.681537 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/a1c38eed-da70-41be-a3af-0115798752a4-config\") pod \"ovsdbserver-nb-1\" (UID: \"a1c38eed-da70-41be-a3af-0115798752a4\") " pod="openstack/ovsdbserver-nb-1" Oct 14 13:34:48.682397 master-2 kubenswrapper[4762]: I1014 13:34:48.681588 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd5d49fa-daf0-4081-bff5-b449c199a5e2-memcached-tls-certs\") pod \"memcached-2\" (UID: \"fd5d49fa-daf0-4081-bff5-b449c199a5e2\") " pod="openstack/memcached-2" Oct 14 13:34:48.682397 master-2 kubenswrapper[4762]: I1014 13:34:48.681637 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1c38eed-da70-41be-a3af-0115798752a4-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"a1c38eed-da70-41be-a3af-0115798752a4\") " pod="openstack/ovsdbserver-nb-1" Oct 14 13:34:48.682397 master-2 kubenswrapper[4762]: I1014 13:34:48.681708 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a1c38eed-da70-41be-a3af-0115798752a4-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"a1c38eed-da70-41be-a3af-0115798752a4\") " pod="openstack/ovsdbserver-nb-1" Oct 14 13:34:48.682397 master-2 kubenswrapper[4762]: I1014 13:34:48.681752 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fd5d49fa-daf0-4081-bff5-b449c199a5e2-kolla-config\") pod \"memcached-2\" (UID: \"fd5d49fa-daf0-4081-bff5-b449c199a5e2\") " pod="openstack/memcached-2" Oct 14 13:34:48.682397 master-2 kubenswrapper[4762]: I1014 13:34:48.681782 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd5d49fa-daf0-4081-bff5-b449c199a5e2-combined-ca-bundle\") pod \"memcached-2\" (UID: \"fd5d49fa-daf0-4081-bff5-b449c199a5e2\") " pod="openstack/memcached-2" Oct 14 13:34:48.682965 master-2 kubenswrapper[4762]: I1014 13:34:48.682517 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a1c38eed-da70-41be-a3af-0115798752a4-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"a1c38eed-da70-41be-a3af-0115798752a4\") " pod="openstack/ovsdbserver-nb-1" Oct 14 13:34:48.682965 master-2 kubenswrapper[4762]: I1014 13:34:48.682899 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1c38eed-da70-41be-a3af-0115798752a4-config\") pod \"ovsdbserver-nb-1\" (UID: \"a1c38eed-da70-41be-a3af-0115798752a4\") " pod="openstack/ovsdbserver-nb-1" Oct 14 13:34:48.682965 master-2 kubenswrapper[4762]: I1014 13:34:48.682932 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fd5d49fa-daf0-4081-bff5-b449c199a5e2-kolla-config\") pod \"memcached-2\" (UID: \"fd5d49fa-daf0-4081-bff5-b449c199a5e2\") " pod="openstack/memcached-2" Oct 14 13:34:48.683340 master-2 kubenswrapper[4762]: I1014 13:34:48.683063 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fd5d49fa-daf0-4081-bff5-b449c199a5e2-config-data\") pod \"memcached-2\" (UID: \"fd5d49fa-daf0-4081-bff5-b449c199a5e2\") " pod="openstack/memcached-2" Oct 14 13:34:48.683340 master-2 kubenswrapper[4762]: I1014 
13:34:48.683194 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a1c38eed-da70-41be-a3af-0115798752a4-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"a1c38eed-da70-41be-a3af-0115798752a4\") " pod="openstack/ovsdbserver-nb-1" Oct 14 13:34:48.683829 master-2 kubenswrapper[4762]: I1014 13:34:48.683778 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fd5d49fa-daf0-4081-bff5-b449c199a5e2-config-data\") pod \"memcached-2\" (UID: \"fd5d49fa-daf0-4081-bff5-b449c199a5e2\") " pod="openstack/memcached-2" Oct 14 13:34:48.684730 master-2 kubenswrapper[4762]: I1014 13:34:48.684671 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a1c38eed-da70-41be-a3af-0115798752a4-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"a1c38eed-da70-41be-a3af-0115798752a4\") " pod="openstack/ovsdbserver-nb-1" Oct 14 13:34:48.684926 master-2 kubenswrapper[4762]: I1014 13:34:48.684875 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44kzk\" (UniqueName: \"kubernetes.io/projected/a1c38eed-da70-41be-a3af-0115798752a4-kube-api-access-44kzk\") pod \"ovsdbserver-nb-1\" (UID: \"a1c38eed-da70-41be-a3af-0115798752a4\") " pod="openstack/ovsdbserver-nb-1" Oct 14 13:34:48.688693 master-2 kubenswrapper[4762]: I1014 13:34:48.688615 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1c38eed-da70-41be-a3af-0115798752a4-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"a1c38eed-da70-41be-a3af-0115798752a4\") " pod="openstack/ovsdbserver-nb-1" Oct 14 13:34:48.691215 master-2 kubenswrapper[4762]: I1014 13:34:48.691127 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd5d49fa-daf0-4081-bff5-b449c199a5e2-combined-ca-bundle\") pod \"memcached-2\" (UID: \"fd5d49fa-daf0-4081-bff5-b449c199a5e2\") " pod="openstack/memcached-2" Oct 14 13:34:48.699144 master-2 kubenswrapper[4762]: I1014 13:34:48.692976 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1c38eed-da70-41be-a3af-0115798752a4-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"a1c38eed-da70-41be-a3af-0115798752a4\") " pod="openstack/ovsdbserver-nb-1" Oct 14 13:34:48.699144 master-2 kubenswrapper[4762]: I1014 13:34:48.694427 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd5d49fa-daf0-4081-bff5-b449c199a5e2-memcached-tls-certs\") pod \"memcached-2\" (UID: \"fd5d49fa-daf0-4081-bff5-b449c199a5e2\") " pod="openstack/memcached-2" Oct 14 13:34:48.699144 master-2 kubenswrapper[4762]: I1014 13:34:48.696455 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1c38eed-da70-41be-a3af-0115798752a4-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"a1c38eed-da70-41be-a3af-0115798752a4\") " pod="openstack/ovsdbserver-nb-1" Oct 14 13:34:48.786832 master-2 kubenswrapper[4762]: I1014 13:34:48.786765 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c11bfad7-a392-4fab-ae83-fc0e0bbde680\" (UniqueName: 
\"kubernetes.io/csi/topolvm.io^9dab28f2-a5b1-49b7-a99d-cf1bc525d21a\") pod \"ovsdbserver-nb-1\" (UID: \"a1c38eed-da70-41be-a3af-0115798752a4\") " pod="openstack/ovsdbserver-nb-1" Oct 14 13:34:48.788832 master-2 kubenswrapper[4762]: I1014 13:34:48.788786 4762 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 14 13:34:48.788832 master-2 kubenswrapper[4762]: I1014 13:34:48.788826 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c11bfad7-a392-4fab-ae83-fc0e0bbde680\" (UniqueName: \"kubernetes.io/csi/topolvm.io^9dab28f2-a5b1-49b7-a99d-cf1bc525d21a\") pod \"ovsdbserver-nb-1\" (UID: \"a1c38eed-da70-41be-a3af-0115798752a4\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/39ef519936fe31ce5a3da635ca5afa233ccc70ab1be794feb4248dae3416a8ed/globalmount\"" pod="openstack/ovsdbserver-nb-1" Oct 14 13:34:48.793803 master-2 kubenswrapper[4762]: I1014 13:34:48.793742 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8dm7\" (UniqueName: \"kubernetes.io/projected/fd5d49fa-daf0-4081-bff5-b449c199a5e2-kube-api-access-w8dm7\") pod \"memcached-2\" (UID: \"fd5d49fa-daf0-4081-bff5-b449c199a5e2\") " pod="openstack/memcached-2" Oct 14 13:34:48.795122 master-2 kubenswrapper[4762]: I1014 13:34:48.795060 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44kzk\" (UniqueName: \"kubernetes.io/projected/a1c38eed-da70-41be-a3af-0115798752a4-kube-api-access-44kzk\") pod \"ovsdbserver-nb-1\" (UID: \"a1c38eed-da70-41be-a3af-0115798752a4\") " pod="openstack/ovsdbserver-nb-1" Oct 14 13:34:48.799095 master-2 kubenswrapper[4762]: I1014 13:34:48.799045 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-2" Oct 14 13:34:49.280781 master-2 kubenswrapper[4762]: I1014 13:34:49.280716 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-2"] Oct 14 13:34:49.283626 master-2 kubenswrapper[4762]: W1014 13:34:49.283544 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd5d49fa_daf0_4081_bff5_b449c199a5e2.slice/crio-d7dcd3d5749b63bbffeeb32b670bda6c025d67b37582c0d06d6c1d758d021c31 WatchSource:0}: Error finding container d7dcd3d5749b63bbffeeb32b670bda6c025d67b37582c0d06d6c1d758d021c31: Status 404 returned error can't find the container with id d7dcd3d5749b63bbffeeb32b670bda6c025d67b37582c0d06d6c1d758d021c31 Oct 14 13:34:49.617804 master-2 kubenswrapper[4762]: I1014 13:34:49.615143 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-2" event={"ID":"fd5d49fa-daf0-4081-bff5-b449c199a5e2","Type":"ContainerStarted","Data":"d7dcd3d5749b63bbffeeb32b670bda6c025d67b37582c0d06d6c1d758d021c31"} Oct 14 13:34:50.248886 master-2 kubenswrapper[4762]: I1014 13:34:50.246944 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c11bfad7-a392-4fab-ae83-fc0e0bbde680\" (UniqueName: \"kubernetes.io/csi/topolvm.io^9dab28f2-a5b1-49b7-a99d-cf1bc525d21a\") pod \"ovsdbserver-nb-1\" (UID: \"a1c38eed-da70-41be-a3af-0115798752a4\") " pod="openstack/ovsdbserver-nb-1" Oct 14 13:34:50.650196 master-2 kubenswrapper[4762]: I1014 13:34:50.646526 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-1" event={"ID":"d4485526-8eb4-41e6-a1bc-68754346b4f1","Type":"ContainerStarted","Data":"738e56ef091edd8827daa8d52dad0235de0790db111180dc354b98460accc42f"} Oct 14 13:34:50.650196 master-2 kubenswrapper[4762]: I1014 13:34:50.649011 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-1" Oct 14 13:34:50.650196 master-2 kubenswrapper[4762]: I1014 13:34:50.649904 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-1" Oct 14 13:34:51.358037 master-2 kubenswrapper[4762]: I1014 13:34:51.357819 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-1" podStartSLOduration=7.428938047 podStartE2EDuration="30.357795514s" podCreationTimestamp="2025-10-14 13:34:21 +0000 UTC" firstStartedPulling="2025-10-14 13:34:24.414873915 +0000 UTC m=+1693.659033074" lastFinishedPulling="2025-10-14 13:34:47.343731372 +0000 UTC m=+1716.587890541" observedRunningTime="2025-10-14 13:34:50.687892137 +0000 UTC m=+1719.932051326" watchObservedRunningTime="2025-10-14 13:34:51.357795514 +0000 UTC m=+1720.601954693" Oct 14 13:34:51.576860 master-2 kubenswrapper[4762]: I1014 13:34:51.576778 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-1" Oct 14 13:34:51.669293 master-2 kubenswrapper[4762]: I1014 13:34:51.669223 4762 generic.go:334] "Generic (PLEG): container finished" podID="d97acc34-a72a-49c2-83b5-379d3946c591" containerID="0cd5d05325a2a1e72a144105dd21080b3c047e91dd7ad3ba5e509d1915edb3a5" exitCode=0 Oct 14 13:34:51.669293 master-2 kubenswrapper[4762]: I1014 13:34:51.669301 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-2" event={"ID":"d97acc34-a72a-49c2-83b5-379d3946c591","Type":"ContainerDied","Data":"0cd5d05325a2a1e72a144105dd21080b3c047e91dd7ad3ba5e509d1915edb3a5"} Oct 14 13:34:51.672093 master-2 kubenswrapper[4762]: I1014 13:34:51.672059 4762 generic.go:334] "Generic (PLEG): container finished" podID="b91e22ed-012b-43db-bd24-7634ae6d22a9" containerID="c9e5d62e4a67c7ef1757082e81f38af58ab743539c0d8d6fb3c899fd11f2a56d" exitCode=0 Oct 14 13:34:51.672718 master-2 kubenswrapper[4762]: I1014 13:34:51.672673 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-1" event={"ID":"b91e22ed-012b-43db-bd24-7634ae6d22a9","Type":"ContainerDied","Data":"c9e5d62e4a67c7ef1757082e81f38af58ab743539c0d8d6fb3c899fd11f2a56d"} Oct 14 13:34:52.687997 master-2 kubenswrapper[4762]: I1014 13:34:52.687832 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-2" event={"ID":"d97acc34-a72a-49c2-83b5-379d3946c591","Type":"ContainerStarted","Data":"0acc34979e6891c1e5a6e967ad9b2593236f0d7818ed46948fc1b0ffcbdd8d50"} Oct 14 13:34:52.692713 master-2 kubenswrapper[4762]: I1014 13:34:52.691603 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-1" event={"ID":"b91e22ed-012b-43db-bd24-7634ae6d22a9","Type":"ContainerStarted","Data":"36cd02d49010105adbbe09d8107becab773cb344189181aa737b671cdd832083"} Oct 14 13:34:53.225720 master-2 kubenswrapper[4762]: I1014 13:34:53.225621 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-2" Oct 14 13:34:53.225720 master-2 kubenswrapper[4762]: I1014 13:34:53.225705 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-2" Oct 14 13:34:53.636710 master-2 kubenswrapper[4762]: I1014 13:34:53.636538 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"] Oct 14 13:34:53.638355 master-2 kubenswrapper[4762]: I1014 13:34:53.638317 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-1" Oct 14 13:34:53.641843 master-2 kubenswrapper[4762]: I1014 13:34:53.641784 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 14 13:34:53.642073 master-2 kubenswrapper[4762]: I1014 13:34:53.642016 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Oct 14 13:34:53.643190 master-2 kubenswrapper[4762]: I1014 13:34:53.643167 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 14 13:34:53.697570 master-2 kubenswrapper[4762]: I1014 13:34:53.697443 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Oct 14 13:34:54.592243 master-2 kubenswrapper[4762]: I1014 13:34:54.592124 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9888e427-2968-459c-a41b-2ff9f687b867-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"9888e427-2968-459c-a41b-2ff9f687b867\") " pod="openstack/ovsdbserver-sb-1" Oct 14 13:34:54.592511 master-2 kubenswrapper[4762]: I1014 13:34:54.592248 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9888e427-2968-459c-a41b-2ff9f687b867-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"9888e427-2968-459c-a41b-2ff9f687b867\") " pod="openstack/ovsdbserver-sb-1" Oct 14 13:34:54.592511 master-2 kubenswrapper[4762]: I1014 13:34:54.592327 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9888e427-2968-459c-a41b-2ff9f687b867-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"9888e427-2968-459c-a41b-2ff9f687b867\") " pod="openstack/ovsdbserver-sb-1" Oct 14 13:34:54.592511 master-2 kubenswrapper[4762]: I1014 13:34:54.592415 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ece6a90b-36ff-476a-a72d-5bcbaee948eb\" (UniqueName: \"kubernetes.io/csi/topolvm.io^8f8a3e72-08a9-40d6-abef-8deb36fa4805\") pod \"ovsdbserver-sb-1\" (UID: \"9888e427-2968-459c-a41b-2ff9f687b867\") " pod="openstack/ovsdbserver-sb-1" Oct 14 13:34:54.592638 master-2 kubenswrapper[4762]: I1014 13:34:54.592587 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7jvf\" (UniqueName: \"kubernetes.io/projected/9888e427-2968-459c-a41b-2ff9f687b867-kube-api-access-q7jvf\") pod \"ovsdbserver-sb-1\" (UID: \"9888e427-2968-459c-a41b-2ff9f687b867\") " pod="openstack/ovsdbserver-sb-1" Oct 14 13:34:54.592740 master-2 kubenswrapper[4762]: I1014 13:34:54.592714 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9888e427-2968-459c-a41b-2ff9f687b867-config\") pod \"ovsdbserver-sb-1\" (UID: \"9888e427-2968-459c-a41b-2ff9f687b867\") " pod="openstack/ovsdbserver-sb-1" Oct 14 13:34:54.592914 master-2 kubenswrapper[4762]: I1014 13:34:54.592896 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9888e427-2968-459c-a41b-2ff9f687b867-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"9888e427-2968-459c-a41b-2ff9f687b867\") " 
pod="openstack/ovsdbserver-sb-1" Oct 14 13:34:54.592965 master-2 kubenswrapper[4762]: I1014 13:34:54.592951 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9888e427-2968-459c-a41b-2ff9f687b867-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"9888e427-2968-459c-a41b-2ff9f687b867\") " pod="openstack/ovsdbserver-sb-1" Oct 14 13:34:54.694792 master-2 kubenswrapper[4762]: I1014 13:34:54.694708 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9888e427-2968-459c-a41b-2ff9f687b867-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"9888e427-2968-459c-a41b-2ff9f687b867\") " pod="openstack/ovsdbserver-sb-1" Oct 14 13:34:54.694792 master-2 kubenswrapper[4762]: I1014 13:34:54.694791 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9888e427-2968-459c-a41b-2ff9f687b867-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"9888e427-2968-459c-a41b-2ff9f687b867\") " pod="openstack/ovsdbserver-sb-1" Oct 14 13:34:54.695138 master-2 kubenswrapper[4762]: I1014 13:34:54.694860 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9888e427-2968-459c-a41b-2ff9f687b867-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"9888e427-2968-459c-a41b-2ff9f687b867\") " pod="openstack/ovsdbserver-sb-1" Oct 14 13:34:54.695138 master-2 kubenswrapper[4762]: I1014 13:34:54.694891 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9888e427-2968-459c-a41b-2ff9f687b867-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"9888e427-2968-459c-a41b-2ff9f687b867\") " pod="openstack/ovsdbserver-sb-1" Oct 14 13:34:54.695138 master-2 kubenswrapper[4762]: I1014 13:34:54.694946 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9888e427-2968-459c-a41b-2ff9f687b867-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"9888e427-2968-459c-a41b-2ff9f687b867\") " pod="openstack/ovsdbserver-sb-1" Oct 14 13:34:54.695138 master-2 kubenswrapper[4762]: I1014 13:34:54.695008 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7jvf\" (UniqueName: \"kubernetes.io/projected/9888e427-2968-459c-a41b-2ff9f687b867-kube-api-access-q7jvf\") pod \"ovsdbserver-sb-1\" (UID: \"9888e427-2968-459c-a41b-2ff9f687b867\") " pod="openstack/ovsdbserver-sb-1" Oct 14 13:34:54.695138 master-2 kubenswrapper[4762]: I1014 13:34:54.695037 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9888e427-2968-459c-a41b-2ff9f687b867-config\") pod \"ovsdbserver-sb-1\" (UID: \"9888e427-2968-459c-a41b-2ff9f687b867\") " pod="openstack/ovsdbserver-sb-1" Oct 14 13:34:54.695848 master-2 kubenswrapper[4762]: I1014 13:34:54.695792 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9888e427-2968-459c-a41b-2ff9f687b867-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"9888e427-2968-459c-a41b-2ff9f687b867\") " pod="openstack/ovsdbserver-sb-1" Oct 14 13:34:54.696742 master-2 kubenswrapper[4762]: I1014 13:34:54.696659 4762 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9888e427-2968-459c-a41b-2ff9f687b867-config\") pod \"ovsdbserver-sb-1\" (UID: \"9888e427-2968-459c-a41b-2ff9f687b867\") " pod="openstack/ovsdbserver-sb-1" Oct 14 13:34:54.696742 master-2 kubenswrapper[4762]: I1014 13:34:54.696748 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9888e427-2968-459c-a41b-2ff9f687b867-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"9888e427-2968-459c-a41b-2ff9f687b867\") " pod="openstack/ovsdbserver-sb-1" Oct 14 13:34:54.699980 master-2 kubenswrapper[4762]: I1014 13:34:54.699006 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9888e427-2968-459c-a41b-2ff9f687b867-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"9888e427-2968-459c-a41b-2ff9f687b867\") " pod="openstack/ovsdbserver-sb-1" Oct 14 13:34:54.699980 master-2 kubenswrapper[4762]: I1014 13:34:54.699748 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9888e427-2968-459c-a41b-2ff9f687b867-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"9888e427-2968-459c-a41b-2ff9f687b867\") " pod="openstack/ovsdbserver-sb-1" Oct 14 13:34:54.704439 master-2 kubenswrapper[4762]: I1014 13:34:54.704378 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9888e427-2968-459c-a41b-2ff9f687b867-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"9888e427-2968-459c-a41b-2ff9f687b867\") " pod="openstack/ovsdbserver-sb-1" Oct 14 13:34:54.744647 master-2 kubenswrapper[4762]: I1014 13:34:54.743570 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7jvf\" (UniqueName: \"kubernetes.io/projected/9888e427-2968-459c-a41b-2ff9f687b867-kube-api-access-q7jvf\") pod \"ovsdbserver-sb-1\" (UID: \"9888e427-2968-459c-a41b-2ff9f687b867\") " pod="openstack/ovsdbserver-sb-1" Oct 14 13:34:54.796571 master-2 kubenswrapper[4762]: I1014 13:34:54.796060 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ece6a90b-36ff-476a-a72d-5bcbaee948eb\" (UniqueName: \"kubernetes.io/csi/topolvm.io^8f8a3e72-08a9-40d6-abef-8deb36fa4805\") pod \"ovsdbserver-sb-1\" (UID: \"9888e427-2968-459c-a41b-2ff9f687b867\") " pod="openstack/ovsdbserver-sb-1" Oct 14 13:34:54.798474 master-2 kubenswrapper[4762]: I1014 13:34:54.798421 4762 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 14 13:34:54.798552 master-2 kubenswrapper[4762]: I1014 13:34:54.798486 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ece6a90b-36ff-476a-a72d-5bcbaee948eb\" (UniqueName: \"kubernetes.io/csi/topolvm.io^8f8a3e72-08a9-40d6-abef-8deb36fa4805\") pod \"ovsdbserver-sb-1\" (UID: \"9888e427-2968-459c-a41b-2ff9f687b867\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/1ca0343e3d1faa392683466fd0be7bed955b5b75f9867e34525809f8eb8ed871/globalmount\"" pod="openstack/ovsdbserver-sb-1" Oct 14 13:34:55.141391 master-2 kubenswrapper[4762]: I1014 13:34:55.141180 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-2" podStartSLOduration=38.886844015 podStartE2EDuration="41.141149641s" podCreationTimestamp="2025-10-14 13:34:14 +0000 UTC" firstStartedPulling="2025-10-14 13:34:43.656237973 +0000 UTC m=+1712.900397132" lastFinishedPulling="2025-10-14 13:34:45.910543599 +0000 UTC m=+1715.154702758" observedRunningTime="2025-10-14 13:34:55.138046822 +0000 UTC m=+1724.382205991" watchObservedRunningTime="2025-10-14 13:34:55.141149641 +0000 UTC m=+1724.385308800" Oct 14 13:34:55.490677 master-2 kubenswrapper[4762]: I1014 13:34:55.490563 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-1" podStartSLOduration=38.087279113 podStartE2EDuration="42.490541088s" podCreationTimestamp="2025-10-14 13:34:13 +0000 UTC" firstStartedPulling="2025-10-14 13:34:41.505407895 +0000 UTC m=+1710.749567054" lastFinishedPulling="2025-10-14 13:34:45.90866983 +0000 UTC m=+1715.152829029" observedRunningTime="2025-10-14 13:34:55.487582524 +0000 UTC m=+1724.731741763" watchObservedRunningTime="2025-10-14 13:34:55.490541088 +0000 UTC m=+1724.734700257" Oct 14 13:34:55.726818 master-2 kubenswrapper[4762]: I1014 13:34:55.726757 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-2" event={"ID":"fd5d49fa-daf0-4081-bff5-b449c199a5e2","Type":"ContainerStarted","Data":"2aef57910f39619c7982019511fce106ef7cfc8ac5d4e0b1aedce0f34f72b658"} Oct 14 13:34:55.727358 master-2 kubenswrapper[4762]: I1014 13:34:55.726999 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-2" Oct 14 13:34:55.819566 master-2 kubenswrapper[4762]: I1014 13:34:55.819374 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-2" podStartSLOduration=3.249573132 podStartE2EDuration="8.819350412s" podCreationTimestamp="2025-10-14 13:34:47 +0000 UTC" firstStartedPulling="2025-10-14 13:34:49.286151104 +0000 UTC m=+1718.530310263" lastFinishedPulling="2025-10-14 13:34:54.855928384 +0000 UTC m=+1724.100087543" observedRunningTime="2025-10-14 13:34:55.819215828 +0000 UTC m=+1725.063375007" watchObservedRunningTime="2025-10-14 13:34:55.819350412 +0000 UTC m=+1725.063509581" Oct 14 13:34:56.083116 master-2 kubenswrapper[4762]: I1014 13:34:56.082965 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Oct 14 13:34:56.096930 master-2 kubenswrapper[4762]: W1014 13:34:56.096852 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1c38eed_da70_41be_a3af_0115798752a4.slice/crio-80f1bb550c55f20204767a8012c84cc6e6c5a5bdaf096a2d0de3464a0fff8af9 WatchSource:0}: Error finding container 80f1bb550c55f20204767a8012c84cc6e6c5a5bdaf096a2d0de3464a0fff8af9: Status 404 returned error can't find the container 
with id 80f1bb550c55f20204767a8012c84cc6e6c5a5bdaf096a2d0de3464a0fff8af9 Oct 14 13:34:56.328816 master-2 kubenswrapper[4762]: I1014 13:34:56.328762 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d999b9745-tmsff"] Oct 14 13:34:56.329968 master-2 kubenswrapper[4762]: I1014 13:34:56.329942 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d999b9745-tmsff" Oct 14 13:34:56.332973 master-2 kubenswrapper[4762]: I1014 13:34:56.332933 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 14 13:34:56.333066 master-2 kubenswrapper[4762]: I1014 13:34:56.332987 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 14 13:34:56.333501 master-2 kubenswrapper[4762]: I1014 13:34:56.333439 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 14 13:34:56.361290 master-2 kubenswrapper[4762]: I1014 13:34:56.361238 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ece6a90b-36ff-476a-a72d-5bcbaee948eb\" (UniqueName: \"kubernetes.io/csi/topolvm.io^8f8a3e72-08a9-40d6-abef-8deb36fa4805\") pod \"ovsdbserver-sb-1\" (UID: \"9888e427-2968-459c-a41b-2ff9f687b867\") " pod="openstack/ovsdbserver-sb-1" Oct 14 13:34:56.397338 master-2 kubenswrapper[4762]: I1014 13:34:56.397259 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d999b9745-tmsff"] Oct 14 13:34:56.427108 master-2 kubenswrapper[4762]: I1014 13:34:56.426995 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e-config\") pod \"dnsmasq-dns-d999b9745-tmsff\" (UID: \"77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e\") " pod="openstack/dnsmasq-dns-d999b9745-tmsff" Oct 14 13:34:56.427400 master-2 kubenswrapper[4762]: I1014 13:34:56.427270 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwtnm\" (UniqueName: \"kubernetes.io/projected/77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e-kube-api-access-zwtnm\") pod \"dnsmasq-dns-d999b9745-tmsff\" (UID: \"77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e\") " pod="openstack/dnsmasq-dns-d999b9745-tmsff" Oct 14 13:34:56.427568 master-2 kubenswrapper[4762]: I1014 13:34:56.427525 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e-ovsdbserver-nb\") pod \"dnsmasq-dns-d999b9745-tmsff\" (UID: \"77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e\") " pod="openstack/dnsmasq-dns-d999b9745-tmsff" Oct 14 13:34:56.427649 master-2 kubenswrapper[4762]: I1014 13:34:56.427604 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e-dns-svc\") pod \"dnsmasq-dns-d999b9745-tmsff\" (UID: \"77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e\") " pod="openstack/dnsmasq-dns-d999b9745-tmsff" Oct 14 13:34:56.529504 master-2 kubenswrapper[4762]: I1014 13:34:56.529422 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e-dns-svc\") pod \"dnsmasq-dns-d999b9745-tmsff\" (UID: \"77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e\") " pod="openstack/dnsmasq-dns-d999b9745-tmsff" Oct 14 
13:34:56.529783 master-2 kubenswrapper[4762]: I1014 13:34:56.529526 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e-config\") pod \"dnsmasq-dns-d999b9745-tmsff\" (UID: \"77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e\") " pod="openstack/dnsmasq-dns-d999b9745-tmsff" Oct 14 13:34:56.529783 master-2 kubenswrapper[4762]: I1014 13:34:56.529638 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwtnm\" (UniqueName: \"kubernetes.io/projected/77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e-kube-api-access-zwtnm\") pod \"dnsmasq-dns-d999b9745-tmsff\" (UID: \"77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e\") " pod="openstack/dnsmasq-dns-d999b9745-tmsff" Oct 14 13:34:56.529783 master-2 kubenswrapper[4762]: I1014 13:34:56.529767 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e-ovsdbserver-nb\") pod \"dnsmasq-dns-d999b9745-tmsff\" (UID: \"77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e\") " pod="openstack/dnsmasq-dns-d999b9745-tmsff" Oct 14 13:34:56.530637 master-2 kubenswrapper[4762]: I1014 13:34:56.530577 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e-dns-svc\") pod \"dnsmasq-dns-d999b9745-tmsff\" (UID: \"77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e\") " pod="openstack/dnsmasq-dns-d999b9745-tmsff" Oct 14 13:34:56.530807 master-2 kubenswrapper[4762]: I1014 13:34:56.530730 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e-config\") pod \"dnsmasq-dns-d999b9745-tmsff\" (UID: \"77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e\") " pod="openstack/dnsmasq-dns-d999b9745-tmsff" Oct 14 13:34:56.531084 master-2 kubenswrapper[4762]: I1014 13:34:56.531019 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e-ovsdbserver-nb\") pod \"dnsmasq-dns-d999b9745-tmsff\" (UID: \"77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e\") " pod="openstack/dnsmasq-dns-d999b9745-tmsff" Oct 14 13:34:56.558347 master-2 kubenswrapper[4762]: I1014 13:34:56.558286 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwtnm\" (UniqueName: \"kubernetes.io/projected/77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e-kube-api-access-zwtnm\") pod \"dnsmasq-dns-d999b9745-tmsff\" (UID: \"77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e\") " pod="openstack/dnsmasq-dns-d999b9745-tmsff" Oct 14 13:34:56.655694 master-2 kubenswrapper[4762]: I1014 13:34:56.655534 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Oct 14 13:34:56.665307 master-2 kubenswrapper[4762]: I1014 13:34:56.665250 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d999b9745-tmsff" Oct 14 13:34:56.738041 master-2 kubenswrapper[4762]: I1014 13:34:56.737969 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"a1c38eed-da70-41be-a3af-0115798752a4","Type":"ContainerStarted","Data":"80f1bb550c55f20204767a8012c84cc6e6c5a5bdaf096a2d0de3464a0fff8af9"} Oct 14 13:34:57.191832 master-2 kubenswrapper[4762]: I1014 13:34:57.191768 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d999b9745-tmsff"] Oct 14 13:34:57.233645 master-2 kubenswrapper[4762]: I1014 13:34:57.233571 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Oct 14 13:34:57.239271 master-2 kubenswrapper[4762]: W1014 13:34:57.239092 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9888e427_2968_459c_a41b_2ff9f687b867.slice/crio-723d42f161a2f1e48855bc0166ded8ed2dd14e6add864d1d8f13154c655b3cea WatchSource:0}: Error finding container 723d42f161a2f1e48855bc0166ded8ed2dd14e6add864d1d8f13154c655b3cea: Status 404 returned error can't find the container with id 723d42f161a2f1e48855bc0166ded8ed2dd14e6add864d1d8f13154c655b3cea Oct 14 13:34:57.749924 master-2 kubenswrapper[4762]: I1014 13:34:57.749642 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"9888e427-2968-459c-a41b-2ff9f687b867","Type":"ContainerStarted","Data":"723d42f161a2f1e48855bc0166ded8ed2dd14e6add864d1d8f13154c655b3cea"} Oct 14 13:34:57.751941 master-2 kubenswrapper[4762]: I1014 13:34:57.751898 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"a1c38eed-da70-41be-a3af-0115798752a4","Type":"ContainerStarted","Data":"1e5d0aa94df23fdd7c029c8ed597510f82c3397a7e29e877be65c36205ac839a"} Oct 14 13:34:57.752004 master-2 kubenswrapper[4762]: I1014 13:34:57.751956 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"a1c38eed-da70-41be-a3af-0115798752a4","Type":"ContainerStarted","Data":"e836d2ff9a7915dfaa27858a76a4fb7ca1f7107b259aeef5b0bf2a34cf4cd465"} Oct 14 13:34:57.753485 master-2 kubenswrapper[4762]: I1014 13:34:57.753445 4762 generic.go:334] "Generic (PLEG): container finished" podID="77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e" containerID="678a14d98662f7557ef259d3f2c9667a1ce0dfa2b68c4e5d7e64b14545c9e753" exitCode=0 Oct 14 13:34:57.753553 master-2 kubenswrapper[4762]: I1014 13:34:57.753494 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d999b9745-tmsff" event={"ID":"77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e","Type":"ContainerDied","Data":"678a14d98662f7557ef259d3f2c9667a1ce0dfa2b68c4e5d7e64b14545c9e753"} Oct 14 13:34:57.753553 master-2 kubenswrapper[4762]: I1014 13:34:57.753522 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d999b9745-tmsff" event={"ID":"77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e","Type":"ContainerStarted","Data":"bc72419c0f79c3ab88e4eedc933458a5a9e2de8d8b1e5290a5c6c3d2b228c2f0"} Oct 14 13:34:57.784412 master-2 kubenswrapper[4762]: I1014 13:34:57.784339 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=27.695214495 podStartE2EDuration="28.784322548s" podCreationTimestamp="2025-10-14 13:34:29 +0000 UTC" firstStartedPulling="2025-10-14 13:34:56.100720617 +0000 UTC m=+1725.344879776" lastFinishedPulling="2025-10-14 
13:34:57.18982867 +0000 UTC m=+1726.433987829" observedRunningTime="2025-10-14 13:34:57.781832719 +0000 UTC m=+1727.025991898" watchObservedRunningTime="2025-10-14 13:34:57.784322548 +0000 UTC m=+1727.028481707" Oct 14 13:34:58.763825 master-2 kubenswrapper[4762]: I1014 13:34:58.763747 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"9888e427-2968-459c-a41b-2ff9f687b867","Type":"ContainerStarted","Data":"f0174dbd5749880f3b77f89de93fd25c19126e12e814043c37f55bfd214cd8ef"} Oct 14 13:34:58.763825 master-2 kubenswrapper[4762]: I1014 13:34:58.763807 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"9888e427-2968-459c-a41b-2ff9f687b867","Type":"ContainerStarted","Data":"596ecd728f038748bf71cfc2d4058beefa387ce9ac1811dacc548e56de82955c"} Oct 14 13:34:58.765912 master-2 kubenswrapper[4762]: I1014 13:34:58.765852 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d999b9745-tmsff" event={"ID":"77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e","Type":"ContainerStarted","Data":"63c3e09ac5c44d7c161f4daaa5b9e5e0b78187ff235d9db7eb96a727f56ff32e"} Oct 14 13:34:58.766282 master-2 kubenswrapper[4762]: I1014 13:34:58.766229 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-d999b9745-tmsff" Oct 14 13:34:58.799871 master-2 kubenswrapper[4762]: I1014 13:34:58.799776 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=27.845535435 podStartE2EDuration="28.799753309s" podCreationTimestamp="2025-10-14 13:34:30 +0000 UTC" firstStartedPulling="2025-10-14 13:34:57.249654452 +0000 UTC m=+1726.493813611" lastFinishedPulling="2025-10-14 13:34:58.203872336 +0000 UTC m=+1727.448031485" observedRunningTime="2025-10-14 13:34:58.796464505 +0000 UTC m=+1728.040623674" watchObservedRunningTime="2025-10-14 13:34:58.799753309 +0000 UTC m=+1728.043912478" Oct 14 13:34:58.831166 master-2 kubenswrapper[4762]: I1014 13:34:58.831049 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-d999b9745-tmsff" podStartSLOduration=2.831017593 podStartE2EDuration="2.831017593s" podCreationTimestamp="2025-10-14 13:34:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:34:58.824158196 +0000 UTC m=+1728.068317365" watchObservedRunningTime="2025-10-14 13:34:58.831017593 +0000 UTC m=+1728.075176762" Oct 14 13:34:59.655785 master-2 kubenswrapper[4762]: I1014 13:34:59.655671 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-1" Oct 14 13:35:00.577091 master-2 kubenswrapper[4762]: I1014 13:35:00.576999 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1" Oct 14 13:35:00.615006 master-2 kubenswrapper[4762]: I1014 13:35:00.614908 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1" Oct 14 13:35:00.781362 master-2 kubenswrapper[4762]: I1014 13:35:00.781280 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1" Oct 14 13:35:01.084788 master-2 kubenswrapper[4762]: I1014 13:35:01.084700 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-1" Oct 14 13:35:01.084788 master-2 kubenswrapper[4762]: I1014 13:35:01.084763 4762 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-1" Oct 14 13:35:01.656019 master-2 kubenswrapper[4762]: I1014 13:35:01.655933 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-1" Oct 14 13:35:02.692959 master-2 kubenswrapper[4762]: I1014 13:35:02.692873 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-1" Oct 14 13:35:03.608223 master-2 kubenswrapper[4762]: I1014 13:35:03.608129 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-w9zcz"] Oct 14 13:35:03.612562 master-2 kubenswrapper[4762]: I1014 13:35:03.612487 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w9zcz" Oct 14 13:35:03.636604 master-2 kubenswrapper[4762]: I1014 13:35:03.636552 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w9zcz"] Oct 14 13:35:03.749193 master-2 kubenswrapper[4762]: I1014 13:35:03.745191 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvfm8\" (UniqueName: \"kubernetes.io/projected/a7bdf8a6-ca85-4786-8051-4cf476517541-kube-api-access-hvfm8\") pod \"certified-operators-w9zcz\" (UID: \"a7bdf8a6-ca85-4786-8051-4cf476517541\") " pod="openshift-marketplace/certified-operators-w9zcz" Oct 14 13:35:03.749193 master-2 kubenswrapper[4762]: I1014 13:35:03.745301 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7bdf8a6-ca85-4786-8051-4cf476517541-catalog-content\") pod \"certified-operators-w9zcz\" (UID: \"a7bdf8a6-ca85-4786-8051-4cf476517541\") " pod="openshift-marketplace/certified-operators-w9zcz" Oct 14 13:35:03.749193 master-2 kubenswrapper[4762]: I1014 13:35:03.745358 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7bdf8a6-ca85-4786-8051-4cf476517541-utilities\") pod \"certified-operators-w9zcz\" (UID: \"a7bdf8a6-ca85-4786-8051-4cf476517541\") " pod="openshift-marketplace/certified-operators-w9zcz" Oct 14 13:35:03.801183 master-2 kubenswrapper[4762]: I1014 13:35:03.800328 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-2" Oct 14 13:35:03.863250 master-2 kubenswrapper[4762]: I1014 13:35:03.863102 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvfm8\" (UniqueName: \"kubernetes.io/projected/a7bdf8a6-ca85-4786-8051-4cf476517541-kube-api-access-hvfm8\") pod \"certified-operators-w9zcz\" (UID: \"a7bdf8a6-ca85-4786-8051-4cf476517541\") " pod="openshift-marketplace/certified-operators-w9zcz" Oct 14 13:35:03.863448 master-2 kubenswrapper[4762]: I1014 13:35:03.863324 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7bdf8a6-ca85-4786-8051-4cf476517541-catalog-content\") pod \"certified-operators-w9zcz\" (UID: \"a7bdf8a6-ca85-4786-8051-4cf476517541\") " pod="openshift-marketplace/certified-operators-w9zcz" Oct 14 13:35:03.867180 master-2 kubenswrapper[4762]: I1014 13:35:03.863481 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/a7bdf8a6-ca85-4786-8051-4cf476517541-utilities\") pod \"certified-operators-w9zcz\" (UID: \"a7bdf8a6-ca85-4786-8051-4cf476517541\") " pod="openshift-marketplace/certified-operators-w9zcz" Oct 14 13:35:03.867180 master-2 kubenswrapper[4762]: I1014 13:35:03.864284 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7bdf8a6-ca85-4786-8051-4cf476517541-utilities\") pod \"certified-operators-w9zcz\" (UID: \"a7bdf8a6-ca85-4786-8051-4cf476517541\") " pod="openshift-marketplace/certified-operators-w9zcz" Oct 14 13:35:03.882189 master-2 kubenswrapper[4762]: I1014 13:35:03.873022 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7bdf8a6-ca85-4786-8051-4cf476517541-catalog-content\") pod \"certified-operators-w9zcz\" (UID: \"a7bdf8a6-ca85-4786-8051-4cf476517541\") " pod="openshift-marketplace/certified-operators-w9zcz" Oct 14 13:35:03.905466 master-2 kubenswrapper[4762]: I1014 13:35:03.905396 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvfm8\" (UniqueName: \"kubernetes.io/projected/a7bdf8a6-ca85-4786-8051-4cf476517541-kube-api-access-hvfm8\") pod \"certified-operators-w9zcz\" (UID: \"a7bdf8a6-ca85-4786-8051-4cf476517541\") " pod="openshift-marketplace/certified-operators-w9zcz" Oct 14 13:35:03.934641 master-2 kubenswrapper[4762]: I1014 13:35:03.934567 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w9zcz" Oct 14 13:35:04.399000 master-2 kubenswrapper[4762]: I1014 13:35:04.398955 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w9zcz"] Oct 14 13:35:04.403081 master-2 kubenswrapper[4762]: W1014 13:35:04.403021 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7bdf8a6_ca85_4786_8051_4cf476517541.slice/crio-88943ef2a5e3a79fe9307d4c5eb8b3f6670721e222ac5825caa81d29cbc7a2df WatchSource:0}: Error finding container 88943ef2a5e3a79fe9307d4c5eb8b3f6670721e222ac5825caa81d29cbc7a2df: Status 404 returned error can't find the container with id 88943ef2a5e3a79fe9307d4c5eb8b3f6670721e222ac5825caa81d29cbc7a2df Oct 14 13:35:04.850332 master-2 kubenswrapper[4762]: I1014 13:35:04.850200 4762 generic.go:334] "Generic (PLEG): container finished" podID="a7bdf8a6-ca85-4786-8051-4cf476517541" containerID="3b1781b40970c900bcc089ecf8eb3b4ae880ed666258ea0d43d0eb5e7c4f06d4" exitCode=0 Oct 14 13:35:04.850332 master-2 kubenswrapper[4762]: I1014 13:35:04.850251 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w9zcz" event={"ID":"a7bdf8a6-ca85-4786-8051-4cf476517541","Type":"ContainerDied","Data":"3b1781b40970c900bcc089ecf8eb3b4ae880ed666258ea0d43d0eb5e7c4f06d4"} Oct 14 13:35:04.850332 master-2 kubenswrapper[4762]: I1014 13:35:04.850281 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w9zcz" event={"ID":"a7bdf8a6-ca85-4786-8051-4cf476517541","Type":"ContainerStarted","Data":"88943ef2a5e3a79fe9307d4c5eb8b3f6670721e222ac5825caa81d29cbc7a2df"} Oct 14 13:35:05.339952 master-2 kubenswrapper[4762]: I1014 13:35:05.339894 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-2" Oct 14 13:35:05.386941 master-2 kubenswrapper[4762]: I1014 
13:35:05.386894 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-2" Oct 14 13:35:06.635626 master-2 kubenswrapper[4762]: I1014 13:35:06.635576 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1" Oct 14 13:35:06.667457 master-2 kubenswrapper[4762]: I1014 13:35:06.667402 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-d999b9745-tmsff" Oct 14 13:35:06.694583 master-2 kubenswrapper[4762]: I1014 13:35:06.694531 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-1" Oct 14 13:35:07.876722 master-2 kubenswrapper[4762]: I1014 13:35:07.875704 4762 generic.go:334] "Generic (PLEG): container finished" podID="a7bdf8a6-ca85-4786-8051-4cf476517541" containerID="cf85887a41499b64366be09c25b785e31e547da4ce68c7ed905b0f198fab102a" exitCode=0 Oct 14 13:35:07.876722 master-2 kubenswrapper[4762]: I1014 13:35:07.875768 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w9zcz" event={"ID":"a7bdf8a6-ca85-4786-8051-4cf476517541","Type":"ContainerDied","Data":"cf85887a41499b64366be09c25b785e31e547da4ce68c7ed905b0f198fab102a"} Oct 14 13:35:08.890300 master-2 kubenswrapper[4762]: I1014 13:35:08.887958 4762 generic.go:334] "Generic (PLEG): container finished" podID="65011ee0-036e-4a85-9ca7-182d37e3345c" containerID="4a75a2a2b6b819dcff060b183202fa9cf17cb3a010b483415f435e581c9870a1" exitCode=0 Oct 14 13:35:08.890300 master-2 kubenswrapper[4762]: I1014 13:35:08.888026 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"65011ee0-036e-4a85-9ca7-182d37e3345c","Type":"ContainerDied","Data":"4a75a2a2b6b819dcff060b183202fa9cf17cb3a010b483415f435e581c9870a1"} Oct 14 13:35:08.892192 master-2 kubenswrapper[4762]: I1014 13:35:08.891781 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w9zcz" event={"ID":"a7bdf8a6-ca85-4786-8051-4cf476517541","Type":"ContainerStarted","Data":"8ab3d19c0c4b93cf394adb413e51a56b59f09f9af056cbc862075305daf296e5"} Oct 14 13:35:08.980265 master-2 kubenswrapper[4762]: I1014 13:35:08.980176 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-w9zcz" podStartSLOduration=2.550665849 podStartE2EDuration="5.980131105s" podCreationTimestamp="2025-10-14 13:35:03 +0000 UTC" firstStartedPulling="2025-10-14 13:35:04.851919265 +0000 UTC m=+1734.096078424" lastFinishedPulling="2025-10-14 13:35:08.281384521 +0000 UTC m=+1737.525543680" observedRunningTime="2025-10-14 13:35:08.971722578 +0000 UTC m=+1738.215881757" watchObservedRunningTime="2025-10-14 13:35:08.980131105 +0000 UTC m=+1738.224290264" Oct 14 13:35:09.901886 master-2 kubenswrapper[4762]: I1014 13:35:09.901547 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"65011ee0-036e-4a85-9ca7-182d37e3345c","Type":"ContainerStarted","Data":"cce7e24591653bc6b6add61c0734bf9b2794093eff2f1e0d660b8bead320338d"} Oct 14 13:35:09.902546 master-2 kubenswrapper[4762]: I1014 13:35:09.901953 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-1" Oct 14 13:35:09.903975 master-2 kubenswrapper[4762]: I1014 13:35:09.903937 4762 generic.go:334] "Generic (PLEG): container finished" podID="94b8bfba-acf4-46d4-ad15-183dafcb7bd0" 
containerID="69d787333de2bb70ad9086fc935dad769b039a6fd2cc173356499236881b2cbe" exitCode=0 Oct 14 13:35:09.904096 master-2 kubenswrapper[4762]: I1014 13:35:09.903984 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-1" event={"ID":"94b8bfba-acf4-46d4-ad15-183dafcb7bd0","Type":"ContainerDied","Data":"69d787333de2bb70ad9086fc935dad769b039a6fd2cc173356499236881b2cbe"} Oct 14 13:35:09.959862 master-2 kubenswrapper[4762]: I1014 13:35:09.959734 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-1" podStartSLOduration=52.79624786 podStartE2EDuration="58.959715107s" podCreationTimestamp="2025-10-14 13:34:11 +0000 UTC" firstStartedPulling="2025-10-14 13:34:28.933371342 +0000 UTC m=+1698.177530501" lastFinishedPulling="2025-10-14 13:34:35.096838569 +0000 UTC m=+1704.340997748" observedRunningTime="2025-10-14 13:35:09.956150693 +0000 UTC m=+1739.200309852" watchObservedRunningTime="2025-10-14 13:35:09.959715107 +0000 UTC m=+1739.203874266" Oct 14 13:35:10.454463 master-2 kubenswrapper[4762]: I1014 13:35:10.454388 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-6w2cz" podUID="4ab0fa0c-8873-41ab-b534-7c4c71350245" containerName="ovn-controller" probeResult="failure" output=< Oct 14 13:35:10.454463 master-2 kubenswrapper[4762]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 14 13:35:10.454463 master-2 kubenswrapper[4762]: > Oct 14 13:35:10.510490 master-2 kubenswrapper[4762]: I1014 13:35:10.510349 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-rxkr2" Oct 14 13:35:10.915950 master-2 kubenswrapper[4762]: I1014 13:35:10.915873 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-1" event={"ID":"94b8bfba-acf4-46d4-ad15-183dafcb7bd0","Type":"ContainerStarted","Data":"1b41687b340043b382a48b2469b01653424d2c64dc5d48404d9428a8dab0c423"} Oct 14 13:35:10.916974 master-2 kubenswrapper[4762]: I1014 13:35:10.916921 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-1" Oct 14 13:35:11.043125 master-2 kubenswrapper[4762]: I1014 13:35:11.043049 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-1" podStartSLOduration=60.043030737 podStartE2EDuration="1m0.043030737s" podCreationTimestamp="2025-10-14 13:34:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:35:11.03557614 +0000 UTC m=+1740.279735299" watchObservedRunningTime="2025-10-14 13:35:11.043030737 +0000 UTC m=+1740.287189896" Oct 14 13:35:12.479701 master-2 kubenswrapper[4762]: I1014 13:35:12.479657 4762 trace.go:236] Trace[181136085]: "Calculate volume metrics of ovndbcluster-sb-etc-ovn for pod openstack/ovsdbserver-sb-1" (14-Oct-2025 13:35:11.411) (total time: 1068ms): Oct 14 13:35:12.479701 master-2 kubenswrapper[4762]: Trace[181136085]: [1.068146558s] [1.068146558s] END Oct 14 13:35:13.275948 master-2 kubenswrapper[4762]: I1014 13:35:13.275859 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-2" podUID="d97acc34-a72a-49c2-83b5-379d3946c591" containerName="galera" probeResult="failure" output=< Oct 14 13:35:13.275948 master-2 kubenswrapper[4762]: wsrep_local_state_comment (Donor/Desynced) differs from Synced Oct 14 13:35:13.275948 
master-2 kubenswrapper[4762]: > Oct 14 13:35:13.935048 master-2 kubenswrapper[4762]: I1014 13:35:13.934961 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-w9zcz" Oct 14 13:35:13.935700 master-2 kubenswrapper[4762]: I1014 13:35:13.935090 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-w9zcz" Oct 14 13:35:13.984084 master-2 kubenswrapper[4762]: I1014 13:35:13.984004 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-w9zcz" Oct 14 13:35:15.011397 master-2 kubenswrapper[4762]: I1014 13:35:15.011332 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-w9zcz" Oct 14 13:35:15.449139 master-2 kubenswrapper[4762]: I1014 13:35:15.449059 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-6w2cz" podUID="4ab0fa0c-8873-41ab-b534-7c4c71350245" containerName="ovn-controller" probeResult="failure" output=< Oct 14 13:35:15.449139 master-2 kubenswrapper[4762]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 14 13:35:15.449139 master-2 kubenswrapper[4762]: > Oct 14 13:35:15.526315 master-2 kubenswrapper[4762]: I1014 13:35:15.526204 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-rxkr2" Oct 14 13:35:16.196599 master-2 kubenswrapper[4762]: I1014 13:35:16.196533 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w9zcz"] Oct 14 13:35:17.975731 master-2 kubenswrapper[4762]: I1014 13:35:17.974785 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-w9zcz" podUID="a7bdf8a6-ca85-4786-8051-4cf476517541" containerName="registry-server" containerID="cri-o://8ab3d19c0c4b93cf394adb413e51a56b59f09f9af056cbc862075305daf296e5" gracePeriod=2 Oct 14 13:35:18.497771 master-2 kubenswrapper[4762]: I1014 13:35:18.497722 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w9zcz" Oct 14 13:35:18.565394 master-2 kubenswrapper[4762]: I1014 13:35:18.565305 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7bdf8a6-ca85-4786-8051-4cf476517541-utilities\") pod \"a7bdf8a6-ca85-4786-8051-4cf476517541\" (UID: \"a7bdf8a6-ca85-4786-8051-4cf476517541\") " Oct 14 13:35:18.565677 master-2 kubenswrapper[4762]: I1014 13:35:18.565461 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7bdf8a6-ca85-4786-8051-4cf476517541-catalog-content\") pod \"a7bdf8a6-ca85-4786-8051-4cf476517541\" (UID: \"a7bdf8a6-ca85-4786-8051-4cf476517541\") " Oct 14 13:35:18.565677 master-2 kubenswrapper[4762]: I1014 13:35:18.565494 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvfm8\" (UniqueName: \"kubernetes.io/projected/a7bdf8a6-ca85-4786-8051-4cf476517541-kube-api-access-hvfm8\") pod \"a7bdf8a6-ca85-4786-8051-4cf476517541\" (UID: \"a7bdf8a6-ca85-4786-8051-4cf476517541\") " Oct 14 13:35:18.566450 master-2 kubenswrapper[4762]: I1014 13:35:18.566387 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7bdf8a6-ca85-4786-8051-4cf476517541-utilities" (OuterVolumeSpecName: "utilities") pod "a7bdf8a6-ca85-4786-8051-4cf476517541" (UID: "a7bdf8a6-ca85-4786-8051-4cf476517541"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:35:18.569211 master-2 kubenswrapper[4762]: I1014 13:35:18.569170 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7bdf8a6-ca85-4786-8051-4cf476517541-kube-api-access-hvfm8" (OuterVolumeSpecName: "kube-api-access-hvfm8") pod "a7bdf8a6-ca85-4786-8051-4cf476517541" (UID: "a7bdf8a6-ca85-4786-8051-4cf476517541"). InnerVolumeSpecName "kube-api-access-hvfm8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:35:18.610605 master-2 kubenswrapper[4762]: I1014 13:35:18.610532 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7bdf8a6-ca85-4786-8051-4cf476517541-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a7bdf8a6-ca85-4786-8051-4cf476517541" (UID: "a7bdf8a6-ca85-4786-8051-4cf476517541"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:35:18.670174 master-2 kubenswrapper[4762]: I1014 13:35:18.668249 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7bdf8a6-ca85-4786-8051-4cf476517541-catalog-content\") on node \"master-2\" DevicePath \"\"" Oct 14 13:35:18.670174 master-2 kubenswrapper[4762]: I1014 13:35:18.668301 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvfm8\" (UniqueName: \"kubernetes.io/projected/a7bdf8a6-ca85-4786-8051-4cf476517541-kube-api-access-hvfm8\") on node \"master-2\" DevicePath \"\"" Oct 14 13:35:18.670174 master-2 kubenswrapper[4762]: I1014 13:35:18.668319 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7bdf8a6-ca85-4786-8051-4cf476517541-utilities\") on node \"master-2\" DevicePath \"\"" Oct 14 13:35:18.980469 master-2 kubenswrapper[4762]: I1014 13:35:18.980302 4762 generic.go:334] "Generic (PLEG): container finished" podID="a7bdf8a6-ca85-4786-8051-4cf476517541" containerID="8ab3d19c0c4b93cf394adb413e51a56b59f09f9af056cbc862075305daf296e5" exitCode=0 Oct 14 13:35:18.980469 master-2 kubenswrapper[4762]: I1014 13:35:18.980356 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w9zcz" event={"ID":"a7bdf8a6-ca85-4786-8051-4cf476517541","Type":"ContainerDied","Data":"8ab3d19c0c4b93cf394adb413e51a56b59f09f9af056cbc862075305daf296e5"} Oct 14 13:35:18.980469 master-2 kubenswrapper[4762]: I1014 13:35:18.980388 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w9zcz" event={"ID":"a7bdf8a6-ca85-4786-8051-4cf476517541","Type":"ContainerDied","Data":"88943ef2a5e3a79fe9307d4c5eb8b3f6670721e222ac5825caa81d29cbc7a2df"} Oct 14 13:35:18.980469 master-2 kubenswrapper[4762]: I1014 13:35:18.980392 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w9zcz" Oct 14 13:35:18.980469 master-2 kubenswrapper[4762]: I1014 13:35:18.980407 4762 scope.go:117] "RemoveContainer" containerID="8ab3d19c0c4b93cf394adb413e51a56b59f09f9af056cbc862075305daf296e5" Oct 14 13:35:18.997402 master-2 kubenswrapper[4762]: I1014 13:35:18.997300 4762 scope.go:117] "RemoveContainer" containerID="cf85887a41499b64366be09c25b785e31e547da4ce68c7ed905b0f198fab102a" Oct 14 13:35:19.013904 master-2 kubenswrapper[4762]: I1014 13:35:19.013855 4762 scope.go:117] "RemoveContainer" containerID="3b1781b40970c900bcc089ecf8eb3b4ae880ed666258ea0d43d0eb5e7c4f06d4" Oct 14 13:35:19.039871 master-2 kubenswrapper[4762]: I1014 13:35:19.039820 4762 scope.go:117] "RemoveContainer" containerID="8ab3d19c0c4b93cf394adb413e51a56b59f09f9af056cbc862075305daf296e5" Oct 14 13:35:19.040451 master-2 kubenswrapper[4762]: E1014 13:35:19.040404 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ab3d19c0c4b93cf394adb413e51a56b59f09f9af056cbc862075305daf296e5\": container with ID starting with 8ab3d19c0c4b93cf394adb413e51a56b59f09f9af056cbc862075305daf296e5 not found: ID does not exist" containerID="8ab3d19c0c4b93cf394adb413e51a56b59f09f9af056cbc862075305daf296e5" Oct 14 13:35:19.040527 master-2 kubenswrapper[4762]: I1014 13:35:19.040446 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ab3d19c0c4b93cf394adb413e51a56b59f09f9af056cbc862075305daf296e5"} err="failed to get container status \"8ab3d19c0c4b93cf394adb413e51a56b59f09f9af056cbc862075305daf296e5\": rpc error: code = NotFound desc = could not find container \"8ab3d19c0c4b93cf394adb413e51a56b59f09f9af056cbc862075305daf296e5\": container with ID starting with 8ab3d19c0c4b93cf394adb413e51a56b59f09f9af056cbc862075305daf296e5 not found: ID does not exist" Oct 14 13:35:19.040527 master-2 kubenswrapper[4762]: I1014 13:35:19.040469 4762 scope.go:117] "RemoveContainer" containerID="cf85887a41499b64366be09c25b785e31e547da4ce68c7ed905b0f198fab102a" Oct 14 13:35:19.041043 master-2 kubenswrapper[4762]: E1014 13:35:19.040985 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf85887a41499b64366be09c25b785e31e547da4ce68c7ed905b0f198fab102a\": container with ID starting with cf85887a41499b64366be09c25b785e31e547da4ce68c7ed905b0f198fab102a not found: ID does not exist" containerID="cf85887a41499b64366be09c25b785e31e547da4ce68c7ed905b0f198fab102a" Oct 14 13:35:19.041094 master-2 kubenswrapper[4762]: I1014 13:35:19.041047 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf85887a41499b64366be09c25b785e31e547da4ce68c7ed905b0f198fab102a"} err="failed to get container status \"cf85887a41499b64366be09c25b785e31e547da4ce68c7ed905b0f198fab102a\": rpc error: code = NotFound desc = could not find container \"cf85887a41499b64366be09c25b785e31e547da4ce68c7ed905b0f198fab102a\": container with ID starting with cf85887a41499b64366be09c25b785e31e547da4ce68c7ed905b0f198fab102a not found: ID does not exist" Oct 14 13:35:19.041094 master-2 kubenswrapper[4762]: I1014 13:35:19.041075 4762 scope.go:117] "RemoveContainer" containerID="3b1781b40970c900bcc089ecf8eb3b4ae880ed666258ea0d43d0eb5e7c4f06d4" Oct 14 13:35:19.045385 master-2 kubenswrapper[4762]: E1014 13:35:19.044342 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"3b1781b40970c900bcc089ecf8eb3b4ae880ed666258ea0d43d0eb5e7c4f06d4\": container with ID starting with 3b1781b40970c900bcc089ecf8eb3b4ae880ed666258ea0d43d0eb5e7c4f06d4 not found: ID does not exist" containerID="3b1781b40970c900bcc089ecf8eb3b4ae880ed666258ea0d43d0eb5e7c4f06d4" Oct 14 13:35:19.045385 master-2 kubenswrapper[4762]: I1014 13:35:19.044393 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b1781b40970c900bcc089ecf8eb3b4ae880ed666258ea0d43d0eb5e7c4f06d4"} err="failed to get container status \"3b1781b40970c900bcc089ecf8eb3b4ae880ed666258ea0d43d0eb5e7c4f06d4\": rpc error: code = NotFound desc = could not find container \"3b1781b40970c900bcc089ecf8eb3b4ae880ed666258ea0d43d0eb5e7c4f06d4\": container with ID starting with 3b1781b40970c900bcc089ecf8eb3b4ae880ed666258ea0d43d0eb5e7c4f06d4 not found: ID does not exist" Oct 14 13:35:19.075181 master-2 kubenswrapper[4762]: I1014 13:35:19.074449 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w9zcz"] Oct 14 13:35:19.096177 master-2 kubenswrapper[4762]: I1014 13:35:19.095431 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-w9zcz"] Oct 14 13:35:19.556788 master-2 kubenswrapper[4762]: I1014 13:35:19.556646 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7bdf8a6-ca85-4786-8051-4cf476517541" path="/var/lib/kubelet/pods/a7bdf8a6-ca85-4786-8051-4cf476517541/volumes" Oct 14 13:35:19.911581 master-2 kubenswrapper[4762]: I1014 13:35:19.911504 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-68f9c67449-jgsnj"] Oct 14 13:35:19.911829 master-2 kubenswrapper[4762]: E1014 13:35:19.911755 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7bdf8a6-ca85-4786-8051-4cf476517541" containerName="extract-content" Oct 14 13:35:19.911829 master-2 kubenswrapper[4762]: I1014 13:35:19.911769 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7bdf8a6-ca85-4786-8051-4cf476517541" containerName="extract-content" Oct 14 13:35:19.911829 master-2 kubenswrapper[4762]: E1014 13:35:19.911783 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7bdf8a6-ca85-4786-8051-4cf476517541" containerName="extract-utilities" Oct 14 13:35:19.911829 master-2 kubenswrapper[4762]: I1014 13:35:19.911789 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7bdf8a6-ca85-4786-8051-4cf476517541" containerName="extract-utilities" Oct 14 13:35:19.911829 master-2 kubenswrapper[4762]: E1014 13:35:19.911809 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7bdf8a6-ca85-4786-8051-4cf476517541" containerName="registry-server" Oct 14 13:35:19.911829 master-2 kubenswrapper[4762]: I1014 13:35:19.911815 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7bdf8a6-ca85-4786-8051-4cf476517541" containerName="registry-server" Oct 14 13:35:19.912083 master-2 kubenswrapper[4762]: I1014 13:35:19.911935 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7bdf8a6-ca85-4786-8051-4cf476517541" containerName="registry-server" Oct 14 13:35:19.912724 master-2 kubenswrapper[4762]: I1014 13:35:19.912691 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68f9c67449-jgsnj" Oct 14 13:35:19.915060 master-2 kubenswrapper[4762]: I1014 13:35:19.915012 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 14 13:35:19.936279 master-2 kubenswrapper[4762]: I1014 13:35:19.933696 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68f9c67449-jgsnj"] Oct 14 13:35:19.995391 master-2 kubenswrapper[4762]: I1014 13:35:19.995306 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90052432-150d-4d93-b058-513adf55e099-dns-svc\") pod \"dnsmasq-dns-68f9c67449-jgsnj\" (UID: \"90052432-150d-4d93-b058-513adf55e099\") " pod="openstack/dnsmasq-dns-68f9c67449-jgsnj" Oct 14 13:35:19.995391 master-2 kubenswrapper[4762]: I1014 13:35:19.995389 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swzvd\" (UniqueName: \"kubernetes.io/projected/90052432-150d-4d93-b058-513adf55e099-kube-api-access-swzvd\") pod \"dnsmasq-dns-68f9c67449-jgsnj\" (UID: \"90052432-150d-4d93-b058-513adf55e099\") " pod="openstack/dnsmasq-dns-68f9c67449-jgsnj" Oct 14 13:35:19.995865 master-2 kubenswrapper[4762]: I1014 13:35:19.995463 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90052432-150d-4d93-b058-513adf55e099-config\") pod \"dnsmasq-dns-68f9c67449-jgsnj\" (UID: \"90052432-150d-4d93-b058-513adf55e099\") " pod="openstack/dnsmasq-dns-68f9c67449-jgsnj" Oct 14 13:35:19.995865 master-2 kubenswrapper[4762]: I1014 13:35:19.995517 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90052432-150d-4d93-b058-513adf55e099-ovsdbserver-sb\") pod \"dnsmasq-dns-68f9c67449-jgsnj\" (UID: \"90052432-150d-4d93-b058-513adf55e099\") " pod="openstack/dnsmasq-dns-68f9c67449-jgsnj" Oct 14 13:35:19.995865 master-2 kubenswrapper[4762]: I1014 13:35:19.995574 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90052432-150d-4d93-b058-513adf55e099-ovsdbserver-nb\") pod \"dnsmasq-dns-68f9c67449-jgsnj\" (UID: \"90052432-150d-4d93-b058-513adf55e099\") " pod="openstack/dnsmasq-dns-68f9c67449-jgsnj" Oct 14 13:35:20.097566 master-2 kubenswrapper[4762]: I1014 13:35:20.097322 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90052432-150d-4d93-b058-513adf55e099-ovsdbserver-nb\") pod \"dnsmasq-dns-68f9c67449-jgsnj\" (UID: \"90052432-150d-4d93-b058-513adf55e099\") " pod="openstack/dnsmasq-dns-68f9c67449-jgsnj" Oct 14 13:35:20.097566 master-2 kubenswrapper[4762]: I1014 13:35:20.097436 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90052432-150d-4d93-b058-513adf55e099-dns-svc\") pod \"dnsmasq-dns-68f9c67449-jgsnj\" (UID: \"90052432-150d-4d93-b058-513adf55e099\") " pod="openstack/dnsmasq-dns-68f9c67449-jgsnj" Oct 14 13:35:20.097566 master-2 kubenswrapper[4762]: I1014 13:35:20.097461 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swzvd\" (UniqueName: 
\"kubernetes.io/projected/90052432-150d-4d93-b058-513adf55e099-kube-api-access-swzvd\") pod \"dnsmasq-dns-68f9c67449-jgsnj\" (UID: \"90052432-150d-4d93-b058-513adf55e099\") " pod="openstack/dnsmasq-dns-68f9c67449-jgsnj" Oct 14 13:35:20.097566 master-2 kubenswrapper[4762]: I1014 13:35:20.097494 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90052432-150d-4d93-b058-513adf55e099-config\") pod \"dnsmasq-dns-68f9c67449-jgsnj\" (UID: \"90052432-150d-4d93-b058-513adf55e099\") " pod="openstack/dnsmasq-dns-68f9c67449-jgsnj" Oct 14 13:35:20.097566 master-2 kubenswrapper[4762]: I1014 13:35:20.097528 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90052432-150d-4d93-b058-513adf55e099-ovsdbserver-sb\") pod \"dnsmasq-dns-68f9c67449-jgsnj\" (UID: \"90052432-150d-4d93-b058-513adf55e099\") " pod="openstack/dnsmasq-dns-68f9c67449-jgsnj" Oct 14 13:35:20.098441 master-2 kubenswrapper[4762]: I1014 13:35:20.098413 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90052432-150d-4d93-b058-513adf55e099-ovsdbserver-sb\") pod \"dnsmasq-dns-68f9c67449-jgsnj\" (UID: \"90052432-150d-4d93-b058-513adf55e099\") " pod="openstack/dnsmasq-dns-68f9c67449-jgsnj" Oct 14 13:35:20.098563 master-2 kubenswrapper[4762]: I1014 13:35:20.098414 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90052432-150d-4d93-b058-513adf55e099-ovsdbserver-nb\") pod \"dnsmasq-dns-68f9c67449-jgsnj\" (UID: \"90052432-150d-4d93-b058-513adf55e099\") " pod="openstack/dnsmasq-dns-68f9c67449-jgsnj" Oct 14 13:35:20.098632 master-2 kubenswrapper[4762]: I1014 13:35:20.098599 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90052432-150d-4d93-b058-513adf55e099-dns-svc\") pod \"dnsmasq-dns-68f9c67449-jgsnj\" (UID: \"90052432-150d-4d93-b058-513adf55e099\") " pod="openstack/dnsmasq-dns-68f9c67449-jgsnj" Oct 14 13:35:20.099086 master-2 kubenswrapper[4762]: I1014 13:35:20.099045 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90052432-150d-4d93-b058-513adf55e099-config\") pod \"dnsmasq-dns-68f9c67449-jgsnj\" (UID: \"90052432-150d-4d93-b058-513adf55e099\") " pod="openstack/dnsmasq-dns-68f9c67449-jgsnj" Oct 14 13:35:20.133264 master-2 kubenswrapper[4762]: I1014 13:35:20.132715 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swzvd\" (UniqueName: \"kubernetes.io/projected/90052432-150d-4d93-b058-513adf55e099-kube-api-access-swzvd\") pod \"dnsmasq-dns-68f9c67449-jgsnj\" (UID: \"90052432-150d-4d93-b058-513adf55e099\") " pod="openstack/dnsmasq-dns-68f9c67449-jgsnj" Oct 14 13:35:20.238304 master-2 kubenswrapper[4762]: I1014 13:35:20.237769 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68f9c67449-jgsnj" Oct 14 13:35:20.405288 master-2 kubenswrapper[4762]: I1014 13:35:20.405111 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-6w2cz-config-xh7xl"] Oct 14 13:35:20.406172 master-2 kubenswrapper[4762]: I1014 13:35:20.406131 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-6w2cz-config-xh7xl" Oct 14 13:35:20.408997 master-2 kubenswrapper[4762]: I1014 13:35:20.408709 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 14 13:35:20.425083 master-2 kubenswrapper[4762]: I1014 13:35:20.419108 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6w2cz-config-xh7xl"] Oct 14 13:35:20.464757 master-2 kubenswrapper[4762]: I1014 13:35:20.464025 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-6w2cz" podUID="4ab0fa0c-8873-41ab-b534-7c4c71350245" containerName="ovn-controller" probeResult="failure" output=< Oct 14 13:35:20.464757 master-2 kubenswrapper[4762]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 14 13:35:20.464757 master-2 kubenswrapper[4762]: > Oct 14 13:35:20.506582 master-2 kubenswrapper[4762]: I1014 13:35:20.505913 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/70395194-81d7-4200-bba7-7936961c5201-var-run\") pod \"ovn-controller-6w2cz-config-xh7xl\" (UID: \"70395194-81d7-4200-bba7-7936961c5201\") " pod="openstack/ovn-controller-6w2cz-config-xh7xl" Oct 14 13:35:20.506582 master-2 kubenswrapper[4762]: I1014 13:35:20.506002 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/70395194-81d7-4200-bba7-7936961c5201-var-log-ovn\") pod \"ovn-controller-6w2cz-config-xh7xl\" (UID: \"70395194-81d7-4200-bba7-7936961c5201\") " pod="openstack/ovn-controller-6w2cz-config-xh7xl" Oct 14 13:35:20.506582 master-2 kubenswrapper[4762]: I1014 13:35:20.506046 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/70395194-81d7-4200-bba7-7936961c5201-additional-scripts\") pod \"ovn-controller-6w2cz-config-xh7xl\" (UID: \"70395194-81d7-4200-bba7-7936961c5201\") " pod="openstack/ovn-controller-6w2cz-config-xh7xl" Oct 14 13:35:20.506582 master-2 kubenswrapper[4762]: I1014 13:35:20.506073 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/70395194-81d7-4200-bba7-7936961c5201-scripts\") pod \"ovn-controller-6w2cz-config-xh7xl\" (UID: \"70395194-81d7-4200-bba7-7936961c5201\") " pod="openstack/ovn-controller-6w2cz-config-xh7xl" Oct 14 13:35:20.506582 master-2 kubenswrapper[4762]: I1014 13:35:20.506104 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78wj4\" (UniqueName: \"kubernetes.io/projected/70395194-81d7-4200-bba7-7936961c5201-kube-api-access-78wj4\") pod \"ovn-controller-6w2cz-config-xh7xl\" (UID: \"70395194-81d7-4200-bba7-7936961c5201\") " pod="openstack/ovn-controller-6w2cz-config-xh7xl" Oct 14 13:35:20.506582 master-2 kubenswrapper[4762]: I1014 13:35:20.506133 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/70395194-81d7-4200-bba7-7936961c5201-var-run-ovn\") pod \"ovn-controller-6w2cz-config-xh7xl\" (UID: \"70395194-81d7-4200-bba7-7936961c5201\") " pod="openstack/ovn-controller-6w2cz-config-xh7xl" Oct 14 13:35:20.610803 master-2 kubenswrapper[4762]: I1014 13:35:20.610751 
4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/70395194-81d7-4200-bba7-7936961c5201-var-log-ovn\") pod \"ovn-controller-6w2cz-config-xh7xl\" (UID: \"70395194-81d7-4200-bba7-7936961c5201\") " pod="openstack/ovn-controller-6w2cz-config-xh7xl" Oct 14 13:35:20.612456 master-2 kubenswrapper[4762]: I1014 13:35:20.612406 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/70395194-81d7-4200-bba7-7936961c5201-additional-scripts\") pod \"ovn-controller-6w2cz-config-xh7xl\" (UID: \"70395194-81d7-4200-bba7-7936961c5201\") " pod="openstack/ovn-controller-6w2cz-config-xh7xl" Oct 14 13:35:20.612571 master-2 kubenswrapper[4762]: I1014 13:35:20.612482 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/70395194-81d7-4200-bba7-7936961c5201-scripts\") pod \"ovn-controller-6w2cz-config-xh7xl\" (UID: \"70395194-81d7-4200-bba7-7936961c5201\") " pod="openstack/ovn-controller-6w2cz-config-xh7xl" Oct 14 13:35:20.612571 master-2 kubenswrapper[4762]: I1014 13:35:20.612523 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78wj4\" (UniqueName: \"kubernetes.io/projected/70395194-81d7-4200-bba7-7936961c5201-kube-api-access-78wj4\") pod \"ovn-controller-6w2cz-config-xh7xl\" (UID: \"70395194-81d7-4200-bba7-7936961c5201\") " pod="openstack/ovn-controller-6w2cz-config-xh7xl" Oct 14 13:35:20.612675 master-2 kubenswrapper[4762]: I1014 13:35:20.612578 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/70395194-81d7-4200-bba7-7936961c5201-var-run-ovn\") pod \"ovn-controller-6w2cz-config-xh7xl\" (UID: \"70395194-81d7-4200-bba7-7936961c5201\") " pod="openstack/ovn-controller-6w2cz-config-xh7xl" Oct 14 13:35:20.612800 master-2 kubenswrapper[4762]: I1014 13:35:20.612777 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/70395194-81d7-4200-bba7-7936961c5201-var-run\") pod \"ovn-controller-6w2cz-config-xh7xl\" (UID: \"70395194-81d7-4200-bba7-7936961c5201\") " pod="openstack/ovn-controller-6w2cz-config-xh7xl" Oct 14 13:35:20.613003 master-2 kubenswrapper[4762]: I1014 13:35:20.612981 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/70395194-81d7-4200-bba7-7936961c5201-var-run\") pod \"ovn-controller-6w2cz-config-xh7xl\" (UID: \"70395194-81d7-4200-bba7-7936961c5201\") " pod="openstack/ovn-controller-6w2cz-config-xh7xl" Oct 14 13:35:20.613096 master-2 kubenswrapper[4762]: I1014 13:35:20.613051 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/70395194-81d7-4200-bba7-7936961c5201-var-log-ovn\") pod \"ovn-controller-6w2cz-config-xh7xl\" (UID: \"70395194-81d7-4200-bba7-7936961c5201\") " pod="openstack/ovn-controller-6w2cz-config-xh7xl" Oct 14 13:35:20.614318 master-2 kubenswrapper[4762]: I1014 13:35:20.613804 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/70395194-81d7-4200-bba7-7936961c5201-additional-scripts\") pod \"ovn-controller-6w2cz-config-xh7xl\" (UID: \"70395194-81d7-4200-bba7-7936961c5201\") " 
pod="openstack/ovn-controller-6w2cz-config-xh7xl" Oct 14 13:35:20.615370 master-2 kubenswrapper[4762]: I1014 13:35:20.615321 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/70395194-81d7-4200-bba7-7936961c5201-var-run-ovn\") pod \"ovn-controller-6w2cz-config-xh7xl\" (UID: \"70395194-81d7-4200-bba7-7936961c5201\") " pod="openstack/ovn-controller-6w2cz-config-xh7xl" Oct 14 13:35:20.615926 master-2 kubenswrapper[4762]: I1014 13:35:20.615889 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/70395194-81d7-4200-bba7-7936961c5201-scripts\") pod \"ovn-controller-6w2cz-config-xh7xl\" (UID: \"70395194-81d7-4200-bba7-7936961c5201\") " pod="openstack/ovn-controller-6w2cz-config-xh7xl" Oct 14 13:35:20.633349 master-2 kubenswrapper[4762]: I1014 13:35:20.633311 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78wj4\" (UniqueName: \"kubernetes.io/projected/70395194-81d7-4200-bba7-7936961c5201-kube-api-access-78wj4\") pod \"ovn-controller-6w2cz-config-xh7xl\" (UID: \"70395194-81d7-4200-bba7-7936961c5201\") " pod="openstack/ovn-controller-6w2cz-config-xh7xl" Oct 14 13:35:20.703328 master-2 kubenswrapper[4762]: I1014 13:35:20.703102 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68f9c67449-jgsnj"] Oct 14 13:35:20.709488 master-2 kubenswrapper[4762]: W1014 13:35:20.709433 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90052432_150d_4d93_b058_513adf55e099.slice/crio-403a1761b43668a95715d9e22a51cc31bd9ed8190825b29922ab14e99825ee55 WatchSource:0}: Error finding container 403a1761b43668a95715d9e22a51cc31bd9ed8190825b29922ab14e99825ee55: Status 404 returned error can't find the container with id 403a1761b43668a95715d9e22a51cc31bd9ed8190825b29922ab14e99825ee55 Oct 14 13:35:20.737184 master-2 kubenswrapper[4762]: I1014 13:35:20.737092 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-6w2cz-config-xh7xl" Oct 14 13:35:21.019626 master-2 kubenswrapper[4762]: I1014 13:35:21.019027 4762 generic.go:334] "Generic (PLEG): container finished" podID="90052432-150d-4d93-b058-513adf55e099" containerID="f934ea67f5f9bf5d51232d0c052ae96553f988006487c8803463027d3e5f3773" exitCode=0 Oct 14 13:35:21.019626 master-2 kubenswrapper[4762]: I1014 13:35:21.019136 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68f9c67449-jgsnj" event={"ID":"90052432-150d-4d93-b058-513adf55e099","Type":"ContainerDied","Data":"f934ea67f5f9bf5d51232d0c052ae96553f988006487c8803463027d3e5f3773"} Oct 14 13:35:21.019626 master-2 kubenswrapper[4762]: I1014 13:35:21.019260 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68f9c67449-jgsnj" event={"ID":"90052432-150d-4d93-b058-513adf55e099","Type":"ContainerStarted","Data":"403a1761b43668a95715d9e22a51cc31bd9ed8190825b29922ab14e99825ee55"} Oct 14 13:35:21.163538 master-2 kubenswrapper[4762]: I1014 13:35:21.163483 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6w2cz-config-xh7xl"] Oct 14 13:35:21.186106 master-2 kubenswrapper[4762]: W1014 13:35:21.184896 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70395194_81d7_4200_bba7_7936961c5201.slice/crio-a9f6e7a92198d1b29be7e7e1b1f1442e3a070b5313beaa704ee9d4911afac3ff WatchSource:0}: Error finding container a9f6e7a92198d1b29be7e7e1b1f1442e3a070b5313beaa704ee9d4911afac3ff: Status 404 returned error can't find the container with id a9f6e7a92198d1b29be7e7e1b1f1442e3a070b5313beaa704ee9d4911afac3ff Oct 14 13:35:22.029507 master-2 kubenswrapper[4762]: I1014 13:35:22.029364 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68f9c67449-jgsnj" event={"ID":"90052432-150d-4d93-b058-513adf55e099","Type":"ContainerStarted","Data":"f5f90c5018bb72c8e911764ba8f30e8b3c88ad35e1979a44493253c47e06b4fd"} Oct 14 13:35:22.029507 master-2 kubenswrapper[4762]: I1014 13:35:22.029470 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-68f9c67449-jgsnj" Oct 14 13:35:22.031327 master-2 kubenswrapper[4762]: I1014 13:35:22.031293 4762 generic.go:334] "Generic (PLEG): container finished" podID="70395194-81d7-4200-bba7-7936961c5201" containerID="28f1251bdd543d9279723f209f9af929cafbfc888787b806329edf838eed9e3b" exitCode=0 Oct 14 13:35:22.031327 master-2 kubenswrapper[4762]: I1014 13:35:22.031326 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6w2cz-config-xh7xl" event={"ID":"70395194-81d7-4200-bba7-7936961c5201","Type":"ContainerDied","Data":"28f1251bdd543d9279723f209f9af929cafbfc888787b806329edf838eed9e3b"} Oct 14 13:35:22.031432 master-2 kubenswrapper[4762]: I1014 13:35:22.031347 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6w2cz-config-xh7xl" event={"ID":"70395194-81d7-4200-bba7-7936961c5201","Type":"ContainerStarted","Data":"a9f6e7a92198d1b29be7e7e1b1f1442e3a070b5313beaa704ee9d4911afac3ff"} Oct 14 13:35:22.053721 master-2 kubenswrapper[4762]: I1014 13:35:22.053633 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-68f9c67449-jgsnj" podStartSLOduration=3.053612586 podStartE2EDuration="3.053612586s" podCreationTimestamp="2025-10-14 13:35:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:35:22.051715346 +0000 UTC m=+1751.295874505" watchObservedRunningTime="2025-10-14 13:35:22.053612586 +0000 UTC m=+1751.297771745" Oct 14 13:35:22.534080 master-2 kubenswrapper[4762]: I1014 13:35:22.534029 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-1" Oct 14 13:35:22.575088 master-2 kubenswrapper[4762]: I1014 13:35:22.574905 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-1" Oct 14 13:35:23.478947 master-2 kubenswrapper[4762]: I1014 13:35:23.478900 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6w2cz-config-xh7xl" Oct 14 13:35:23.566340 master-2 kubenswrapper[4762]: I1014 13:35:23.566301 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78wj4\" (UniqueName: \"kubernetes.io/projected/70395194-81d7-4200-bba7-7936961c5201-kube-api-access-78wj4\") pod \"70395194-81d7-4200-bba7-7936961c5201\" (UID: \"70395194-81d7-4200-bba7-7936961c5201\") " Oct 14 13:35:23.566559 master-2 kubenswrapper[4762]: I1014 13:35:23.566359 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/70395194-81d7-4200-bba7-7936961c5201-var-log-ovn\") pod \"70395194-81d7-4200-bba7-7936961c5201\" (UID: \"70395194-81d7-4200-bba7-7936961c5201\") " Oct 14 13:35:23.566559 master-2 kubenswrapper[4762]: I1014 13:35:23.566426 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/70395194-81d7-4200-bba7-7936961c5201-var-run-ovn\") pod \"70395194-81d7-4200-bba7-7936961c5201\" (UID: \"70395194-81d7-4200-bba7-7936961c5201\") " Oct 14 13:35:23.566559 master-2 kubenswrapper[4762]: I1014 13:35:23.566494 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/70395194-81d7-4200-bba7-7936961c5201-var-run\") pod \"70395194-81d7-4200-bba7-7936961c5201\" (UID: \"70395194-81d7-4200-bba7-7936961c5201\") " Oct 14 13:35:23.566753 master-2 kubenswrapper[4762]: I1014 13:35:23.566614 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/70395194-81d7-4200-bba7-7936961c5201-additional-scripts\") pod \"70395194-81d7-4200-bba7-7936961c5201\" (UID: \"70395194-81d7-4200-bba7-7936961c5201\") " Oct 14 13:35:23.566753 master-2 kubenswrapper[4762]: I1014 13:35:23.566693 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/70395194-81d7-4200-bba7-7936961c5201-scripts\") pod \"70395194-81d7-4200-bba7-7936961c5201\" (UID: \"70395194-81d7-4200-bba7-7936961c5201\") " Oct 14 13:35:23.566991 master-2 kubenswrapper[4762]: I1014 13:35:23.566960 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/70395194-81d7-4200-bba7-7936961c5201-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "70395194-81d7-4200-bba7-7936961c5201" (UID: "70395194-81d7-4200-bba7-7936961c5201"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:35:23.567139 master-2 kubenswrapper[4762]: I1014 13:35:23.567068 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/70395194-81d7-4200-bba7-7936961c5201-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "70395194-81d7-4200-bba7-7936961c5201" (UID: "70395194-81d7-4200-bba7-7936961c5201"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:35:23.567256 master-2 kubenswrapper[4762]: I1014 13:35:23.566989 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/70395194-81d7-4200-bba7-7936961c5201-var-run" (OuterVolumeSpecName: "var-run") pod "70395194-81d7-4200-bba7-7936961c5201" (UID: "70395194-81d7-4200-bba7-7936961c5201"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:35:23.567676 master-2 kubenswrapper[4762]: I1014 13:35:23.567635 4762 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/70395194-81d7-4200-bba7-7936961c5201-var-log-ovn\") on node \"master-2\" DevicePath \"\"" Oct 14 13:35:23.567778 master-2 kubenswrapper[4762]: I1014 13:35:23.567672 4762 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/70395194-81d7-4200-bba7-7936961c5201-var-run-ovn\") on node \"master-2\" DevicePath \"\"" Oct 14 13:35:23.567778 master-2 kubenswrapper[4762]: I1014 13:35:23.567696 4762 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/70395194-81d7-4200-bba7-7936961c5201-var-run\") on node \"master-2\" DevicePath \"\"" Oct 14 13:35:23.567900 master-2 kubenswrapper[4762]: I1014 13:35:23.567865 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70395194-81d7-4200-bba7-7936961c5201-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "70395194-81d7-4200-bba7-7936961c5201" (UID: "70395194-81d7-4200-bba7-7936961c5201"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:35:23.568807 master-2 kubenswrapper[4762]: I1014 13:35:23.568775 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70395194-81d7-4200-bba7-7936961c5201-scripts" (OuterVolumeSpecName: "scripts") pod "70395194-81d7-4200-bba7-7936961c5201" (UID: "70395194-81d7-4200-bba7-7936961c5201"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:35:23.570702 master-2 kubenswrapper[4762]: I1014 13:35:23.570663 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70395194-81d7-4200-bba7-7936961c5201-kube-api-access-78wj4" (OuterVolumeSpecName: "kube-api-access-78wj4") pod "70395194-81d7-4200-bba7-7936961c5201" (UID: "70395194-81d7-4200-bba7-7936961c5201"). InnerVolumeSpecName "kube-api-access-78wj4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:35:23.677111 master-2 kubenswrapper[4762]: I1014 13:35:23.677022 4762 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/70395194-81d7-4200-bba7-7936961c5201-additional-scripts\") on node \"master-2\" DevicePath \"\"" Oct 14 13:35:23.677111 master-2 kubenswrapper[4762]: I1014 13:35:23.677110 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/70395194-81d7-4200-bba7-7936961c5201-scripts\") on node \"master-2\" DevicePath \"\"" Oct 14 13:35:23.677382 master-2 kubenswrapper[4762]: I1014 13:35:23.677130 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78wj4\" (UniqueName: \"kubernetes.io/projected/70395194-81d7-4200-bba7-7936961c5201-kube-api-access-78wj4\") on node \"master-2\" DevicePath \"\"" Oct 14 13:35:24.050719 master-2 kubenswrapper[4762]: I1014 13:35:24.050673 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6w2cz-config-xh7xl" Oct 14 13:35:24.050719 master-2 kubenswrapper[4762]: I1014 13:35:24.050653 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6w2cz-config-xh7xl" event={"ID":"70395194-81d7-4200-bba7-7936961c5201","Type":"ContainerDied","Data":"a9f6e7a92198d1b29be7e7e1b1f1442e3a070b5313beaa704ee9d4911afac3ff"} Oct 14 13:35:24.050977 master-2 kubenswrapper[4762]: I1014 13:35:24.050740 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9f6e7a92198d1b29be7e7e1b1f1442e3a070b5313beaa704ee9d4911afac3ff" Oct 14 13:35:24.615260 master-2 kubenswrapper[4762]: I1014 13:35:24.615198 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-6w2cz-config-xh7xl"] Oct 14 13:35:24.636124 master-2 kubenswrapper[4762]: I1014 13:35:24.636041 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-6w2cz-config-xh7xl"] Oct 14 13:35:24.926428 master-2 kubenswrapper[4762]: I1014 13:35:24.926379 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-1" Oct 14 13:35:25.415519 master-2 kubenswrapper[4762]: I1014 13:35:25.415439 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-1" Oct 14 13:35:25.457319 master-2 kubenswrapper[4762]: I1014 13:35:25.457182 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-6w2cz" Oct 14 13:35:25.558116 master-2 kubenswrapper[4762]: I1014 13:35:25.558034 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70395194-81d7-4200-bba7-7936961c5201" path="/var/lib/kubelet/pods/70395194-81d7-4200-bba7-7936961c5201/volumes" Oct 14 13:35:30.242099 master-2 kubenswrapper[4762]: I1014 13:35:30.241969 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-68f9c67449-jgsnj" Oct 14 13:35:30.411710 master-2 kubenswrapper[4762]: I1014 13:35:30.411614 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d999b9745-tmsff"] Oct 14 13:35:30.412059 master-2 kubenswrapper[4762]: I1014 13:35:30.411967 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-d999b9745-tmsff" podUID="77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e" containerName="dnsmasq-dns" 
containerID="cri-o://63c3e09ac5c44d7c161f4daaa5b9e5e0b78187ff235d9db7eb96a727f56ff32e" gracePeriod=10 Oct 14 13:35:30.997928 master-2 kubenswrapper[4762]: I1014 13:35:30.997877 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d999b9745-tmsff" Oct 14 13:35:31.103199 master-2 kubenswrapper[4762]: I1014 13:35:31.103060 4762 generic.go:334] "Generic (PLEG): container finished" podID="77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e" containerID="63c3e09ac5c44d7c161f4daaa5b9e5e0b78187ff235d9db7eb96a727f56ff32e" exitCode=0 Oct 14 13:35:31.103199 master-2 kubenswrapper[4762]: I1014 13:35:31.103104 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d999b9745-tmsff" event={"ID":"77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e","Type":"ContainerDied","Data":"63c3e09ac5c44d7c161f4daaa5b9e5e0b78187ff235d9db7eb96a727f56ff32e"} Oct 14 13:35:31.103199 master-2 kubenswrapper[4762]: I1014 13:35:31.103135 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d999b9745-tmsff" event={"ID":"77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e","Type":"ContainerDied","Data":"bc72419c0f79c3ab88e4eedc933458a5a9e2de8d8b1e5290a5c6c3d2b228c2f0"} Oct 14 13:35:31.103199 master-2 kubenswrapper[4762]: I1014 13:35:31.103167 4762 scope.go:117] "RemoveContainer" containerID="63c3e09ac5c44d7c161f4daaa5b9e5e0b78187ff235d9db7eb96a727f56ff32e" Oct 14 13:35:31.103469 master-2 kubenswrapper[4762]: I1014 13:35:31.103279 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d999b9745-tmsff" Oct 14 13:35:31.104857 master-2 kubenswrapper[4762]: I1014 13:35:31.104829 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e-ovsdbserver-nb\") pod \"77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e\" (UID: \"77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e\") " Oct 14 13:35:31.104953 master-2 kubenswrapper[4762]: I1014 13:35:31.104931 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e-config\") pod \"77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e\" (UID: \"77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e\") " Oct 14 13:35:31.105017 master-2 kubenswrapper[4762]: I1014 13:35:31.104962 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwtnm\" (UniqueName: \"kubernetes.io/projected/77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e-kube-api-access-zwtnm\") pod \"77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e\" (UID: \"77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e\") " Oct 14 13:35:31.105062 master-2 kubenswrapper[4762]: I1014 13:35:31.105035 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e-dns-svc\") pod \"77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e\" (UID: \"77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e\") " Oct 14 13:35:31.107723 master-2 kubenswrapper[4762]: I1014 13:35:31.107656 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e-kube-api-access-zwtnm" (OuterVolumeSpecName: "kube-api-access-zwtnm") pod "77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e" (UID: "77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e"). InnerVolumeSpecName "kube-api-access-zwtnm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:35:31.137264 master-2 kubenswrapper[4762]: I1014 13:35:31.137205 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e" (UID: "77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:35:31.137889 master-2 kubenswrapper[4762]: I1014 13:35:31.137858 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e" (UID: "77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:35:31.139598 master-2 kubenswrapper[4762]: I1014 13:35:31.139568 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e-config" (OuterVolumeSpecName: "config") pod "77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e" (UID: "77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:35:31.185319 master-2 kubenswrapper[4762]: I1014 13:35:31.185227 4762 scope.go:117] "RemoveContainer" containerID="678a14d98662f7557ef259d3f2c9667a1ce0dfa2b68c4e5d7e64b14545c9e753" Oct 14 13:35:31.202069 master-2 kubenswrapper[4762]: I1014 13:35:31.202033 4762 scope.go:117] "RemoveContainer" containerID="63c3e09ac5c44d7c161f4daaa5b9e5e0b78187ff235d9db7eb96a727f56ff32e" Oct 14 13:35:31.202600 master-2 kubenswrapper[4762]: E1014 13:35:31.202541 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63c3e09ac5c44d7c161f4daaa5b9e5e0b78187ff235d9db7eb96a727f56ff32e\": container with ID starting with 63c3e09ac5c44d7c161f4daaa5b9e5e0b78187ff235d9db7eb96a727f56ff32e not found: ID does not exist" containerID="63c3e09ac5c44d7c161f4daaa5b9e5e0b78187ff235d9db7eb96a727f56ff32e" Oct 14 13:35:31.202735 master-2 kubenswrapper[4762]: I1014 13:35:31.202613 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63c3e09ac5c44d7c161f4daaa5b9e5e0b78187ff235d9db7eb96a727f56ff32e"} err="failed to get container status \"63c3e09ac5c44d7c161f4daaa5b9e5e0b78187ff235d9db7eb96a727f56ff32e\": rpc error: code = NotFound desc = could not find container \"63c3e09ac5c44d7c161f4daaa5b9e5e0b78187ff235d9db7eb96a727f56ff32e\": container with ID starting with 63c3e09ac5c44d7c161f4daaa5b9e5e0b78187ff235d9db7eb96a727f56ff32e not found: ID does not exist" Oct 14 13:35:31.202735 master-2 kubenswrapper[4762]: I1014 13:35:31.202656 4762 scope.go:117] "RemoveContainer" containerID="678a14d98662f7557ef259d3f2c9667a1ce0dfa2b68c4e5d7e64b14545c9e753" Oct 14 13:35:31.203099 master-2 kubenswrapper[4762]: E1014 13:35:31.203063 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"678a14d98662f7557ef259d3f2c9667a1ce0dfa2b68c4e5d7e64b14545c9e753\": container with ID starting with 678a14d98662f7557ef259d3f2c9667a1ce0dfa2b68c4e5d7e64b14545c9e753 not found: ID does not exist" containerID="678a14d98662f7557ef259d3f2c9667a1ce0dfa2b68c4e5d7e64b14545c9e753" Oct 14 13:35:31.203197 master-2 
kubenswrapper[4762]: I1014 13:35:31.203111 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"678a14d98662f7557ef259d3f2c9667a1ce0dfa2b68c4e5d7e64b14545c9e753"} err="failed to get container status \"678a14d98662f7557ef259d3f2c9667a1ce0dfa2b68c4e5d7e64b14545c9e753\": rpc error: code = NotFound desc = could not find container \"678a14d98662f7557ef259d3f2c9667a1ce0dfa2b68c4e5d7e64b14545c9e753\": container with ID starting with 678a14d98662f7557ef259d3f2c9667a1ce0dfa2b68c4e5d7e64b14545c9e753 not found: ID does not exist" Oct 14 13:35:31.207301 master-2 kubenswrapper[4762]: I1014 13:35:31.207271 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e-config\") on node \"master-2\" DevicePath \"\"" Oct 14 13:35:31.207301 master-2 kubenswrapper[4762]: I1014 13:35:31.207297 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwtnm\" (UniqueName: \"kubernetes.io/projected/77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e-kube-api-access-zwtnm\") on node \"master-2\" DevicePath \"\"" Oct 14 13:35:31.207399 master-2 kubenswrapper[4762]: I1014 13:35:31.207308 4762 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e-dns-svc\") on node \"master-2\" DevicePath \"\"" Oct 14 13:35:31.207399 master-2 kubenswrapper[4762]: I1014 13:35:31.207317 4762 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e-ovsdbserver-nb\") on node \"master-2\" DevicePath \"\"" Oct 14 13:35:32.508543 master-2 kubenswrapper[4762]: I1014 13:35:32.508416 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d999b9745-tmsff"] Oct 14 13:35:32.655925 master-2 kubenswrapper[4762]: I1014 13:35:32.655842 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d999b9745-tmsff"] Oct 14 13:35:33.561936 master-2 kubenswrapper[4762]: I1014 13:35:33.561860 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e" path="/var/lib/kubelet/pods/77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e/volumes" Oct 14 13:35:50.814850 master-2 kubenswrapper[4762]: I1014 13:35:50.814799 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-5dnhm"] Oct 14 13:35:50.815903 master-2 kubenswrapper[4762]: E1014 13:35:50.815881 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e" containerName="dnsmasq-dns" Oct 14 13:35:50.816002 master-2 kubenswrapper[4762]: I1014 13:35:50.815989 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e" containerName="dnsmasq-dns" Oct 14 13:35:50.816091 master-2 kubenswrapper[4762]: E1014 13:35:50.816078 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70395194-81d7-4200-bba7-7936961c5201" containerName="ovn-config" Oct 14 13:35:50.816279 master-2 kubenswrapper[4762]: I1014 13:35:50.816265 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="70395194-81d7-4200-bba7-7936961c5201" containerName="ovn-config" Oct 14 13:35:50.816399 master-2 kubenswrapper[4762]: E1014 13:35:50.816386 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e" containerName="init" Oct 14 13:35:50.816476 master-2 
kubenswrapper[4762]: I1014 13:35:50.816463 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e" containerName="init" Oct 14 13:35:50.816742 master-2 kubenswrapper[4762]: I1014 13:35:50.816719 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="70395194-81d7-4200-bba7-7936961c5201" containerName="ovn-config" Oct 14 13:35:50.816881 master-2 kubenswrapper[4762]: I1014 13:35:50.816863 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="77b15a2b-19a2-4f4f-a223-ef3c9a4a0a4e" containerName="dnsmasq-dns" Oct 14 13:35:50.817713 master-2 kubenswrapper[4762]: I1014 13:35:50.817691 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-5dnhm" Oct 14 13:35:50.820681 master-2 kubenswrapper[4762]: I1014 13:35:50.820620 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 14 13:35:50.820878 master-2 kubenswrapper[4762]: I1014 13:35:50.820842 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 14 13:35:50.821390 master-2 kubenswrapper[4762]: I1014 13:35:50.821275 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 14 13:35:50.854591 master-2 kubenswrapper[4762]: I1014 13:35:50.854542 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-5dnhm"] Oct 14 13:35:50.878559 master-2 kubenswrapper[4762]: I1014 13:35:50.878419 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-644df97595-vxrmr"] Oct 14 13:35:50.887259 master-2 kubenswrapper[4762]: I1014 13:35:50.886034 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-644df97595-vxrmr" Oct 14 13:35:50.891626 master-2 kubenswrapper[4762]: I1014 13:35:50.891563 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Oct 14 13:35:50.901329 master-2 kubenswrapper[4762]: I1014 13:35:50.901280 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-644df97595-vxrmr"] Oct 14 13:35:50.941976 master-2 kubenswrapper[4762]: I1014 13:35:50.941915 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/03ce229d-4981-44dd-9a2a-ec048ec56a0f-fernet-keys\") pod \"keystone-bootstrap-5dnhm\" (UID: \"03ce229d-4981-44dd-9a2a-ec048ec56a0f\") " pod="openstack/keystone-bootstrap-5dnhm" Oct 14 13:35:50.942231 master-2 kubenswrapper[4762]: I1014 13:35:50.942026 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03ce229d-4981-44dd-9a2a-ec048ec56a0f-config-data\") pod \"keystone-bootstrap-5dnhm\" (UID: \"03ce229d-4981-44dd-9a2a-ec048ec56a0f\") " pod="openstack/keystone-bootstrap-5dnhm" Oct 14 13:35:50.942231 master-2 kubenswrapper[4762]: I1014 13:35:50.942085 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/03ce229d-4981-44dd-9a2a-ec048ec56a0f-credential-keys\") pod \"keystone-bootstrap-5dnhm\" (UID: \"03ce229d-4981-44dd-9a2a-ec048ec56a0f\") " pod="openstack/keystone-bootstrap-5dnhm" Oct 14 13:35:50.942231 master-2 kubenswrapper[4762]: I1014 13:35:50.942133 4762 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd72v\" (UniqueName: \"kubernetes.io/projected/03ce229d-4981-44dd-9a2a-ec048ec56a0f-kube-api-access-wd72v\") pod \"keystone-bootstrap-5dnhm\" (UID: \"03ce229d-4981-44dd-9a2a-ec048ec56a0f\") " pod="openstack/keystone-bootstrap-5dnhm" Oct 14 13:35:50.942231 master-2 kubenswrapper[4762]: I1014 13:35:50.942221 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03ce229d-4981-44dd-9a2a-ec048ec56a0f-scripts\") pod \"keystone-bootstrap-5dnhm\" (UID: \"03ce229d-4981-44dd-9a2a-ec048ec56a0f\") " pod="openstack/keystone-bootstrap-5dnhm" Oct 14 13:35:50.942533 master-2 kubenswrapper[4762]: I1014 13:35:50.942271 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03ce229d-4981-44dd-9a2a-ec048ec56a0f-combined-ca-bundle\") pod \"keystone-bootstrap-5dnhm\" (UID: \"03ce229d-4981-44dd-9a2a-ec048ec56a0f\") " pod="openstack/keystone-bootstrap-5dnhm" Oct 14 13:35:51.044267 master-2 kubenswrapper[4762]: I1014 13:35:51.044199 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45f0bdd3-1ee4-4037-aa06-3f8601b583ec-ovsdbserver-nb\") pod \"dnsmasq-dns-644df97595-vxrmr\" (UID: \"45f0bdd3-1ee4-4037-aa06-3f8601b583ec\") " pod="openstack/dnsmasq-dns-644df97595-vxrmr" Oct 14 13:35:51.044267 master-2 kubenswrapper[4762]: I1014 13:35:51.044263 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45f0bdd3-1ee4-4037-aa06-3f8601b583ec-dns-svc\") pod \"dnsmasq-dns-644df97595-vxrmr\" (UID: \"45f0bdd3-1ee4-4037-aa06-3f8601b583ec\") " pod="openstack/dnsmasq-dns-644df97595-vxrmr" Oct 14 13:35:51.044542 master-2 kubenswrapper[4762]: I1014 13:35:51.044297 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03ce229d-4981-44dd-9a2a-ec048ec56a0f-combined-ca-bundle\") pod \"keystone-bootstrap-5dnhm\" (UID: \"03ce229d-4981-44dd-9a2a-ec048ec56a0f\") " pod="openstack/keystone-bootstrap-5dnhm" Oct 14 13:35:51.044542 master-2 kubenswrapper[4762]: I1014 13:35:51.044340 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45f0bdd3-1ee4-4037-aa06-3f8601b583ec-ovsdbserver-sb\") pod \"dnsmasq-dns-644df97595-vxrmr\" (UID: \"45f0bdd3-1ee4-4037-aa06-3f8601b583ec\") " pod="openstack/dnsmasq-dns-644df97595-vxrmr" Oct 14 13:35:51.044542 master-2 kubenswrapper[4762]: I1014 13:35:51.044378 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/03ce229d-4981-44dd-9a2a-ec048ec56a0f-fernet-keys\") pod \"keystone-bootstrap-5dnhm\" (UID: \"03ce229d-4981-44dd-9a2a-ec048ec56a0f\") " pod="openstack/keystone-bootstrap-5dnhm" Oct 14 13:35:51.044542 master-2 kubenswrapper[4762]: I1014 13:35:51.044411 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03ce229d-4981-44dd-9a2a-ec048ec56a0f-config-data\") pod \"keystone-bootstrap-5dnhm\" (UID: \"03ce229d-4981-44dd-9a2a-ec048ec56a0f\") " pod="openstack/keystone-bootstrap-5dnhm" Oct 14 
13:35:51.044542 master-2 kubenswrapper[4762]: I1014 13:35:51.044435 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/03ce229d-4981-44dd-9a2a-ec048ec56a0f-credential-keys\") pod \"keystone-bootstrap-5dnhm\" (UID: \"03ce229d-4981-44dd-9a2a-ec048ec56a0f\") " pod="openstack/keystone-bootstrap-5dnhm" Oct 14 13:35:51.044542 master-2 kubenswrapper[4762]: I1014 13:35:51.044460 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/45f0bdd3-1ee4-4037-aa06-3f8601b583ec-dns-swift-storage-0\") pod \"dnsmasq-dns-644df97595-vxrmr\" (UID: \"45f0bdd3-1ee4-4037-aa06-3f8601b583ec\") " pod="openstack/dnsmasq-dns-644df97595-vxrmr" Oct 14 13:35:51.044542 master-2 kubenswrapper[4762]: I1014 13:35:51.044486 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wd72v\" (UniqueName: \"kubernetes.io/projected/03ce229d-4981-44dd-9a2a-ec048ec56a0f-kube-api-access-wd72v\") pod \"keystone-bootstrap-5dnhm\" (UID: \"03ce229d-4981-44dd-9a2a-ec048ec56a0f\") " pod="openstack/keystone-bootstrap-5dnhm" Oct 14 13:35:51.044542 master-2 kubenswrapper[4762]: I1014 13:35:51.044516 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42z6r\" (UniqueName: \"kubernetes.io/projected/45f0bdd3-1ee4-4037-aa06-3f8601b583ec-kube-api-access-42z6r\") pod \"dnsmasq-dns-644df97595-vxrmr\" (UID: \"45f0bdd3-1ee4-4037-aa06-3f8601b583ec\") " pod="openstack/dnsmasq-dns-644df97595-vxrmr" Oct 14 13:35:51.044542 master-2 kubenswrapper[4762]: I1014 13:35:51.044541 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45f0bdd3-1ee4-4037-aa06-3f8601b583ec-config\") pod \"dnsmasq-dns-644df97595-vxrmr\" (UID: \"45f0bdd3-1ee4-4037-aa06-3f8601b583ec\") " pod="openstack/dnsmasq-dns-644df97595-vxrmr" Oct 14 13:35:51.045007 master-2 kubenswrapper[4762]: I1014 13:35:51.044589 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03ce229d-4981-44dd-9a2a-ec048ec56a0f-scripts\") pod \"keystone-bootstrap-5dnhm\" (UID: \"03ce229d-4981-44dd-9a2a-ec048ec56a0f\") " pod="openstack/keystone-bootstrap-5dnhm" Oct 14 13:35:51.051195 master-2 kubenswrapper[4762]: I1014 13:35:51.049975 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/03ce229d-4981-44dd-9a2a-ec048ec56a0f-fernet-keys\") pod \"keystone-bootstrap-5dnhm\" (UID: \"03ce229d-4981-44dd-9a2a-ec048ec56a0f\") " pod="openstack/keystone-bootstrap-5dnhm" Oct 14 13:35:51.051195 master-2 kubenswrapper[4762]: I1014 13:35:51.051000 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03ce229d-4981-44dd-9a2a-ec048ec56a0f-config-data\") pod \"keystone-bootstrap-5dnhm\" (UID: \"03ce229d-4981-44dd-9a2a-ec048ec56a0f\") " pod="openstack/keystone-bootstrap-5dnhm" Oct 14 13:35:51.052751 master-2 kubenswrapper[4762]: I1014 13:35:51.052691 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03ce229d-4981-44dd-9a2a-ec048ec56a0f-scripts\") pod \"keystone-bootstrap-5dnhm\" (UID: \"03ce229d-4981-44dd-9a2a-ec048ec56a0f\") " 
pod="openstack/keystone-bootstrap-5dnhm" Oct 14 13:35:51.055375 master-2 kubenswrapper[4762]: I1014 13:35:51.055340 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/03ce229d-4981-44dd-9a2a-ec048ec56a0f-credential-keys\") pod \"keystone-bootstrap-5dnhm\" (UID: \"03ce229d-4981-44dd-9a2a-ec048ec56a0f\") " pod="openstack/keystone-bootstrap-5dnhm" Oct 14 13:35:51.055541 master-2 kubenswrapper[4762]: I1014 13:35:51.055404 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03ce229d-4981-44dd-9a2a-ec048ec56a0f-combined-ca-bundle\") pod \"keystone-bootstrap-5dnhm\" (UID: \"03ce229d-4981-44dd-9a2a-ec048ec56a0f\") " pod="openstack/keystone-bootstrap-5dnhm" Oct 14 13:35:51.077766 master-2 kubenswrapper[4762]: I1014 13:35:51.077613 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wd72v\" (UniqueName: \"kubernetes.io/projected/03ce229d-4981-44dd-9a2a-ec048ec56a0f-kube-api-access-wd72v\") pod \"keystone-bootstrap-5dnhm\" (UID: \"03ce229d-4981-44dd-9a2a-ec048ec56a0f\") " pod="openstack/keystone-bootstrap-5dnhm" Oct 14 13:35:51.135935 master-2 kubenswrapper[4762]: I1014 13:35:51.135779 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-5dnhm" Oct 14 13:35:51.145769 master-2 kubenswrapper[4762]: I1014 13:35:51.145725 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45f0bdd3-1ee4-4037-aa06-3f8601b583ec-ovsdbserver-nb\") pod \"dnsmasq-dns-644df97595-vxrmr\" (UID: \"45f0bdd3-1ee4-4037-aa06-3f8601b583ec\") " pod="openstack/dnsmasq-dns-644df97595-vxrmr" Oct 14 13:35:51.145769 master-2 kubenswrapper[4762]: I1014 13:35:51.145765 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45f0bdd3-1ee4-4037-aa06-3f8601b583ec-dns-svc\") pod \"dnsmasq-dns-644df97595-vxrmr\" (UID: \"45f0bdd3-1ee4-4037-aa06-3f8601b583ec\") " pod="openstack/dnsmasq-dns-644df97595-vxrmr" Oct 14 13:35:51.145983 master-2 kubenswrapper[4762]: I1014 13:35:51.145799 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45f0bdd3-1ee4-4037-aa06-3f8601b583ec-ovsdbserver-sb\") pod \"dnsmasq-dns-644df97595-vxrmr\" (UID: \"45f0bdd3-1ee4-4037-aa06-3f8601b583ec\") " pod="openstack/dnsmasq-dns-644df97595-vxrmr" Oct 14 13:35:51.145983 master-2 kubenswrapper[4762]: I1014 13:35:51.145840 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/45f0bdd3-1ee4-4037-aa06-3f8601b583ec-dns-swift-storage-0\") pod \"dnsmasq-dns-644df97595-vxrmr\" (UID: \"45f0bdd3-1ee4-4037-aa06-3f8601b583ec\") " pod="openstack/dnsmasq-dns-644df97595-vxrmr" Oct 14 13:35:51.145983 master-2 kubenswrapper[4762]: I1014 13:35:51.145869 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42z6r\" (UniqueName: \"kubernetes.io/projected/45f0bdd3-1ee4-4037-aa06-3f8601b583ec-kube-api-access-42z6r\") pod \"dnsmasq-dns-644df97595-vxrmr\" (UID: \"45f0bdd3-1ee4-4037-aa06-3f8601b583ec\") " pod="openstack/dnsmasq-dns-644df97595-vxrmr" Oct 14 13:35:51.145983 master-2 kubenswrapper[4762]: I1014 13:35:51.145888 4762 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45f0bdd3-1ee4-4037-aa06-3f8601b583ec-config\") pod \"dnsmasq-dns-644df97595-vxrmr\" (UID: \"45f0bdd3-1ee4-4037-aa06-3f8601b583ec\") " pod="openstack/dnsmasq-dns-644df97595-vxrmr" Oct 14 13:35:51.146729 master-2 kubenswrapper[4762]: I1014 13:35:51.146704 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45f0bdd3-1ee4-4037-aa06-3f8601b583ec-config\") pod \"dnsmasq-dns-644df97595-vxrmr\" (UID: \"45f0bdd3-1ee4-4037-aa06-3f8601b583ec\") " pod="openstack/dnsmasq-dns-644df97595-vxrmr" Oct 14 13:35:51.147521 master-2 kubenswrapper[4762]: I1014 13:35:51.147463 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/45f0bdd3-1ee4-4037-aa06-3f8601b583ec-dns-swift-storage-0\") pod \"dnsmasq-dns-644df97595-vxrmr\" (UID: \"45f0bdd3-1ee4-4037-aa06-3f8601b583ec\") " pod="openstack/dnsmasq-dns-644df97595-vxrmr" Oct 14 13:35:51.147722 master-2 kubenswrapper[4762]: I1014 13:35:51.147691 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45f0bdd3-1ee4-4037-aa06-3f8601b583ec-ovsdbserver-nb\") pod \"dnsmasq-dns-644df97595-vxrmr\" (UID: \"45f0bdd3-1ee4-4037-aa06-3f8601b583ec\") " pod="openstack/dnsmasq-dns-644df97595-vxrmr" Oct 14 13:35:51.147863 master-2 kubenswrapper[4762]: I1014 13:35:51.147839 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45f0bdd3-1ee4-4037-aa06-3f8601b583ec-dns-svc\") pod \"dnsmasq-dns-644df97595-vxrmr\" (UID: \"45f0bdd3-1ee4-4037-aa06-3f8601b583ec\") " pod="openstack/dnsmasq-dns-644df97595-vxrmr" Oct 14 13:35:51.148053 master-2 kubenswrapper[4762]: I1014 13:35:51.148038 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45f0bdd3-1ee4-4037-aa06-3f8601b583ec-ovsdbserver-sb\") pod \"dnsmasq-dns-644df97595-vxrmr\" (UID: \"45f0bdd3-1ee4-4037-aa06-3f8601b583ec\") " pod="openstack/dnsmasq-dns-644df97595-vxrmr" Oct 14 13:35:51.170222 master-2 kubenswrapper[4762]: I1014 13:35:51.170174 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:35:51.172325 master-2 kubenswrapper[4762]: I1014 13:35:51.172278 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:35:51.225989 master-2 kubenswrapper[4762]: I1014 13:35:51.225237 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 14 13:35:51.231595 master-2 kubenswrapper[4762]: I1014 13:35:51.231558 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 14 13:35:51.242973 master-2 kubenswrapper[4762]: I1014 13:35:51.242925 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42z6r\" (UniqueName: \"kubernetes.io/projected/45f0bdd3-1ee4-4037-aa06-3f8601b583ec-kube-api-access-42z6r\") pod \"dnsmasq-dns-644df97595-vxrmr\" (UID: \"45f0bdd3-1ee4-4037-aa06-3f8601b583ec\") " pod="openstack/dnsmasq-dns-644df97595-vxrmr" Oct 14 13:35:51.247273 master-2 kubenswrapper[4762]: I1014 13:35:51.247171 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/520f5fd5-5c4b-4781-bc3f-363b18244e36-scripts\") pod \"ceilometer-0\" (UID: \"520f5fd5-5c4b-4781-bc3f-363b18244e36\") " pod="openstack/ceilometer-0" Oct 14 13:35:51.247273 master-2 kubenswrapper[4762]: I1014 13:35:51.247254 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/520f5fd5-5c4b-4781-bc3f-363b18244e36-log-httpd\") pod \"ceilometer-0\" (UID: \"520f5fd5-5c4b-4781-bc3f-363b18244e36\") " pod="openstack/ceilometer-0" Oct 14 13:35:51.247421 master-2 kubenswrapper[4762]: I1014 13:35:51.247302 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/520f5fd5-5c4b-4781-bc3f-363b18244e36-config-data\") pod \"ceilometer-0\" (UID: \"520f5fd5-5c4b-4781-bc3f-363b18244e36\") " pod="openstack/ceilometer-0" Oct 14 13:35:51.247421 master-2 kubenswrapper[4762]: I1014 13:35:51.247336 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/520f5fd5-5c4b-4781-bc3f-363b18244e36-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"520f5fd5-5c4b-4781-bc3f-363b18244e36\") " pod="openstack/ceilometer-0" Oct 14 13:35:51.247421 master-2 kubenswrapper[4762]: I1014 13:35:51.247391 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/520f5fd5-5c4b-4781-bc3f-363b18244e36-run-httpd\") pod \"ceilometer-0\" (UID: \"520f5fd5-5c4b-4781-bc3f-363b18244e36\") " pod="openstack/ceilometer-0" Oct 14 13:35:51.247421 master-2 kubenswrapper[4762]: I1014 13:35:51.247414 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/520f5fd5-5c4b-4781-bc3f-363b18244e36-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"520f5fd5-5c4b-4781-bc3f-363b18244e36\") " pod="openstack/ceilometer-0" Oct 14 13:35:51.247644 master-2 kubenswrapper[4762]: I1014 13:35:51.247452 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxwmv\" (UniqueName: \"kubernetes.io/projected/520f5fd5-5c4b-4781-bc3f-363b18244e36-kube-api-access-nxwmv\") pod \"ceilometer-0\" (UID: \"520f5fd5-5c4b-4781-bc3f-363b18244e36\") " pod="openstack/ceilometer-0" Oct 14 13:35:51.262589 master-2 
kubenswrapper[4762]: I1014 13:35:51.262537 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:35:51.350256 master-2 kubenswrapper[4762]: I1014 13:35:51.349436 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/520f5fd5-5c4b-4781-bc3f-363b18244e36-scripts\") pod \"ceilometer-0\" (UID: \"520f5fd5-5c4b-4781-bc3f-363b18244e36\") " pod="openstack/ceilometer-0" Oct 14 13:35:51.350256 master-2 kubenswrapper[4762]: I1014 13:35:51.349516 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/520f5fd5-5c4b-4781-bc3f-363b18244e36-log-httpd\") pod \"ceilometer-0\" (UID: \"520f5fd5-5c4b-4781-bc3f-363b18244e36\") " pod="openstack/ceilometer-0" Oct 14 13:35:51.350256 master-2 kubenswrapper[4762]: I1014 13:35:51.349578 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/520f5fd5-5c4b-4781-bc3f-363b18244e36-config-data\") pod \"ceilometer-0\" (UID: \"520f5fd5-5c4b-4781-bc3f-363b18244e36\") " pod="openstack/ceilometer-0" Oct 14 13:35:51.350256 master-2 kubenswrapper[4762]: I1014 13:35:51.349601 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/520f5fd5-5c4b-4781-bc3f-363b18244e36-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"520f5fd5-5c4b-4781-bc3f-363b18244e36\") " pod="openstack/ceilometer-0" Oct 14 13:35:51.350256 master-2 kubenswrapper[4762]: I1014 13:35:51.349645 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/520f5fd5-5c4b-4781-bc3f-363b18244e36-run-httpd\") pod \"ceilometer-0\" (UID: \"520f5fd5-5c4b-4781-bc3f-363b18244e36\") " pod="openstack/ceilometer-0" Oct 14 13:35:51.350256 master-2 kubenswrapper[4762]: I1014 13:35:51.349666 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/520f5fd5-5c4b-4781-bc3f-363b18244e36-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"520f5fd5-5c4b-4781-bc3f-363b18244e36\") " pod="openstack/ceilometer-0" Oct 14 13:35:51.350256 master-2 kubenswrapper[4762]: I1014 13:35:51.349696 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxwmv\" (UniqueName: \"kubernetes.io/projected/520f5fd5-5c4b-4781-bc3f-363b18244e36-kube-api-access-nxwmv\") pod \"ceilometer-0\" (UID: \"520f5fd5-5c4b-4781-bc3f-363b18244e36\") " pod="openstack/ceilometer-0" Oct 14 13:35:51.350256 master-2 kubenswrapper[4762]: I1014 13:35:51.349971 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/520f5fd5-5c4b-4781-bc3f-363b18244e36-log-httpd\") pod \"ceilometer-0\" (UID: \"520f5fd5-5c4b-4781-bc3f-363b18244e36\") " pod="openstack/ceilometer-0" Oct 14 13:35:51.350807 master-2 kubenswrapper[4762]: I1014 13:35:51.350413 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/520f5fd5-5c4b-4781-bc3f-363b18244e36-run-httpd\") pod \"ceilometer-0\" (UID: \"520f5fd5-5c4b-4781-bc3f-363b18244e36\") " pod="openstack/ceilometer-0" Oct 14 13:35:51.360230 master-2 kubenswrapper[4762]: I1014 13:35:51.353565 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/520f5fd5-5c4b-4781-bc3f-363b18244e36-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"520f5fd5-5c4b-4781-bc3f-363b18244e36\") " pod="openstack/ceilometer-0" Oct 14 13:35:51.360230 master-2 kubenswrapper[4762]: I1014 13:35:51.353923 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/520f5fd5-5c4b-4781-bc3f-363b18244e36-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"520f5fd5-5c4b-4781-bc3f-363b18244e36\") " pod="openstack/ceilometer-0" Oct 14 13:35:51.360230 master-2 kubenswrapper[4762]: I1014 13:35:51.357537 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/520f5fd5-5c4b-4781-bc3f-363b18244e36-scripts\") pod \"ceilometer-0\" (UID: \"520f5fd5-5c4b-4781-bc3f-363b18244e36\") " pod="openstack/ceilometer-0" Oct 14 13:35:51.360230 master-2 kubenswrapper[4762]: I1014 13:35:51.357691 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/520f5fd5-5c4b-4781-bc3f-363b18244e36-config-data\") pod \"ceilometer-0\" (UID: \"520f5fd5-5c4b-4781-bc3f-363b18244e36\") " pod="openstack/ceilometer-0" Oct 14 13:35:51.381424 master-2 kubenswrapper[4762]: I1014 13:35:51.377872 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxwmv\" (UniqueName: \"kubernetes.io/projected/520f5fd5-5c4b-4781-bc3f-363b18244e36-kube-api-access-nxwmv\") pod \"ceilometer-0\" (UID: \"520f5fd5-5c4b-4781-bc3f-363b18244e36\") " pod="openstack/ceilometer-0" Oct 14 13:35:51.513112 master-2 kubenswrapper[4762]: I1014 13:35:51.513029 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-644df97595-vxrmr" Oct 14 13:35:51.589875 master-2 kubenswrapper[4762]: I1014 13:35:51.589817 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:35:51.748180 master-2 kubenswrapper[4762]: I1014 13:35:51.746792 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-644df97595-vxrmr"] Oct 14 13:35:51.758822 master-2 kubenswrapper[4762]: I1014 13:35:51.758744 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-5dnhm"] Oct 14 13:35:51.765835 master-2 kubenswrapper[4762]: W1014 13:35:51.765237 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03ce229d_4981_44dd_9a2a_ec048ec56a0f.slice/crio-36710fe5f7409f8258dba5f3c44d15ebaf9d156a7b01141527bad0973a8a87c2 WatchSource:0}: Error finding container 36710fe5f7409f8258dba5f3c44d15ebaf9d156a7b01141527bad0973a8a87c2: Status 404 returned error can't find the container with id 36710fe5f7409f8258dba5f3c44d15ebaf9d156a7b01141527bad0973a8a87c2 Oct 14 13:35:51.969491 master-2 kubenswrapper[4762]: I1014 13:35:51.969425 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-644df97595-vxrmr"] Oct 14 13:35:52.066665 master-2 kubenswrapper[4762]: I1014 13:35:52.066548 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:35:52.180557 master-2 kubenswrapper[4762]: W1014 13:35:52.180485 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod520f5fd5_5c4b_4781_bc3f_363b18244e36.slice/crio-34a9eea30f534c6dca6a675c4ae28f647461a219c6d471c0895a97d54fad7d53 WatchSource:0}: Error finding container 34a9eea30f534c6dca6a675c4ae28f647461a219c6d471c0895a97d54fad7d53: Status 404 returned error can't find the container with id 34a9eea30f534c6dca6a675c4ae28f647461a219c6d471c0895a97d54fad7d53 Oct 14 13:35:52.313923 master-2 kubenswrapper[4762]: I1014 13:35:52.313836 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5dnhm" event={"ID":"03ce229d-4981-44dd-9a2a-ec048ec56a0f","Type":"ContainerStarted","Data":"36710fe5f7409f8258dba5f3c44d15ebaf9d156a7b01141527bad0973a8a87c2"} Oct 14 13:35:52.315284 master-2 kubenswrapper[4762]: I1014 13:35:52.315254 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"520f5fd5-5c4b-4781-bc3f-363b18244e36","Type":"ContainerStarted","Data":"34a9eea30f534c6dca6a675c4ae28f647461a219c6d471c0895a97d54fad7d53"} Oct 14 13:35:52.317054 master-2 kubenswrapper[4762]: I1014 13:35:52.317025 4762 generic.go:334] "Generic (PLEG): container finished" podID="45f0bdd3-1ee4-4037-aa06-3f8601b583ec" containerID="df14fba0ee051287490fcf66fca54373248bff5c6b65199b253e736d01d529a2" exitCode=0 Oct 14 13:35:52.317054 master-2 kubenswrapper[4762]: I1014 13:35:52.317055 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-644df97595-vxrmr" event={"ID":"45f0bdd3-1ee4-4037-aa06-3f8601b583ec","Type":"ContainerDied","Data":"df14fba0ee051287490fcf66fca54373248bff5c6b65199b253e736d01d529a2"} Oct 14 13:35:52.317366 master-2 kubenswrapper[4762]: I1014 13:35:52.317071 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-644df97595-vxrmr" event={"ID":"45f0bdd3-1ee4-4037-aa06-3f8601b583ec","Type":"ContainerStarted","Data":"e66ba4560bdea2d390c50cd9bfb44aca42bbee95b1ebe1a4803738ae90859fde"} Oct 14 13:35:52.683124 master-2 kubenswrapper[4762]: I1014 13:35:52.683065 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-644df97595-vxrmr" Oct 14 13:35:52.775761 master-2 kubenswrapper[4762]: I1014 13:35:52.775711 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45f0bdd3-1ee4-4037-aa06-3f8601b583ec-config\") pod \"45f0bdd3-1ee4-4037-aa06-3f8601b583ec\" (UID: \"45f0bdd3-1ee4-4037-aa06-3f8601b583ec\") " Oct 14 13:35:52.776104 master-2 kubenswrapper[4762]: I1014 13:35:52.775782 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45f0bdd3-1ee4-4037-aa06-3f8601b583ec-dns-svc\") pod \"45f0bdd3-1ee4-4037-aa06-3f8601b583ec\" (UID: \"45f0bdd3-1ee4-4037-aa06-3f8601b583ec\") " Oct 14 13:35:52.776104 master-2 kubenswrapper[4762]: I1014 13:35:52.775804 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45f0bdd3-1ee4-4037-aa06-3f8601b583ec-ovsdbserver-nb\") pod \"45f0bdd3-1ee4-4037-aa06-3f8601b583ec\" (UID: \"45f0bdd3-1ee4-4037-aa06-3f8601b583ec\") " Oct 14 13:35:52.776104 master-2 kubenswrapper[4762]: I1014 13:35:52.775844 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/45f0bdd3-1ee4-4037-aa06-3f8601b583ec-dns-swift-storage-0\") pod \"45f0bdd3-1ee4-4037-aa06-3f8601b583ec\" (UID: \"45f0bdd3-1ee4-4037-aa06-3f8601b583ec\") " Oct 14 13:35:52.776104 master-2 kubenswrapper[4762]: I1014 13:35:52.775870 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45f0bdd3-1ee4-4037-aa06-3f8601b583ec-ovsdbserver-sb\") pod \"45f0bdd3-1ee4-4037-aa06-3f8601b583ec\" (UID: \"45f0bdd3-1ee4-4037-aa06-3f8601b583ec\") " Oct 14 13:35:52.776104 master-2 kubenswrapper[4762]: I1014 13:35:52.775986 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42z6r\" (UniqueName: \"kubernetes.io/projected/45f0bdd3-1ee4-4037-aa06-3f8601b583ec-kube-api-access-42z6r\") pod \"45f0bdd3-1ee4-4037-aa06-3f8601b583ec\" (UID: \"45f0bdd3-1ee4-4037-aa06-3f8601b583ec\") " Oct 14 13:35:52.779495 master-2 kubenswrapper[4762]: I1014 13:35:52.779410 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45f0bdd3-1ee4-4037-aa06-3f8601b583ec-kube-api-access-42z6r" (OuterVolumeSpecName: "kube-api-access-42z6r") pod "45f0bdd3-1ee4-4037-aa06-3f8601b583ec" (UID: "45f0bdd3-1ee4-4037-aa06-3f8601b583ec"). InnerVolumeSpecName "kube-api-access-42z6r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:35:52.795215 master-2 kubenswrapper[4762]: I1014 13:35:52.795058 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45f0bdd3-1ee4-4037-aa06-3f8601b583ec-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "45f0bdd3-1ee4-4037-aa06-3f8601b583ec" (UID: "45f0bdd3-1ee4-4037-aa06-3f8601b583ec"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:35:52.796856 master-2 kubenswrapper[4762]: I1014 13:35:52.796689 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45f0bdd3-1ee4-4037-aa06-3f8601b583ec-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "45f0bdd3-1ee4-4037-aa06-3f8601b583ec" (UID: "45f0bdd3-1ee4-4037-aa06-3f8601b583ec"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:35:52.796974 master-2 kubenswrapper[4762]: I1014 13:35:52.796911 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45f0bdd3-1ee4-4037-aa06-3f8601b583ec-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "45f0bdd3-1ee4-4037-aa06-3f8601b583ec" (UID: "45f0bdd3-1ee4-4037-aa06-3f8601b583ec"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:35:52.797188 master-2 kubenswrapper[4762]: I1014 13:35:52.797120 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45f0bdd3-1ee4-4037-aa06-3f8601b583ec-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "45f0bdd3-1ee4-4037-aa06-3f8601b583ec" (UID: "45f0bdd3-1ee4-4037-aa06-3f8601b583ec"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:35:52.798883 master-2 kubenswrapper[4762]: I1014 13:35:52.798854 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45f0bdd3-1ee4-4037-aa06-3f8601b583ec-config" (OuterVolumeSpecName: "config") pod "45f0bdd3-1ee4-4037-aa06-3f8601b583ec" (UID: "45f0bdd3-1ee4-4037-aa06-3f8601b583ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:35:52.878738 master-2 kubenswrapper[4762]: I1014 13:35:52.878592 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42z6r\" (UniqueName: \"kubernetes.io/projected/45f0bdd3-1ee4-4037-aa06-3f8601b583ec-kube-api-access-42z6r\") on node \"master-2\" DevicePath \"\"" Oct 14 13:35:52.878738 master-2 kubenswrapper[4762]: I1014 13:35:52.878661 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45f0bdd3-1ee4-4037-aa06-3f8601b583ec-config\") on node \"master-2\" DevicePath \"\"" Oct 14 13:35:52.878738 master-2 kubenswrapper[4762]: I1014 13:35:52.878682 4762 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45f0bdd3-1ee4-4037-aa06-3f8601b583ec-dns-svc\") on node \"master-2\" DevicePath \"\"" Oct 14 13:35:52.878738 master-2 kubenswrapper[4762]: I1014 13:35:52.878701 4762 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45f0bdd3-1ee4-4037-aa06-3f8601b583ec-ovsdbserver-nb\") on node \"master-2\" DevicePath \"\"" Oct 14 13:35:52.878738 master-2 kubenswrapper[4762]: I1014 13:35:52.878720 4762 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/45f0bdd3-1ee4-4037-aa06-3f8601b583ec-dns-swift-storage-0\") on node \"master-2\" DevicePath \"\"" Oct 14 13:35:52.878738 master-2 kubenswrapper[4762]: I1014 13:35:52.878738 4762 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45f0bdd3-1ee4-4037-aa06-3f8601b583ec-ovsdbserver-sb\") on node \"master-2\" DevicePath \"\"" Oct 14 13:35:53.326135 master-2 kubenswrapper[4762]: I1014 13:35:53.325966 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-644df97595-vxrmr" event={"ID":"45f0bdd3-1ee4-4037-aa06-3f8601b583ec","Type":"ContainerDied","Data":"e66ba4560bdea2d390c50cd9bfb44aca42bbee95b1ebe1a4803738ae90859fde"} Oct 14 13:35:53.326135 master-2 kubenswrapper[4762]: I1014 13:35:53.326049 4762 scope.go:117] "RemoveContainer" containerID="df14fba0ee051287490fcf66fca54373248bff5c6b65199b253e736d01d529a2" Oct 14 13:35:53.326135 master-2 kubenswrapper[4762]: I1014 13:35:53.326006 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-644df97595-vxrmr" Oct 14 13:35:53.822130 master-2 kubenswrapper[4762]: I1014 13:35:53.822052 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-644df97595-vxrmr"] Oct 14 13:35:53.906404 master-2 kubenswrapper[4762]: I1014 13:35:53.903637 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-644df97595-vxrmr"] Oct 14 13:35:55.314301 master-2 kubenswrapper[4762]: I1014 13:35:55.314212 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:35:55.578292 master-2 kubenswrapper[4762]: I1014 13:35:55.576953 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45f0bdd3-1ee4-4037-aa06-3f8601b583ec" path="/var/lib/kubelet/pods/45f0bdd3-1ee4-4037-aa06-3f8601b583ec/volumes" Oct 14 13:35:59.381235 master-2 kubenswrapper[4762]: I1014 13:35:59.381169 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5dnhm" event={"ID":"03ce229d-4981-44dd-9a2a-ec048ec56a0f","Type":"ContainerStarted","Data":"0dc728929be840cf856fc9818c135fbd54ffa7a02abc7eb031c207658c929aca"} Oct 14 13:35:59.384140 master-2 kubenswrapper[4762]: I1014 13:35:59.384083 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"520f5fd5-5c4b-4781-bc3f-363b18244e36","Type":"ContainerStarted","Data":"638bce9e30d01056c9bf0af37a0bd392f3f2675ef3ca7ea5aed5ae76cf0225e4"} Oct 14 13:35:59.425730 master-2 kubenswrapper[4762]: I1014 13:35:59.425614 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-5dnhm" podStartSLOduration=2.549964799 podStartE2EDuration="9.425585894s" podCreationTimestamp="2025-10-14 13:35:50 +0000 UTC" firstStartedPulling="2025-10-14 13:35:51.767413051 +0000 UTC m=+1781.011572210" lastFinishedPulling="2025-10-14 13:35:58.643034146 +0000 UTC m=+1787.887193305" observedRunningTime="2025-10-14 13:35:59.419999909 +0000 UTC m=+1788.664159078" watchObservedRunningTime="2025-10-14 13:35:59.425585894 +0000 UTC m=+1788.669745053" Oct 14 13:36:00.397957 master-2 kubenswrapper[4762]: I1014 13:36:00.397884 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"520f5fd5-5c4b-4781-bc3f-363b18244e36","Type":"ContainerStarted","Data":"15c265e51aed24697243505daa7e2d8b6b252c7cf9d56345c510ab502e3cda27"} Oct 14 13:36:03.421225 master-2 kubenswrapper[4762]: I1014 13:36:03.421141 4762 generic.go:334] "Generic (PLEG): container finished" podID="03ce229d-4981-44dd-9a2a-ec048ec56a0f" containerID="0dc728929be840cf856fc9818c135fbd54ffa7a02abc7eb031c207658c929aca" exitCode=0 Oct 14 13:36:03.421860 master-2 kubenswrapper[4762]: I1014 13:36:03.421221 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5dnhm" event={"ID":"03ce229d-4981-44dd-9a2a-ec048ec56a0f","Type":"ContainerDied","Data":"0dc728929be840cf856fc9818c135fbd54ffa7a02abc7eb031c207658c929aca"} Oct 14 13:36:04.882836 master-2 kubenswrapper[4762]: I1014 13:36:04.882797 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-5dnhm" Oct 14 13:36:04.941491 master-2 kubenswrapper[4762]: I1014 13:36:04.941364 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03ce229d-4981-44dd-9a2a-ec048ec56a0f-combined-ca-bundle\") pod \"03ce229d-4981-44dd-9a2a-ec048ec56a0f\" (UID: \"03ce229d-4981-44dd-9a2a-ec048ec56a0f\") " Oct 14 13:36:04.941491 master-2 kubenswrapper[4762]: I1014 13:36:04.941439 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wd72v\" (UniqueName: \"kubernetes.io/projected/03ce229d-4981-44dd-9a2a-ec048ec56a0f-kube-api-access-wd72v\") pod \"03ce229d-4981-44dd-9a2a-ec048ec56a0f\" (UID: \"03ce229d-4981-44dd-9a2a-ec048ec56a0f\") " Oct 14 13:36:04.941491 master-2 kubenswrapper[4762]: I1014 13:36:04.941486 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03ce229d-4981-44dd-9a2a-ec048ec56a0f-config-data\") pod \"03ce229d-4981-44dd-9a2a-ec048ec56a0f\" (UID: \"03ce229d-4981-44dd-9a2a-ec048ec56a0f\") " Oct 14 13:36:04.942353 master-2 kubenswrapper[4762]: I1014 13:36:04.941548 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/03ce229d-4981-44dd-9a2a-ec048ec56a0f-fernet-keys\") pod \"03ce229d-4981-44dd-9a2a-ec048ec56a0f\" (UID: \"03ce229d-4981-44dd-9a2a-ec048ec56a0f\") " Oct 14 13:36:04.942353 master-2 kubenswrapper[4762]: I1014 13:36:04.941583 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/03ce229d-4981-44dd-9a2a-ec048ec56a0f-credential-keys\") pod \"03ce229d-4981-44dd-9a2a-ec048ec56a0f\" (UID: \"03ce229d-4981-44dd-9a2a-ec048ec56a0f\") " Oct 14 13:36:04.942353 master-2 kubenswrapper[4762]: I1014 13:36:04.941612 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03ce229d-4981-44dd-9a2a-ec048ec56a0f-scripts\") pod \"03ce229d-4981-44dd-9a2a-ec048ec56a0f\" (UID: \"03ce229d-4981-44dd-9a2a-ec048ec56a0f\") " Oct 14 13:36:04.947535 master-2 kubenswrapper[4762]: I1014 13:36:04.947436 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03ce229d-4981-44dd-9a2a-ec048ec56a0f-scripts" (OuterVolumeSpecName: "scripts") pod "03ce229d-4981-44dd-9a2a-ec048ec56a0f" (UID: "03ce229d-4981-44dd-9a2a-ec048ec56a0f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:36:04.950233 master-2 kubenswrapper[4762]: I1014 13:36:04.950169 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03ce229d-4981-44dd-9a2a-ec048ec56a0f-kube-api-access-wd72v" (OuterVolumeSpecName: "kube-api-access-wd72v") pod "03ce229d-4981-44dd-9a2a-ec048ec56a0f" (UID: "03ce229d-4981-44dd-9a2a-ec048ec56a0f"). InnerVolumeSpecName "kube-api-access-wd72v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:36:04.952433 master-2 kubenswrapper[4762]: I1014 13:36:04.952373 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03ce229d-4981-44dd-9a2a-ec048ec56a0f-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "03ce229d-4981-44dd-9a2a-ec048ec56a0f" (UID: "03ce229d-4981-44dd-9a2a-ec048ec56a0f"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:36:04.953982 master-2 kubenswrapper[4762]: I1014 13:36:04.953885 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03ce229d-4981-44dd-9a2a-ec048ec56a0f-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "03ce229d-4981-44dd-9a2a-ec048ec56a0f" (UID: "03ce229d-4981-44dd-9a2a-ec048ec56a0f"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:36:04.973078 master-2 kubenswrapper[4762]: I1014 13:36:04.973000 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03ce229d-4981-44dd-9a2a-ec048ec56a0f-config-data" (OuterVolumeSpecName: "config-data") pod "03ce229d-4981-44dd-9a2a-ec048ec56a0f" (UID: "03ce229d-4981-44dd-9a2a-ec048ec56a0f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:36:04.977691 master-2 kubenswrapper[4762]: I1014 13:36:04.977592 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03ce229d-4981-44dd-9a2a-ec048ec56a0f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "03ce229d-4981-44dd-9a2a-ec048ec56a0f" (UID: "03ce229d-4981-44dd-9a2a-ec048ec56a0f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:36:05.043514 master-2 kubenswrapper[4762]: I1014 13:36:05.043372 4762 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/03ce229d-4981-44dd-9a2a-ec048ec56a0f-fernet-keys\") on node \"master-2\" DevicePath \"\"" Oct 14 13:36:05.043514 master-2 kubenswrapper[4762]: I1014 13:36:05.043414 4762 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/03ce229d-4981-44dd-9a2a-ec048ec56a0f-credential-keys\") on node \"master-2\" DevicePath \"\"" Oct 14 13:36:05.043514 master-2 kubenswrapper[4762]: I1014 13:36:05.043425 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03ce229d-4981-44dd-9a2a-ec048ec56a0f-scripts\") on node \"master-2\" DevicePath \"\"" Oct 14 13:36:05.043514 master-2 kubenswrapper[4762]: I1014 13:36:05.043435 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03ce229d-4981-44dd-9a2a-ec048ec56a0f-combined-ca-bundle\") on node \"master-2\" DevicePath \"\"" Oct 14 13:36:05.043514 master-2 kubenswrapper[4762]: I1014 13:36:05.043448 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wd72v\" (UniqueName: \"kubernetes.io/projected/03ce229d-4981-44dd-9a2a-ec048ec56a0f-kube-api-access-wd72v\") on node \"master-2\" DevicePath \"\"" Oct 14 13:36:05.043514 master-2 kubenswrapper[4762]: I1014 13:36:05.043459 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03ce229d-4981-44dd-9a2a-ec048ec56a0f-config-data\") on node \"master-2\" DevicePath \"\"" Oct 14 13:36:05.437828 master-2 kubenswrapper[4762]: I1014 13:36:05.437746 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-5dnhm" Oct 14 13:36:05.438201 master-2 kubenswrapper[4762]: I1014 13:36:05.437734 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5dnhm" event={"ID":"03ce229d-4981-44dd-9a2a-ec048ec56a0f","Type":"ContainerDied","Data":"36710fe5f7409f8258dba5f3c44d15ebaf9d156a7b01141527bad0973a8a87c2"} Oct 14 13:36:05.438201 master-2 kubenswrapper[4762]: I1014 13:36:05.437884 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36710fe5f7409f8258dba5f3c44d15ebaf9d156a7b01141527bad0973a8a87c2" Oct 14 13:36:05.442085 master-2 kubenswrapper[4762]: I1014 13:36:05.442030 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"520f5fd5-5c4b-4781-bc3f-363b18244e36","Type":"ContainerStarted","Data":"bbfd93e2ee92e007c9a4e57ddc4c0b70020d7466986d1c3c5670d6b2d4304fb8"} Oct 14 13:36:05.732642 master-2 kubenswrapper[4762]: I1014 13:36:05.732523 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-5dnhm"] Oct 14 13:36:05.739144 master-2 kubenswrapper[4762]: I1014 13:36:05.739090 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-5dnhm"] Oct 14 13:36:05.770960 master-2 kubenswrapper[4762]: I1014 13:36:05.770898 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-bkjjv"] Oct 14 13:36:05.771267 master-2 kubenswrapper[4762]: E1014 13:36:05.771243 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03ce229d-4981-44dd-9a2a-ec048ec56a0f" containerName="keystone-bootstrap" Oct 14 13:36:05.771267 master-2 kubenswrapper[4762]: I1014 13:36:05.771264 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="03ce229d-4981-44dd-9a2a-ec048ec56a0f" containerName="keystone-bootstrap" Oct 14 13:36:05.771358 master-2 kubenswrapper[4762]: E1014 13:36:05.771287 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45f0bdd3-1ee4-4037-aa06-3f8601b583ec" containerName="init" Oct 14 13:36:05.771358 master-2 kubenswrapper[4762]: I1014 13:36:05.771296 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="45f0bdd3-1ee4-4037-aa06-3f8601b583ec" containerName="init" Oct 14 13:36:05.771468 master-2 kubenswrapper[4762]: I1014 13:36:05.771447 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="45f0bdd3-1ee4-4037-aa06-3f8601b583ec" containerName="init" Oct 14 13:36:05.771505 master-2 kubenswrapper[4762]: I1014 13:36:05.771479 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="03ce229d-4981-44dd-9a2a-ec048ec56a0f" containerName="keystone-bootstrap" Oct 14 13:36:05.772205 master-2 kubenswrapper[4762]: I1014 13:36:05.772184 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-bkjjv" Oct 14 13:36:05.774254 master-2 kubenswrapper[4762]: I1014 13:36:05.774228 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 14 13:36:05.774362 master-2 kubenswrapper[4762]: I1014 13:36:05.774283 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 14 13:36:05.774465 master-2 kubenswrapper[4762]: I1014 13:36:05.774445 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 14 13:36:05.807721 master-2 kubenswrapper[4762]: I1014 13:36:05.807658 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-bkjjv"] Oct 14 13:36:05.857387 master-2 kubenswrapper[4762]: I1014 13:36:05.857277 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cf99a3bf-4a2a-4206-9538-24b47ebd5605-credential-keys\") pod \"keystone-bootstrap-bkjjv\" (UID: \"cf99a3bf-4a2a-4206-9538-24b47ebd5605\") " pod="openstack/keystone-bootstrap-bkjjv" Oct 14 13:36:05.857387 master-2 kubenswrapper[4762]: I1014 13:36:05.857371 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckncp\" (UniqueName: \"kubernetes.io/projected/cf99a3bf-4a2a-4206-9538-24b47ebd5605-kube-api-access-ckncp\") pod \"keystone-bootstrap-bkjjv\" (UID: \"cf99a3bf-4a2a-4206-9538-24b47ebd5605\") " pod="openstack/keystone-bootstrap-bkjjv" Oct 14 13:36:05.857695 master-2 kubenswrapper[4762]: I1014 13:36:05.857649 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cf99a3bf-4a2a-4206-9538-24b47ebd5605-fernet-keys\") pod \"keystone-bootstrap-bkjjv\" (UID: \"cf99a3bf-4a2a-4206-9538-24b47ebd5605\") " pod="openstack/keystone-bootstrap-bkjjv" Oct 14 13:36:05.857834 master-2 kubenswrapper[4762]: I1014 13:36:05.857794 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf99a3bf-4a2a-4206-9538-24b47ebd5605-combined-ca-bundle\") pod \"keystone-bootstrap-bkjjv\" (UID: \"cf99a3bf-4a2a-4206-9538-24b47ebd5605\") " pod="openstack/keystone-bootstrap-bkjjv" Oct 14 13:36:05.857934 master-2 kubenswrapper[4762]: I1014 13:36:05.857907 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf99a3bf-4a2a-4206-9538-24b47ebd5605-config-data\") pod \"keystone-bootstrap-bkjjv\" (UID: \"cf99a3bf-4a2a-4206-9538-24b47ebd5605\") " pod="openstack/keystone-bootstrap-bkjjv" Oct 14 13:36:05.857985 master-2 kubenswrapper[4762]: I1014 13:36:05.857976 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf99a3bf-4a2a-4206-9538-24b47ebd5605-scripts\") pod \"keystone-bootstrap-bkjjv\" (UID: \"cf99a3bf-4a2a-4206-9538-24b47ebd5605\") " pod="openstack/keystone-bootstrap-bkjjv" Oct 14 13:36:05.959220 master-2 kubenswrapper[4762]: I1014 13:36:05.959127 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cf99a3bf-4a2a-4206-9538-24b47ebd5605-credential-keys\") pod \"keystone-bootstrap-bkjjv\" (UID: 
\"cf99a3bf-4a2a-4206-9538-24b47ebd5605\") " pod="openstack/keystone-bootstrap-bkjjv" Oct 14 13:36:05.959220 master-2 kubenswrapper[4762]: I1014 13:36:05.959194 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckncp\" (UniqueName: \"kubernetes.io/projected/cf99a3bf-4a2a-4206-9538-24b47ebd5605-kube-api-access-ckncp\") pod \"keystone-bootstrap-bkjjv\" (UID: \"cf99a3bf-4a2a-4206-9538-24b47ebd5605\") " pod="openstack/keystone-bootstrap-bkjjv" Oct 14 13:36:05.959220 master-2 kubenswrapper[4762]: I1014 13:36:05.959235 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cf99a3bf-4a2a-4206-9538-24b47ebd5605-fernet-keys\") pod \"keystone-bootstrap-bkjjv\" (UID: \"cf99a3bf-4a2a-4206-9538-24b47ebd5605\") " pod="openstack/keystone-bootstrap-bkjjv" Oct 14 13:36:05.959979 master-2 kubenswrapper[4762]: I1014 13:36:05.959266 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf99a3bf-4a2a-4206-9538-24b47ebd5605-combined-ca-bundle\") pod \"keystone-bootstrap-bkjjv\" (UID: \"cf99a3bf-4a2a-4206-9538-24b47ebd5605\") " pod="openstack/keystone-bootstrap-bkjjv" Oct 14 13:36:05.959979 master-2 kubenswrapper[4762]: I1014 13:36:05.959299 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf99a3bf-4a2a-4206-9538-24b47ebd5605-config-data\") pod \"keystone-bootstrap-bkjjv\" (UID: \"cf99a3bf-4a2a-4206-9538-24b47ebd5605\") " pod="openstack/keystone-bootstrap-bkjjv" Oct 14 13:36:05.959979 master-2 kubenswrapper[4762]: I1014 13:36:05.959324 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf99a3bf-4a2a-4206-9538-24b47ebd5605-scripts\") pod \"keystone-bootstrap-bkjjv\" (UID: \"cf99a3bf-4a2a-4206-9538-24b47ebd5605\") " pod="openstack/keystone-bootstrap-bkjjv" Oct 14 13:36:05.962817 master-2 kubenswrapper[4762]: I1014 13:36:05.962764 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf99a3bf-4a2a-4206-9538-24b47ebd5605-scripts\") pod \"keystone-bootstrap-bkjjv\" (UID: \"cf99a3bf-4a2a-4206-9538-24b47ebd5605\") " pod="openstack/keystone-bootstrap-bkjjv" Oct 14 13:36:05.963705 master-2 kubenswrapper[4762]: I1014 13:36:05.963646 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf99a3bf-4a2a-4206-9538-24b47ebd5605-combined-ca-bundle\") pod \"keystone-bootstrap-bkjjv\" (UID: \"cf99a3bf-4a2a-4206-9538-24b47ebd5605\") " pod="openstack/keystone-bootstrap-bkjjv" Oct 14 13:36:05.963805 master-2 kubenswrapper[4762]: I1014 13:36:05.963769 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf99a3bf-4a2a-4206-9538-24b47ebd5605-config-data\") pod \"keystone-bootstrap-bkjjv\" (UID: \"cf99a3bf-4a2a-4206-9538-24b47ebd5605\") " pod="openstack/keystone-bootstrap-bkjjv" Oct 14 13:36:05.965062 master-2 kubenswrapper[4762]: I1014 13:36:05.965025 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cf99a3bf-4a2a-4206-9538-24b47ebd5605-credential-keys\") pod \"keystone-bootstrap-bkjjv\" (UID: \"cf99a3bf-4a2a-4206-9538-24b47ebd5605\") " pod="openstack/keystone-bootstrap-bkjjv" 
Oct 14 13:36:05.965323 master-2 kubenswrapper[4762]: I1014 13:36:05.965279 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cf99a3bf-4a2a-4206-9538-24b47ebd5605-fernet-keys\") pod \"keystone-bootstrap-bkjjv\" (UID: \"cf99a3bf-4a2a-4206-9538-24b47ebd5605\") " pod="openstack/keystone-bootstrap-bkjjv" Oct 14 13:36:05.986648 master-2 kubenswrapper[4762]: I1014 13:36:05.986480 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckncp\" (UniqueName: \"kubernetes.io/projected/cf99a3bf-4a2a-4206-9538-24b47ebd5605-kube-api-access-ckncp\") pod \"keystone-bootstrap-bkjjv\" (UID: \"cf99a3bf-4a2a-4206-9538-24b47ebd5605\") " pod="openstack/keystone-bootstrap-bkjjv" Oct 14 13:36:06.086198 master-2 kubenswrapper[4762]: I1014 13:36:06.086078 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-bkjjv" Oct 14 13:36:06.444733 master-2 kubenswrapper[4762]: I1014 13:36:06.444616 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-db-sync-g5vgq"] Oct 14 13:36:06.446838 master-2 kubenswrapper[4762]: I1014 13:36:06.446767 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-sync-g5vgq" Oct 14 13:36:06.454585 master-2 kubenswrapper[4762]: I1014 13:36:06.454526 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-scripts" Oct 14 13:36:06.456181 master-2 kubenswrapper[4762]: I1014 13:36:06.456100 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-config-data" Oct 14 13:36:06.467675 master-2 kubenswrapper[4762]: I1014 13:36:06.467612 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/799545f6-d5b9-428c-96a7-96aa931ed940-scripts\") pod \"ironic-db-sync-g5vgq\" (UID: \"799545f6-d5b9-428c-96a7-96aa931ed940\") " pod="openstack/ironic-db-sync-g5vgq" Oct 14 13:36:06.467883 master-2 kubenswrapper[4762]: I1014 13:36:06.467708 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/799545f6-d5b9-428c-96a7-96aa931ed940-etc-podinfo\") pod \"ironic-db-sync-g5vgq\" (UID: \"799545f6-d5b9-428c-96a7-96aa931ed940\") " pod="openstack/ironic-db-sync-g5vgq" Oct 14 13:36:06.467883 master-2 kubenswrapper[4762]: I1014 13:36:06.467749 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/799545f6-d5b9-428c-96a7-96aa931ed940-config-data-merged\") pod \"ironic-db-sync-g5vgq\" (UID: \"799545f6-d5b9-428c-96a7-96aa931ed940\") " pod="openstack/ironic-db-sync-g5vgq" Oct 14 13:36:06.467883 master-2 kubenswrapper[4762]: I1014 13:36:06.467795 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq92w\" (UniqueName: \"kubernetes.io/projected/799545f6-d5b9-428c-96a7-96aa931ed940-kube-api-access-tq92w\") pod \"ironic-db-sync-g5vgq\" (UID: \"799545f6-d5b9-428c-96a7-96aa931ed940\") " pod="openstack/ironic-db-sync-g5vgq" Oct 14 13:36:06.467883 master-2 kubenswrapper[4762]: I1014 13:36:06.467848 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/799545f6-d5b9-428c-96a7-96aa931ed940-combined-ca-bundle\") pod \"ironic-db-sync-g5vgq\" (UID: \"799545f6-d5b9-428c-96a7-96aa931ed940\") " pod="openstack/ironic-db-sync-g5vgq" Oct 14 13:36:06.468061 master-2 kubenswrapper[4762]: I1014 13:36:06.467906 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/799545f6-d5b9-428c-96a7-96aa931ed940-config-data\") pod \"ironic-db-sync-g5vgq\" (UID: \"799545f6-d5b9-428c-96a7-96aa931ed940\") " pod="openstack/ironic-db-sync-g5vgq" Oct 14 13:36:06.477975 master-2 kubenswrapper[4762]: I1014 13:36:06.477910 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-db-sync-g5vgq"] Oct 14 13:36:06.575048 master-2 kubenswrapper[4762]: I1014 13:36:06.574973 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/799545f6-d5b9-428c-96a7-96aa931ed940-combined-ca-bundle\") pod \"ironic-db-sync-g5vgq\" (UID: \"799545f6-d5b9-428c-96a7-96aa931ed940\") " pod="openstack/ironic-db-sync-g5vgq" Oct 14 13:36:06.575143 master-2 kubenswrapper[4762]: I1014 13:36:06.575116 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/799545f6-d5b9-428c-96a7-96aa931ed940-config-data\") pod \"ironic-db-sync-g5vgq\" (UID: \"799545f6-d5b9-428c-96a7-96aa931ed940\") " pod="openstack/ironic-db-sync-g5vgq" Oct 14 13:36:06.575401 master-2 kubenswrapper[4762]: I1014 13:36:06.575367 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/799545f6-d5b9-428c-96a7-96aa931ed940-scripts\") pod \"ironic-db-sync-g5vgq\" (UID: \"799545f6-d5b9-428c-96a7-96aa931ed940\") " pod="openstack/ironic-db-sync-g5vgq" Oct 14 13:36:06.575486 master-2 kubenswrapper[4762]: I1014 13:36:06.575448 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/799545f6-d5b9-428c-96a7-96aa931ed940-etc-podinfo\") pod \"ironic-db-sync-g5vgq\" (UID: \"799545f6-d5b9-428c-96a7-96aa931ed940\") " pod="openstack/ironic-db-sync-g5vgq" Oct 14 13:36:06.575561 master-2 kubenswrapper[4762]: I1014 13:36:06.575541 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/799545f6-d5b9-428c-96a7-96aa931ed940-config-data-merged\") pod \"ironic-db-sync-g5vgq\" (UID: \"799545f6-d5b9-428c-96a7-96aa931ed940\") " pod="openstack/ironic-db-sync-g5vgq" Oct 14 13:36:06.575618 master-2 kubenswrapper[4762]: I1014 13:36:06.575603 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tq92w\" (UniqueName: \"kubernetes.io/projected/799545f6-d5b9-428c-96a7-96aa931ed940-kube-api-access-tq92w\") pod \"ironic-db-sync-g5vgq\" (UID: \"799545f6-d5b9-428c-96a7-96aa931ed940\") " pod="openstack/ironic-db-sync-g5vgq" Oct 14 13:36:06.578014 master-2 kubenswrapper[4762]: I1014 13:36:06.577983 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/799545f6-d5b9-428c-96a7-96aa931ed940-config-data-merged\") pod \"ironic-db-sync-g5vgq\" (UID: \"799545f6-d5b9-428c-96a7-96aa931ed940\") " pod="openstack/ironic-db-sync-g5vgq" Oct 14 13:36:06.579769 master-2 kubenswrapper[4762]: I1014 13:36:06.579703 4762 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/799545f6-d5b9-428c-96a7-96aa931ed940-etc-podinfo\") pod \"ironic-db-sync-g5vgq\" (UID: \"799545f6-d5b9-428c-96a7-96aa931ed940\") " pod="openstack/ironic-db-sync-g5vgq" Oct 14 13:36:06.580384 master-2 kubenswrapper[4762]: I1014 13:36:06.580348 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/799545f6-d5b9-428c-96a7-96aa931ed940-combined-ca-bundle\") pod \"ironic-db-sync-g5vgq\" (UID: \"799545f6-d5b9-428c-96a7-96aa931ed940\") " pod="openstack/ironic-db-sync-g5vgq" Oct 14 13:36:06.583715 master-2 kubenswrapper[4762]: I1014 13:36:06.583646 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/799545f6-d5b9-428c-96a7-96aa931ed940-scripts\") pod \"ironic-db-sync-g5vgq\" (UID: \"799545f6-d5b9-428c-96a7-96aa931ed940\") " pod="openstack/ironic-db-sync-g5vgq" Oct 14 13:36:06.590428 master-2 kubenswrapper[4762]: I1014 13:36:06.590375 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-bkjjv"] Oct 14 13:36:06.591264 master-2 kubenswrapper[4762]: I1014 13:36:06.591178 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/799545f6-d5b9-428c-96a7-96aa931ed940-config-data\") pod \"ironic-db-sync-g5vgq\" (UID: \"799545f6-d5b9-428c-96a7-96aa931ed940\") " pod="openstack/ironic-db-sync-g5vgq" Oct 14 13:36:06.600414 master-2 kubenswrapper[4762]: I1014 13:36:06.600274 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tq92w\" (UniqueName: \"kubernetes.io/projected/799545f6-d5b9-428c-96a7-96aa931ed940-kube-api-access-tq92w\") pod \"ironic-db-sync-g5vgq\" (UID: \"799545f6-d5b9-428c-96a7-96aa931ed940\") " pod="openstack/ironic-db-sync-g5vgq" Oct 14 13:36:06.771553 master-2 kubenswrapper[4762]: I1014 13:36:06.771470 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-db-sync-g5vgq" Oct 14 13:36:07.242327 master-2 kubenswrapper[4762]: I1014 13:36:07.242242 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-db-sync-g5vgq"] Oct 14 13:36:07.247204 master-2 kubenswrapper[4762]: W1014 13:36:07.246445 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod799545f6_d5b9_428c_96a7_96aa931ed940.slice/crio-d836c124649e13f260150a58704ce46399ae53356ed37800be691f98dcd75942 WatchSource:0}: Error finding container d836c124649e13f260150a58704ce46399ae53356ed37800be691f98dcd75942: Status 404 returned error can't find the container with id d836c124649e13f260150a58704ce46399ae53356ed37800be691f98dcd75942 Oct 14 13:36:07.460923 master-2 kubenswrapper[4762]: I1014 13:36:07.460865 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-g5vgq" event={"ID":"799545f6-d5b9-428c-96a7-96aa931ed940","Type":"ContainerStarted","Data":"d836c124649e13f260150a58704ce46399ae53356ed37800be691f98dcd75942"} Oct 14 13:36:07.463503 master-2 kubenswrapper[4762]: I1014 13:36:07.463464 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bkjjv" event={"ID":"cf99a3bf-4a2a-4206-9538-24b47ebd5605","Type":"ContainerStarted","Data":"5b4ec1eb2146271a8dfe20d432fe3931ffeaf6fb4f69a9a9bf901388c828b53e"} Oct 14 13:36:07.463553 master-2 kubenswrapper[4762]: I1014 13:36:07.463508 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bkjjv" event={"ID":"cf99a3bf-4a2a-4206-9538-24b47ebd5605","Type":"ContainerStarted","Data":"4184caaabbc38f11bb066332ddf4df00fa4f13eb28de3734d428aae3ca2ede64"} Oct 14 13:36:07.499340 master-2 kubenswrapper[4762]: I1014 13:36:07.499196 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-bkjjv" podStartSLOduration=2.499180166 podStartE2EDuration="2.499180166s" podCreationTimestamp="2025-10-14 13:36:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:36:07.494855081 +0000 UTC m=+1796.739014250" watchObservedRunningTime="2025-10-14 13:36:07.499180166 +0000 UTC m=+1796.743339325" Oct 14 13:36:07.568770 master-2 kubenswrapper[4762]: I1014 13:36:07.568690 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03ce229d-4981-44dd-9a2a-ec048ec56a0f" path="/var/lib/kubelet/pods/03ce229d-4981-44dd-9a2a-ec048ec56a0f/volumes" Oct 14 13:36:10.499498 master-2 kubenswrapper[4762]: I1014 13:36:10.499433 4762 generic.go:334] "Generic (PLEG): container finished" podID="cf99a3bf-4a2a-4206-9538-24b47ebd5605" containerID="5b4ec1eb2146271a8dfe20d432fe3931ffeaf6fb4f69a9a9bf901388c828b53e" exitCode=0 Oct 14 13:36:10.500013 master-2 kubenswrapper[4762]: I1014 13:36:10.499506 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bkjjv" event={"ID":"cf99a3bf-4a2a-4206-9538-24b47ebd5605","Type":"ContainerDied","Data":"5b4ec1eb2146271a8dfe20d432fe3931ffeaf6fb4f69a9a9bf901388c828b53e"} Oct 14 13:36:12.659474 master-2 kubenswrapper[4762]: I1014 13:36:12.659389 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-787cbbf4dc-r2lgm"] Oct 14 13:36:12.663238 master-2 kubenswrapper[4762]: I1014 13:36:12.662129 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-787cbbf4dc-r2lgm" Oct 14 13:36:12.664950 master-2 kubenswrapper[4762]: I1014 13:36:12.664907 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Oct 14 13:36:12.687568 master-2 kubenswrapper[4762]: I1014 13:36:12.687511 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-787cbbf4dc-r2lgm"] Oct 14 13:36:12.789586 master-2 kubenswrapper[4762]: I1014 13:36:12.789529 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8z98\" (UniqueName: \"kubernetes.io/projected/3b8400ee-6062-47fd-b45c-bc82fbdc92cd-kube-api-access-k8z98\") pod \"dnsmasq-dns-787cbbf4dc-r2lgm\" (UID: \"3b8400ee-6062-47fd-b45c-bc82fbdc92cd\") " pod="openstack/dnsmasq-dns-787cbbf4dc-r2lgm" Oct 14 13:36:12.789586 master-2 kubenswrapper[4762]: I1014 13:36:12.789601 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b8400ee-6062-47fd-b45c-bc82fbdc92cd-ovsdbserver-nb\") pod \"dnsmasq-dns-787cbbf4dc-r2lgm\" (UID: \"3b8400ee-6062-47fd-b45c-bc82fbdc92cd\") " pod="openstack/dnsmasq-dns-787cbbf4dc-r2lgm" Oct 14 13:36:12.789869 master-2 kubenswrapper[4762]: I1014 13:36:12.789660 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b8400ee-6062-47fd-b45c-bc82fbdc92cd-config\") pod \"dnsmasq-dns-787cbbf4dc-r2lgm\" (UID: \"3b8400ee-6062-47fd-b45c-bc82fbdc92cd\") " pod="openstack/dnsmasq-dns-787cbbf4dc-r2lgm" Oct 14 13:36:12.789869 master-2 kubenswrapper[4762]: I1014 13:36:12.789685 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b8400ee-6062-47fd-b45c-bc82fbdc92cd-ovsdbserver-sb\") pod \"dnsmasq-dns-787cbbf4dc-r2lgm\" (UID: \"3b8400ee-6062-47fd-b45c-bc82fbdc92cd\") " pod="openstack/dnsmasq-dns-787cbbf4dc-r2lgm" Oct 14 13:36:12.789869 master-2 kubenswrapper[4762]: I1014 13:36:12.789714 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3b8400ee-6062-47fd-b45c-bc82fbdc92cd-dns-swift-storage-0\") pod \"dnsmasq-dns-787cbbf4dc-r2lgm\" (UID: \"3b8400ee-6062-47fd-b45c-bc82fbdc92cd\") " pod="openstack/dnsmasq-dns-787cbbf4dc-r2lgm" Oct 14 13:36:12.789869 master-2 kubenswrapper[4762]: I1014 13:36:12.789733 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b8400ee-6062-47fd-b45c-bc82fbdc92cd-dns-svc\") pod \"dnsmasq-dns-787cbbf4dc-r2lgm\" (UID: \"3b8400ee-6062-47fd-b45c-bc82fbdc92cd\") " pod="openstack/dnsmasq-dns-787cbbf4dc-r2lgm" Oct 14 13:36:12.892412 master-2 kubenswrapper[4762]: I1014 13:36:12.892302 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8z98\" (UniqueName: \"kubernetes.io/projected/3b8400ee-6062-47fd-b45c-bc82fbdc92cd-kube-api-access-k8z98\") pod \"dnsmasq-dns-787cbbf4dc-r2lgm\" (UID: \"3b8400ee-6062-47fd-b45c-bc82fbdc92cd\") " pod="openstack/dnsmasq-dns-787cbbf4dc-r2lgm" Oct 14 13:36:12.892684 master-2 kubenswrapper[4762]: I1014 13:36:12.892512 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/3b8400ee-6062-47fd-b45c-bc82fbdc92cd-ovsdbserver-nb\") pod \"dnsmasq-dns-787cbbf4dc-r2lgm\" (UID: \"3b8400ee-6062-47fd-b45c-bc82fbdc92cd\") " pod="openstack/dnsmasq-dns-787cbbf4dc-r2lgm" Oct 14 13:36:12.892684 master-2 kubenswrapper[4762]: I1014 13:36:12.892615 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b8400ee-6062-47fd-b45c-bc82fbdc92cd-config\") pod \"dnsmasq-dns-787cbbf4dc-r2lgm\" (UID: \"3b8400ee-6062-47fd-b45c-bc82fbdc92cd\") " pod="openstack/dnsmasq-dns-787cbbf4dc-r2lgm" Oct 14 13:36:12.892684 master-2 kubenswrapper[4762]: I1014 13:36:12.892671 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b8400ee-6062-47fd-b45c-bc82fbdc92cd-ovsdbserver-sb\") pod \"dnsmasq-dns-787cbbf4dc-r2lgm\" (UID: \"3b8400ee-6062-47fd-b45c-bc82fbdc92cd\") " pod="openstack/dnsmasq-dns-787cbbf4dc-r2lgm" Oct 14 13:36:12.892800 master-2 kubenswrapper[4762]: I1014 13:36:12.892749 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3b8400ee-6062-47fd-b45c-bc82fbdc92cd-dns-swift-storage-0\") pod \"dnsmasq-dns-787cbbf4dc-r2lgm\" (UID: \"3b8400ee-6062-47fd-b45c-bc82fbdc92cd\") " pod="openstack/dnsmasq-dns-787cbbf4dc-r2lgm" Oct 14 13:36:12.892835 master-2 kubenswrapper[4762]: I1014 13:36:12.892801 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b8400ee-6062-47fd-b45c-bc82fbdc92cd-dns-svc\") pod \"dnsmasq-dns-787cbbf4dc-r2lgm\" (UID: \"3b8400ee-6062-47fd-b45c-bc82fbdc92cd\") " pod="openstack/dnsmasq-dns-787cbbf4dc-r2lgm" Oct 14 13:36:12.893758 master-2 kubenswrapper[4762]: I1014 13:36:12.893438 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b8400ee-6062-47fd-b45c-bc82fbdc92cd-ovsdbserver-nb\") pod \"dnsmasq-dns-787cbbf4dc-r2lgm\" (UID: \"3b8400ee-6062-47fd-b45c-bc82fbdc92cd\") " pod="openstack/dnsmasq-dns-787cbbf4dc-r2lgm" Oct 14 13:36:12.893758 master-2 kubenswrapper[4762]: I1014 13:36:12.893709 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b8400ee-6062-47fd-b45c-bc82fbdc92cd-ovsdbserver-sb\") pod \"dnsmasq-dns-787cbbf4dc-r2lgm\" (UID: \"3b8400ee-6062-47fd-b45c-bc82fbdc92cd\") " pod="openstack/dnsmasq-dns-787cbbf4dc-r2lgm" Oct 14 13:36:12.893994 master-2 kubenswrapper[4762]: I1014 13:36:12.893966 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b8400ee-6062-47fd-b45c-bc82fbdc92cd-config\") pod \"dnsmasq-dns-787cbbf4dc-r2lgm\" (UID: \"3b8400ee-6062-47fd-b45c-bc82fbdc92cd\") " pod="openstack/dnsmasq-dns-787cbbf4dc-r2lgm" Oct 14 13:36:12.894591 master-2 kubenswrapper[4762]: I1014 13:36:12.894532 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3b8400ee-6062-47fd-b45c-bc82fbdc92cd-dns-swift-storage-0\") pod \"dnsmasq-dns-787cbbf4dc-r2lgm\" (UID: \"3b8400ee-6062-47fd-b45c-bc82fbdc92cd\") " pod="openstack/dnsmasq-dns-787cbbf4dc-r2lgm" Oct 14 13:36:12.896234 master-2 kubenswrapper[4762]: I1014 13:36:12.896189 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/3b8400ee-6062-47fd-b45c-bc82fbdc92cd-dns-svc\") pod \"dnsmasq-dns-787cbbf4dc-r2lgm\" (UID: \"3b8400ee-6062-47fd-b45c-bc82fbdc92cd\") " pod="openstack/dnsmasq-dns-787cbbf4dc-r2lgm" Oct 14 13:36:12.926677 master-2 kubenswrapper[4762]: I1014 13:36:12.926616 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8z98\" (UniqueName: \"kubernetes.io/projected/3b8400ee-6062-47fd-b45c-bc82fbdc92cd-kube-api-access-k8z98\") pod \"dnsmasq-dns-787cbbf4dc-r2lgm\" (UID: \"3b8400ee-6062-47fd-b45c-bc82fbdc92cd\") " pod="openstack/dnsmasq-dns-787cbbf4dc-r2lgm" Oct 14 13:36:12.990020 master-2 kubenswrapper[4762]: I1014 13:36:12.989966 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-787cbbf4dc-r2lgm" Oct 14 13:36:14.325785 master-2 kubenswrapper[4762]: I1014 13:36:14.325633 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-bkjjv" Oct 14 13:36:14.517076 master-2 kubenswrapper[4762]: I1014 13:36:14.516945 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf99a3bf-4a2a-4206-9538-24b47ebd5605-config-data\") pod \"cf99a3bf-4a2a-4206-9538-24b47ebd5605\" (UID: \"cf99a3bf-4a2a-4206-9538-24b47ebd5605\") " Oct 14 13:36:14.517076 master-2 kubenswrapper[4762]: I1014 13:36:14.517025 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckncp\" (UniqueName: \"kubernetes.io/projected/cf99a3bf-4a2a-4206-9538-24b47ebd5605-kube-api-access-ckncp\") pod \"cf99a3bf-4a2a-4206-9538-24b47ebd5605\" (UID: \"cf99a3bf-4a2a-4206-9538-24b47ebd5605\") " Oct 14 13:36:14.517076 master-2 kubenswrapper[4762]: I1014 13:36:14.517050 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf99a3bf-4a2a-4206-9538-24b47ebd5605-combined-ca-bundle\") pod \"cf99a3bf-4a2a-4206-9538-24b47ebd5605\" (UID: \"cf99a3bf-4a2a-4206-9538-24b47ebd5605\") " Oct 14 13:36:14.517385 master-2 kubenswrapper[4762]: I1014 13:36:14.517084 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cf99a3bf-4a2a-4206-9538-24b47ebd5605-credential-keys\") pod \"cf99a3bf-4a2a-4206-9538-24b47ebd5605\" (UID: \"cf99a3bf-4a2a-4206-9538-24b47ebd5605\") " Oct 14 13:36:14.517385 master-2 kubenswrapper[4762]: I1014 13:36:14.517130 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cf99a3bf-4a2a-4206-9538-24b47ebd5605-fernet-keys\") pod \"cf99a3bf-4a2a-4206-9538-24b47ebd5605\" (UID: \"cf99a3bf-4a2a-4206-9538-24b47ebd5605\") " Oct 14 13:36:14.517385 master-2 kubenswrapper[4762]: I1014 13:36:14.517185 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf99a3bf-4a2a-4206-9538-24b47ebd5605-scripts\") pod \"cf99a3bf-4a2a-4206-9538-24b47ebd5605\" (UID: \"cf99a3bf-4a2a-4206-9538-24b47ebd5605\") " Oct 14 13:36:14.519995 master-2 kubenswrapper[4762]: I1014 13:36:14.519948 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf99a3bf-4a2a-4206-9538-24b47ebd5605-scripts" (OuterVolumeSpecName: "scripts") pod "cf99a3bf-4a2a-4206-9538-24b47ebd5605" (UID: "cf99a3bf-4a2a-4206-9538-24b47ebd5605"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:36:14.520452 master-2 kubenswrapper[4762]: I1014 13:36:14.520422 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf99a3bf-4a2a-4206-9538-24b47ebd5605-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "cf99a3bf-4a2a-4206-9538-24b47ebd5605" (UID: "cf99a3bf-4a2a-4206-9538-24b47ebd5605"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:36:14.520854 master-2 kubenswrapper[4762]: I1014 13:36:14.520822 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf99a3bf-4a2a-4206-9538-24b47ebd5605-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "cf99a3bf-4a2a-4206-9538-24b47ebd5605" (UID: "cf99a3bf-4a2a-4206-9538-24b47ebd5605"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:36:14.530793 master-2 kubenswrapper[4762]: I1014 13:36:14.530750 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bkjjv" event={"ID":"cf99a3bf-4a2a-4206-9538-24b47ebd5605","Type":"ContainerDied","Data":"4184caaabbc38f11bb066332ddf4df00fa4f13eb28de3734d428aae3ca2ede64"} Oct 14 13:36:14.530793 master-2 kubenswrapper[4762]: I1014 13:36:14.530794 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4184caaabbc38f11bb066332ddf4df00fa4f13eb28de3734d428aae3ca2ede64" Oct 14 13:36:14.530793 master-2 kubenswrapper[4762]: I1014 13:36:14.530799 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-bkjjv" Oct 14 13:36:14.535412 master-2 kubenswrapper[4762]: I1014 13:36:14.535360 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf99a3bf-4a2a-4206-9538-24b47ebd5605-kube-api-access-ckncp" (OuterVolumeSpecName: "kube-api-access-ckncp") pod "cf99a3bf-4a2a-4206-9538-24b47ebd5605" (UID: "cf99a3bf-4a2a-4206-9538-24b47ebd5605"). InnerVolumeSpecName "kube-api-access-ckncp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:36:14.536168 master-2 kubenswrapper[4762]: I1014 13:36:14.536128 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf99a3bf-4a2a-4206-9538-24b47ebd5605-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf99a3bf-4a2a-4206-9538-24b47ebd5605" (UID: "cf99a3bf-4a2a-4206-9538-24b47ebd5605"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:36:14.538858 master-2 kubenswrapper[4762]: I1014 13:36:14.538825 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf99a3bf-4a2a-4206-9538-24b47ebd5605-config-data" (OuterVolumeSpecName: "config-data") pod "cf99a3bf-4a2a-4206-9538-24b47ebd5605" (UID: "cf99a3bf-4a2a-4206-9538-24b47ebd5605"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:36:14.619766 master-2 kubenswrapper[4762]: I1014 13:36:14.619707 4762 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cf99a3bf-4a2a-4206-9538-24b47ebd5605-credential-keys\") on node \"master-2\" DevicePath \"\"" Oct 14 13:36:14.619766 master-2 kubenswrapper[4762]: I1014 13:36:14.619761 4762 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cf99a3bf-4a2a-4206-9538-24b47ebd5605-fernet-keys\") on node \"master-2\" DevicePath \"\"" Oct 14 13:36:14.619766 master-2 kubenswrapper[4762]: I1014 13:36:14.619775 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf99a3bf-4a2a-4206-9538-24b47ebd5605-scripts\") on node \"master-2\" DevicePath \"\"" Oct 14 13:36:14.620100 master-2 kubenswrapper[4762]: I1014 13:36:14.619787 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf99a3bf-4a2a-4206-9538-24b47ebd5605-config-data\") on node \"master-2\" DevicePath \"\"" Oct 14 13:36:14.620100 master-2 kubenswrapper[4762]: I1014 13:36:14.619800 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckncp\" (UniqueName: \"kubernetes.io/projected/cf99a3bf-4a2a-4206-9538-24b47ebd5605-kube-api-access-ckncp\") on node \"master-2\" DevicePath \"\"" Oct 14 13:36:14.620100 master-2 kubenswrapper[4762]: I1014 13:36:14.619817 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf99a3bf-4a2a-4206-9538-24b47ebd5605-combined-ca-bundle\") on node \"master-2\" DevicePath \"\"" Oct 14 13:36:15.509176 master-2 kubenswrapper[4762]: I1014 13:36:15.509052 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5755976884-tp9vz"] Oct 14 13:36:15.509701 master-2 kubenswrapper[4762]: E1014 13:36:15.509450 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf99a3bf-4a2a-4206-9538-24b47ebd5605" containerName="keystone-bootstrap" Oct 14 13:36:15.509701 master-2 kubenswrapper[4762]: I1014 13:36:15.509469 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf99a3bf-4a2a-4206-9538-24b47ebd5605" containerName="keystone-bootstrap" Oct 14 13:36:15.509774 master-2 kubenswrapper[4762]: I1014 13:36:15.509704 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf99a3bf-4a2a-4206-9538-24b47ebd5605" containerName="keystone-bootstrap" Oct 14 13:36:15.510703 master-2 kubenswrapper[4762]: I1014 13:36:15.510678 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5755976884-tp9vz" Oct 14 13:36:15.514632 master-2 kubenswrapper[4762]: I1014 13:36:15.514548 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 14 13:36:15.515249 master-2 kubenswrapper[4762]: I1014 13:36:15.515212 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 14 13:36:15.515385 master-2 kubenswrapper[4762]: I1014 13:36:15.515337 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 14 13:36:15.515493 master-2 kubenswrapper[4762]: I1014 13:36:15.515471 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 14 13:36:15.520220 master-2 kubenswrapper[4762]: I1014 13:36:15.516935 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 14 13:36:15.544223 master-2 kubenswrapper[4762]: I1014 13:36:15.544116 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5755976884-tp9vz"] Oct 14 13:36:15.635188 master-2 kubenswrapper[4762]: I1014 13:36:15.634869 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d59a3b2-1bbf-4e5b-985a-3dd42d853be4-scripts\") pod \"keystone-5755976884-tp9vz\" (UID: \"2d59a3b2-1bbf-4e5b-985a-3dd42d853be4\") " pod="openstack/keystone-5755976884-tp9vz" Oct 14 13:36:15.635469 master-2 kubenswrapper[4762]: I1014 13:36:15.635228 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d59a3b2-1bbf-4e5b-985a-3dd42d853be4-config-data\") pod \"keystone-5755976884-tp9vz\" (UID: \"2d59a3b2-1bbf-4e5b-985a-3dd42d853be4\") " pod="openstack/keystone-5755976884-tp9vz" Oct 14 13:36:15.635469 master-2 kubenswrapper[4762]: I1014 13:36:15.635281 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d59a3b2-1bbf-4e5b-985a-3dd42d853be4-combined-ca-bundle\") pod \"keystone-5755976884-tp9vz\" (UID: \"2d59a3b2-1bbf-4e5b-985a-3dd42d853be4\") " pod="openstack/keystone-5755976884-tp9vz" Oct 14 13:36:15.635469 master-2 kubenswrapper[4762]: I1014 13:36:15.635309 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d59a3b2-1bbf-4e5b-985a-3dd42d853be4-public-tls-certs\") pod \"keystone-5755976884-tp9vz\" (UID: \"2d59a3b2-1bbf-4e5b-985a-3dd42d853be4\") " pod="openstack/keystone-5755976884-tp9vz" Oct 14 13:36:15.635469 master-2 kubenswrapper[4762]: I1014 13:36:15.635385 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2d59a3b2-1bbf-4e5b-985a-3dd42d853be4-fernet-keys\") pod \"keystone-5755976884-tp9vz\" (UID: \"2d59a3b2-1bbf-4e5b-985a-3dd42d853be4\") " pod="openstack/keystone-5755976884-tp9vz" Oct 14 13:36:15.635469 master-2 kubenswrapper[4762]: I1014 13:36:15.635439 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcj2m\" (UniqueName: \"kubernetes.io/projected/2d59a3b2-1bbf-4e5b-985a-3dd42d853be4-kube-api-access-lcj2m\") pod \"keystone-5755976884-tp9vz\" (UID: \"2d59a3b2-1bbf-4e5b-985a-3dd42d853be4\") " 
pod="openstack/keystone-5755976884-tp9vz" Oct 14 13:36:15.635703 master-2 kubenswrapper[4762]: I1014 13:36:15.635477 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d59a3b2-1bbf-4e5b-985a-3dd42d853be4-internal-tls-certs\") pod \"keystone-5755976884-tp9vz\" (UID: \"2d59a3b2-1bbf-4e5b-985a-3dd42d853be4\") " pod="openstack/keystone-5755976884-tp9vz" Oct 14 13:36:15.635829 master-2 kubenswrapper[4762]: I1014 13:36:15.635753 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2d59a3b2-1bbf-4e5b-985a-3dd42d853be4-credential-keys\") pod \"keystone-5755976884-tp9vz\" (UID: \"2d59a3b2-1bbf-4e5b-985a-3dd42d853be4\") " pod="openstack/keystone-5755976884-tp9vz" Oct 14 13:36:15.738464 master-2 kubenswrapper[4762]: I1014 13:36:15.738162 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d59a3b2-1bbf-4e5b-985a-3dd42d853be4-combined-ca-bundle\") pod \"keystone-5755976884-tp9vz\" (UID: \"2d59a3b2-1bbf-4e5b-985a-3dd42d853be4\") " pod="openstack/keystone-5755976884-tp9vz" Oct 14 13:36:15.739297 master-2 kubenswrapper[4762]: I1014 13:36:15.739251 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d59a3b2-1bbf-4e5b-985a-3dd42d853be4-public-tls-certs\") pod \"keystone-5755976884-tp9vz\" (UID: \"2d59a3b2-1bbf-4e5b-985a-3dd42d853be4\") " pod="openstack/keystone-5755976884-tp9vz" Oct 14 13:36:15.739430 master-2 kubenswrapper[4762]: I1014 13:36:15.739313 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2d59a3b2-1bbf-4e5b-985a-3dd42d853be4-fernet-keys\") pod \"keystone-5755976884-tp9vz\" (UID: \"2d59a3b2-1bbf-4e5b-985a-3dd42d853be4\") " pod="openstack/keystone-5755976884-tp9vz" Oct 14 13:36:15.739430 master-2 kubenswrapper[4762]: I1014 13:36:15.739384 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcj2m\" (UniqueName: \"kubernetes.io/projected/2d59a3b2-1bbf-4e5b-985a-3dd42d853be4-kube-api-access-lcj2m\") pod \"keystone-5755976884-tp9vz\" (UID: \"2d59a3b2-1bbf-4e5b-985a-3dd42d853be4\") " pod="openstack/keystone-5755976884-tp9vz" Oct 14 13:36:15.739935 master-2 kubenswrapper[4762]: I1014 13:36:15.739442 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d59a3b2-1bbf-4e5b-985a-3dd42d853be4-internal-tls-certs\") pod \"keystone-5755976884-tp9vz\" (UID: \"2d59a3b2-1bbf-4e5b-985a-3dd42d853be4\") " pod="openstack/keystone-5755976884-tp9vz" Oct 14 13:36:15.739935 master-2 kubenswrapper[4762]: I1014 13:36:15.739520 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2d59a3b2-1bbf-4e5b-985a-3dd42d853be4-credential-keys\") pod \"keystone-5755976884-tp9vz\" (UID: \"2d59a3b2-1bbf-4e5b-985a-3dd42d853be4\") " pod="openstack/keystone-5755976884-tp9vz" Oct 14 13:36:15.739935 master-2 kubenswrapper[4762]: I1014 13:36:15.739558 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d59a3b2-1bbf-4e5b-985a-3dd42d853be4-scripts\") pod \"keystone-5755976884-tp9vz\" (UID: 
\"2d59a3b2-1bbf-4e5b-985a-3dd42d853be4\") " pod="openstack/keystone-5755976884-tp9vz" Oct 14 13:36:15.739935 master-2 kubenswrapper[4762]: I1014 13:36:15.739592 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d59a3b2-1bbf-4e5b-985a-3dd42d853be4-config-data\") pod \"keystone-5755976884-tp9vz\" (UID: \"2d59a3b2-1bbf-4e5b-985a-3dd42d853be4\") " pod="openstack/keystone-5755976884-tp9vz" Oct 14 13:36:15.743767 master-2 kubenswrapper[4762]: I1014 13:36:15.743721 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2d59a3b2-1bbf-4e5b-985a-3dd42d853be4-credential-keys\") pod \"keystone-5755976884-tp9vz\" (UID: \"2d59a3b2-1bbf-4e5b-985a-3dd42d853be4\") " pod="openstack/keystone-5755976884-tp9vz" Oct 14 13:36:15.744779 master-2 kubenswrapper[4762]: I1014 13:36:15.744732 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d59a3b2-1bbf-4e5b-985a-3dd42d853be4-public-tls-certs\") pod \"keystone-5755976884-tp9vz\" (UID: \"2d59a3b2-1bbf-4e5b-985a-3dd42d853be4\") " pod="openstack/keystone-5755976884-tp9vz" Oct 14 13:36:15.745357 master-2 kubenswrapper[4762]: I1014 13:36:15.745238 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d59a3b2-1bbf-4e5b-985a-3dd42d853be4-combined-ca-bundle\") pod \"keystone-5755976884-tp9vz\" (UID: \"2d59a3b2-1bbf-4e5b-985a-3dd42d853be4\") " pod="openstack/keystone-5755976884-tp9vz" Oct 14 13:36:15.745578 master-2 kubenswrapper[4762]: I1014 13:36:15.745543 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d59a3b2-1bbf-4e5b-985a-3dd42d853be4-config-data\") pod \"keystone-5755976884-tp9vz\" (UID: \"2d59a3b2-1bbf-4e5b-985a-3dd42d853be4\") " pod="openstack/keystone-5755976884-tp9vz" Oct 14 13:36:15.746465 master-2 kubenswrapper[4762]: I1014 13:36:15.746426 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d59a3b2-1bbf-4e5b-985a-3dd42d853be4-scripts\") pod \"keystone-5755976884-tp9vz\" (UID: \"2d59a3b2-1bbf-4e5b-985a-3dd42d853be4\") " pod="openstack/keystone-5755976884-tp9vz" Oct 14 13:36:15.748732 master-2 kubenswrapper[4762]: I1014 13:36:15.748671 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2d59a3b2-1bbf-4e5b-985a-3dd42d853be4-fernet-keys\") pod \"keystone-5755976884-tp9vz\" (UID: \"2d59a3b2-1bbf-4e5b-985a-3dd42d853be4\") " pod="openstack/keystone-5755976884-tp9vz" Oct 14 13:36:15.751846 master-2 kubenswrapper[4762]: I1014 13:36:15.751789 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d59a3b2-1bbf-4e5b-985a-3dd42d853be4-internal-tls-certs\") pod \"keystone-5755976884-tp9vz\" (UID: \"2d59a3b2-1bbf-4e5b-985a-3dd42d853be4\") " pod="openstack/keystone-5755976884-tp9vz" Oct 14 13:36:15.877911 master-2 kubenswrapper[4762]: I1014 13:36:15.877732 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcj2m\" (UniqueName: \"kubernetes.io/projected/2d59a3b2-1bbf-4e5b-985a-3dd42d853be4-kube-api-access-lcj2m\") pod \"keystone-5755976884-tp9vz\" (UID: \"2d59a3b2-1bbf-4e5b-985a-3dd42d853be4\") " 
pod="openstack/keystone-5755976884-tp9vz" Oct 14 13:36:16.136201 master-2 kubenswrapper[4762]: I1014 13:36:16.135857 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5755976884-tp9vz" Oct 14 13:36:16.380690 master-2 kubenswrapper[4762]: I1014 13:36:16.380650 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-787cbbf4dc-r2lgm"] Oct 14 13:36:16.417708 master-2 kubenswrapper[4762]: W1014 13:36:16.417662 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b8400ee_6062_47fd_b45c_bc82fbdc92cd.slice/crio-fa5aa75e15b64a833142483e0933d218d662a135a96029d33b71d535304168e5 WatchSource:0}: Error finding container fa5aa75e15b64a833142483e0933d218d662a135a96029d33b71d535304168e5: Status 404 returned error can't find the container with id fa5aa75e15b64a833142483e0933d218d662a135a96029d33b71d535304168e5 Oct 14 13:36:16.559208 master-2 kubenswrapper[4762]: I1014 13:36:16.558828 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5755976884-tp9vz"] Oct 14 13:36:16.559950 master-2 kubenswrapper[4762]: I1014 13:36:16.559649 4762 generic.go:334] "Generic (PLEG): container finished" podID="799545f6-d5b9-428c-96a7-96aa931ed940" containerID="280aab5216733f8c50d357878f94bee2a7447933626bc9870ea4aa1c9fac74a5" exitCode=0 Oct 14 13:36:16.559950 master-2 kubenswrapper[4762]: I1014 13:36:16.559711 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-g5vgq" event={"ID":"799545f6-d5b9-428c-96a7-96aa931ed940","Type":"ContainerDied","Data":"280aab5216733f8c50d357878f94bee2a7447933626bc9870ea4aa1c9fac74a5"} Oct 14 13:36:16.561499 master-2 kubenswrapper[4762]: W1014 13:36:16.561440 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d59a3b2_1bbf_4e5b_985a_3dd42d853be4.slice/crio-c8b3c5469815daef3c9f4a45c913034583457aa55a708e05b85bd936a0f5c2fb WatchSource:0}: Error finding container c8b3c5469815daef3c9f4a45c913034583457aa55a708e05b85bd936a0f5c2fb: Status 404 returned error can't find the container with id c8b3c5469815daef3c9f4a45c913034583457aa55a708e05b85bd936a0f5c2fb Oct 14 13:36:16.562176 master-2 kubenswrapper[4762]: I1014 13:36:16.562063 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-787cbbf4dc-r2lgm" event={"ID":"3b8400ee-6062-47fd-b45c-bc82fbdc92cd","Type":"ContainerStarted","Data":"fa5aa75e15b64a833142483e0933d218d662a135a96029d33b71d535304168e5"} Oct 14 13:36:16.567893 master-2 kubenswrapper[4762]: I1014 13:36:16.567856 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"520f5fd5-5c4b-4781-bc3f-363b18244e36","Type":"ContainerStarted","Data":"bdffe4089ba2fb584e5e5776f0e72566742d32fe62bdce48e26910994d6e6a19"} Oct 14 13:36:16.568074 master-2 kubenswrapper[4762]: I1014 13:36:16.568040 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="520f5fd5-5c4b-4781-bc3f-363b18244e36" containerName="ceilometer-central-agent" containerID="cri-o://638bce9e30d01056c9bf0af37a0bd392f3f2675ef3ca7ea5aed5ae76cf0225e4" gracePeriod=30 Oct 14 13:36:16.568127 master-2 kubenswrapper[4762]: I1014 13:36:16.568100 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="520f5fd5-5c4b-4781-bc3f-363b18244e36" containerName="sg-core" 
containerID="cri-o://bbfd93e2ee92e007c9a4e57ddc4c0b70020d7466986d1c3c5670d6b2d4304fb8" gracePeriod=30 Oct 14 13:36:16.568127 master-2 kubenswrapper[4762]: I1014 13:36:16.568112 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 14 13:36:16.568223 master-2 kubenswrapper[4762]: I1014 13:36:16.568193 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="520f5fd5-5c4b-4781-bc3f-363b18244e36" containerName="proxy-httpd" containerID="cri-o://bdffe4089ba2fb584e5e5776f0e72566742d32fe62bdce48e26910994d6e6a19" gracePeriod=30 Oct 14 13:36:16.568284 master-2 kubenswrapper[4762]: I1014 13:36:16.568235 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="520f5fd5-5c4b-4781-bc3f-363b18244e36" containerName="ceilometer-notification-agent" containerID="cri-o://15c265e51aed24697243505daa7e2d8b6b252c7cf9d56345c510ab502e3cda27" gracePeriod=30 Oct 14 13:36:16.629381 master-2 kubenswrapper[4762]: I1014 13:36:16.629283 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.830762583 podStartE2EDuration="25.629261047s" podCreationTimestamp="2025-10-14 13:35:51 +0000 UTC" firstStartedPulling="2025-10-14 13:35:52.185137314 +0000 UTC m=+1781.429296473" lastFinishedPulling="2025-10-14 13:36:15.983635778 +0000 UTC m=+1805.227794937" observedRunningTime="2025-10-14 13:36:16.621453073 +0000 UTC m=+1805.865612252" watchObservedRunningTime="2025-10-14 13:36:16.629261047 +0000 UTC m=+1805.873420216" Oct 14 13:36:17.579472 master-2 kubenswrapper[4762]: I1014 13:36:17.579313 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5755976884-tp9vz" event={"ID":"2d59a3b2-1bbf-4e5b-985a-3dd42d853be4","Type":"ContainerStarted","Data":"50e796442d69f630d56a73817a7da7cf3963f858d90ee2e36c4d63ee88cc5898"} Oct 14 13:36:17.579472 master-2 kubenswrapper[4762]: I1014 13:36:17.579391 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5755976884-tp9vz" event={"ID":"2d59a3b2-1bbf-4e5b-985a-3dd42d853be4","Type":"ContainerStarted","Data":"c8b3c5469815daef3c9f4a45c913034583457aa55a708e05b85bd936a0f5c2fb"} Oct 14 13:36:17.580206 master-2 kubenswrapper[4762]: I1014 13:36:17.579479 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-5755976884-tp9vz" Oct 14 13:36:17.582557 master-2 kubenswrapper[4762]: I1014 13:36:17.582512 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-g5vgq" event={"ID":"799545f6-d5b9-428c-96a7-96aa931ed940","Type":"ContainerStarted","Data":"a74426201541067f0f49cc6a155bebc5ae88678dbd1e141645381389a47e7563"} Oct 14 13:36:17.584948 master-2 kubenswrapper[4762]: I1014 13:36:17.584903 4762 generic.go:334] "Generic (PLEG): container finished" podID="3b8400ee-6062-47fd-b45c-bc82fbdc92cd" containerID="7f52e6b67cea1c91bd8d3f43ce53c9e240c33f16b685a14076364cdf4dca23da" exitCode=0 Oct 14 13:36:17.585194 master-2 kubenswrapper[4762]: I1014 13:36:17.584981 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-787cbbf4dc-r2lgm" event={"ID":"3b8400ee-6062-47fd-b45c-bc82fbdc92cd","Type":"ContainerDied","Data":"7f52e6b67cea1c91bd8d3f43ce53c9e240c33f16b685a14076364cdf4dca23da"} Oct 14 13:36:17.595668 master-2 kubenswrapper[4762]: I1014 13:36:17.595602 4762 generic.go:334] "Generic (PLEG): container finished" podID="520f5fd5-5c4b-4781-bc3f-363b18244e36" 
containerID="bdffe4089ba2fb584e5e5776f0e72566742d32fe62bdce48e26910994d6e6a19" exitCode=0 Oct 14 13:36:17.595823 master-2 kubenswrapper[4762]: I1014 13:36:17.595799 4762 generic.go:334] "Generic (PLEG): container finished" podID="520f5fd5-5c4b-4781-bc3f-363b18244e36" containerID="bbfd93e2ee92e007c9a4e57ddc4c0b70020d7466986d1c3c5670d6b2d4304fb8" exitCode=2 Oct 14 13:36:17.595944 master-2 kubenswrapper[4762]: I1014 13:36:17.595922 4762 generic.go:334] "Generic (PLEG): container finished" podID="520f5fd5-5c4b-4781-bc3f-363b18244e36" containerID="638bce9e30d01056c9bf0af37a0bd392f3f2675ef3ca7ea5aed5ae76cf0225e4" exitCode=0 Oct 14 13:36:17.596054 master-2 kubenswrapper[4762]: I1014 13:36:17.595734 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"520f5fd5-5c4b-4781-bc3f-363b18244e36","Type":"ContainerDied","Data":"bdffe4089ba2fb584e5e5776f0e72566742d32fe62bdce48e26910994d6e6a19"} Oct 14 13:36:17.596245 master-2 kubenswrapper[4762]: I1014 13:36:17.596219 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"520f5fd5-5c4b-4781-bc3f-363b18244e36","Type":"ContainerDied","Data":"bbfd93e2ee92e007c9a4e57ddc4c0b70020d7466986d1c3c5670d6b2d4304fb8"} Oct 14 13:36:17.596370 master-2 kubenswrapper[4762]: I1014 13:36:17.596349 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"520f5fd5-5c4b-4781-bc3f-363b18244e36","Type":"ContainerDied","Data":"638bce9e30d01056c9bf0af37a0bd392f3f2675ef3ca7ea5aed5ae76cf0225e4"} Oct 14 13:36:17.627449 master-2 kubenswrapper[4762]: I1014 13:36:17.627367 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-5755976884-tp9vz" podStartSLOduration=2.627339932 podStartE2EDuration="2.627339932s" podCreationTimestamp="2025-10-14 13:36:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:36:17.622575194 +0000 UTC m=+1806.866734423" watchObservedRunningTime="2025-10-14 13:36:17.627339932 +0000 UTC m=+1806.871499121" Oct 14 13:36:17.683984 master-2 kubenswrapper[4762]: I1014 13:36:17.683891 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-db-sync-g5vgq" podStartSLOduration=2.983857103 podStartE2EDuration="11.683873595s" podCreationTimestamp="2025-10-14 13:36:06 +0000 UTC" firstStartedPulling="2025-10-14 13:36:07.249953816 +0000 UTC m=+1796.494112975" lastFinishedPulling="2025-10-14 13:36:15.949970268 +0000 UTC m=+1805.194129467" observedRunningTime="2025-10-14 13:36:17.67952679 +0000 UTC m=+1806.923685979" watchObservedRunningTime="2025-10-14 13:36:17.683873595 +0000 UTC m=+1806.928032744" Oct 14 13:36:18.606763 master-2 kubenswrapper[4762]: I1014 13:36:18.606626 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-787cbbf4dc-r2lgm" event={"ID":"3b8400ee-6062-47fd-b45c-bc82fbdc92cd","Type":"ContainerStarted","Data":"e0f1acf8e74c67b52d78ce1b9f7d41c464bee5c142ef00e5bb5cef8bcd42d9a5"} Oct 14 13:36:18.607793 master-2 kubenswrapper[4762]: I1014 13:36:18.607217 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-787cbbf4dc-r2lgm" Oct 14 13:36:18.647600 master-2 kubenswrapper[4762]: I1014 13:36:18.647536 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-787cbbf4dc-r2lgm" podStartSLOduration=6.647519678 podStartE2EDuration="6.647519678s" 
podCreationTimestamp="2025-10-14 13:36:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:36:18.644089581 +0000 UTC m=+1807.888248740" watchObservedRunningTime="2025-10-14 13:36:18.647519678 +0000 UTC m=+1807.891678837" Oct 14 13:36:19.618263 master-2 kubenswrapper[4762]: I1014 13:36:19.617273 4762 generic.go:334] "Generic (PLEG): container finished" podID="520f5fd5-5c4b-4781-bc3f-363b18244e36" containerID="15c265e51aed24697243505daa7e2d8b6b252c7cf9d56345c510ab502e3cda27" exitCode=0 Oct 14 13:36:19.618263 master-2 kubenswrapper[4762]: I1014 13:36:19.617369 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"520f5fd5-5c4b-4781-bc3f-363b18244e36","Type":"ContainerDied","Data":"15c265e51aed24697243505daa7e2d8b6b252c7cf9d56345c510ab502e3cda27"} Oct 14 13:36:19.991590 master-2 kubenswrapper[4762]: I1014 13:36:19.991542 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:36:20.119243 master-2 kubenswrapper[4762]: I1014 13:36:20.119136 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/520f5fd5-5c4b-4781-bc3f-363b18244e36-run-httpd\") pod \"520f5fd5-5c4b-4781-bc3f-363b18244e36\" (UID: \"520f5fd5-5c4b-4781-bc3f-363b18244e36\") " Oct 14 13:36:20.119546 master-2 kubenswrapper[4762]: I1014 13:36:20.119504 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/520f5fd5-5c4b-4781-bc3f-363b18244e36-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "520f5fd5-5c4b-4781-bc3f-363b18244e36" (UID: "520f5fd5-5c4b-4781-bc3f-363b18244e36"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:36:20.119622 master-2 kubenswrapper[4762]: I1014 13:36:20.119606 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/520f5fd5-5c4b-4781-bc3f-363b18244e36-sg-core-conf-yaml\") pod \"520f5fd5-5c4b-4781-bc3f-363b18244e36\" (UID: \"520f5fd5-5c4b-4781-bc3f-363b18244e36\") " Oct 14 13:36:20.119713 master-2 kubenswrapper[4762]: I1014 13:36:20.119702 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/520f5fd5-5c4b-4781-bc3f-363b18244e36-config-data\") pod \"520f5fd5-5c4b-4781-bc3f-363b18244e36\" (UID: \"520f5fd5-5c4b-4781-bc3f-363b18244e36\") " Oct 14 13:36:20.119866 master-2 kubenswrapper[4762]: I1014 13:36:20.119852 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxwmv\" (UniqueName: \"kubernetes.io/projected/520f5fd5-5c4b-4781-bc3f-363b18244e36-kube-api-access-nxwmv\") pod \"520f5fd5-5c4b-4781-bc3f-363b18244e36\" (UID: \"520f5fd5-5c4b-4781-bc3f-363b18244e36\") " Oct 14 13:36:20.120028 master-2 kubenswrapper[4762]: I1014 13:36:20.120014 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/520f5fd5-5c4b-4781-bc3f-363b18244e36-combined-ca-bundle\") pod \"520f5fd5-5c4b-4781-bc3f-363b18244e36\" (UID: \"520f5fd5-5c4b-4781-bc3f-363b18244e36\") " Oct 14 13:36:20.120175 master-2 kubenswrapper[4762]: I1014 13:36:20.120151 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/520f5fd5-5c4b-4781-bc3f-363b18244e36-log-httpd\") pod \"520f5fd5-5c4b-4781-bc3f-363b18244e36\" (UID: \"520f5fd5-5c4b-4781-bc3f-363b18244e36\") " Oct 14 13:36:20.120278 master-2 kubenswrapper[4762]: I1014 13:36:20.120264 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/520f5fd5-5c4b-4781-bc3f-363b18244e36-scripts\") pod \"520f5fd5-5c4b-4781-bc3f-363b18244e36\" (UID: \"520f5fd5-5c4b-4781-bc3f-363b18244e36\") " Oct 14 13:36:20.120889 master-2 kubenswrapper[4762]: I1014 13:36:20.120875 4762 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/520f5fd5-5c4b-4781-bc3f-363b18244e36-run-httpd\") on node \"master-2\" DevicePath \"\"" Oct 14 13:36:20.120989 master-2 kubenswrapper[4762]: I1014 13:36:20.120941 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/520f5fd5-5c4b-4781-bc3f-363b18244e36-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "520f5fd5-5c4b-4781-bc3f-363b18244e36" (UID: "520f5fd5-5c4b-4781-bc3f-363b18244e36"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:36:20.123638 master-2 kubenswrapper[4762]: I1014 13:36:20.123588 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/520f5fd5-5c4b-4781-bc3f-363b18244e36-kube-api-access-nxwmv" (OuterVolumeSpecName: "kube-api-access-nxwmv") pod "520f5fd5-5c4b-4781-bc3f-363b18244e36" (UID: "520f5fd5-5c4b-4781-bc3f-363b18244e36"). InnerVolumeSpecName "kube-api-access-nxwmv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:36:20.123936 master-2 kubenswrapper[4762]: I1014 13:36:20.123903 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/520f5fd5-5c4b-4781-bc3f-363b18244e36-scripts" (OuterVolumeSpecName: "scripts") pod "520f5fd5-5c4b-4781-bc3f-363b18244e36" (UID: "520f5fd5-5c4b-4781-bc3f-363b18244e36"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:36:20.152350 master-2 kubenswrapper[4762]: I1014 13:36:20.152296 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/520f5fd5-5c4b-4781-bc3f-363b18244e36-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "520f5fd5-5c4b-4781-bc3f-363b18244e36" (UID: "520f5fd5-5c4b-4781-bc3f-363b18244e36"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:36:20.173405 master-2 kubenswrapper[4762]: I1014 13:36:20.173348 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/520f5fd5-5c4b-4781-bc3f-363b18244e36-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "520f5fd5-5c4b-4781-bc3f-363b18244e36" (UID: "520f5fd5-5c4b-4781-bc3f-363b18244e36"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:36:20.201559 master-2 kubenswrapper[4762]: I1014 13:36:20.201496 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/520f5fd5-5c4b-4781-bc3f-363b18244e36-config-data" (OuterVolumeSpecName: "config-data") pod "520f5fd5-5c4b-4781-bc3f-363b18244e36" (UID: "520f5fd5-5c4b-4781-bc3f-363b18244e36"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:36:20.223429 master-2 kubenswrapper[4762]: I1014 13:36:20.223352 4762 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/520f5fd5-5c4b-4781-bc3f-363b18244e36-log-httpd\") on node \"master-2\" DevicePath \"\"" Oct 14 13:36:20.223429 master-2 kubenswrapper[4762]: I1014 13:36:20.223402 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/520f5fd5-5c4b-4781-bc3f-363b18244e36-scripts\") on node \"master-2\" DevicePath \"\"" Oct 14 13:36:20.223429 master-2 kubenswrapper[4762]: I1014 13:36:20.223419 4762 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/520f5fd5-5c4b-4781-bc3f-363b18244e36-sg-core-conf-yaml\") on node \"master-2\" DevicePath \"\"" Oct 14 13:36:20.223429 master-2 kubenswrapper[4762]: I1014 13:36:20.223430 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/520f5fd5-5c4b-4781-bc3f-363b18244e36-config-data\") on node \"master-2\" DevicePath \"\"" Oct 14 13:36:20.223429 master-2 kubenswrapper[4762]: I1014 13:36:20.223443 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxwmv\" (UniqueName: \"kubernetes.io/projected/520f5fd5-5c4b-4781-bc3f-363b18244e36-kube-api-access-nxwmv\") on node \"master-2\" DevicePath \"\"" Oct 14 13:36:20.223429 master-2 kubenswrapper[4762]: I1014 13:36:20.223645 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/520f5fd5-5c4b-4781-bc3f-363b18244e36-combined-ca-bundle\") on node \"master-2\" DevicePath \"\"" Oct 14 13:36:20.267692 master-2 kubenswrapper[4762]: I1014 13:36:20.267631 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-958c54db4-kctqr"] Oct 14 13:36:20.268018 master-2 kubenswrapper[4762]: E1014 13:36:20.267982 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="520f5fd5-5c4b-4781-bc3f-363b18244e36" containerName="proxy-httpd" Oct 14 13:36:20.268018 master-2 kubenswrapper[4762]: I1014 13:36:20.268003 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="520f5fd5-5c4b-4781-bc3f-363b18244e36" containerName="proxy-httpd" Oct 14 13:36:20.268114 master-2 kubenswrapper[4762]: E1014 13:36:20.268062 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="520f5fd5-5c4b-4781-bc3f-363b18244e36" containerName="ceilometer-central-agent" Oct 14 13:36:20.268114 master-2 kubenswrapper[4762]: I1014 13:36:20.268072 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="520f5fd5-5c4b-4781-bc3f-363b18244e36" containerName="ceilometer-central-agent" Oct 14 13:36:20.268197 master-2 kubenswrapper[4762]: E1014 13:36:20.268124 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="520f5fd5-5c4b-4781-bc3f-363b18244e36" containerName="ceilometer-notification-agent" Oct 14 13:36:20.268197 master-2 kubenswrapper[4762]: I1014 13:36:20.268134 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="520f5fd5-5c4b-4781-bc3f-363b18244e36" containerName="ceilometer-notification-agent" Oct 14 13:36:20.268197 master-2 kubenswrapper[4762]: E1014 13:36:20.268169 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="520f5fd5-5c4b-4781-bc3f-363b18244e36" containerName="sg-core" Oct 14 13:36:20.268197 master-2 kubenswrapper[4762]: I1014 13:36:20.268177 4762 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="520f5fd5-5c4b-4781-bc3f-363b18244e36" containerName="sg-core" Oct 14 13:36:20.268343 master-2 kubenswrapper[4762]: I1014 13:36:20.268319 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="520f5fd5-5c4b-4781-bc3f-363b18244e36" containerName="proxy-httpd" Oct 14 13:36:20.268343 master-2 kubenswrapper[4762]: I1014 13:36:20.268342 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="520f5fd5-5c4b-4781-bc3f-363b18244e36" containerName="ceilometer-central-agent" Oct 14 13:36:20.268412 master-2 kubenswrapper[4762]: I1014 13:36:20.268355 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="520f5fd5-5c4b-4781-bc3f-363b18244e36" containerName="sg-core" Oct 14 13:36:20.268412 master-2 kubenswrapper[4762]: I1014 13:36:20.268369 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="520f5fd5-5c4b-4781-bc3f-363b18244e36" containerName="ceilometer-notification-agent" Oct 14 13:36:20.284937 master-2 kubenswrapper[4762]: I1014 13:36:20.284860 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-958c54db4-kctqr" Oct 14 13:36:20.288003 master-2 kubenswrapper[4762]: I1014 13:36:20.287914 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 14 13:36:20.288337 master-2 kubenswrapper[4762]: I1014 13:36:20.288309 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 14 13:36:20.288577 master-2 kubenswrapper[4762]: I1014 13:36:20.288554 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 14 13:36:20.288987 master-2 kubenswrapper[4762]: I1014 13:36:20.288839 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 14 13:36:20.295300 master-2 kubenswrapper[4762]: I1014 13:36:20.295254 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-958c54db4-kctqr"] Oct 14 13:36:20.325376 master-2 kubenswrapper[4762]: I1014 13:36:20.325311 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4df40aa-26fa-4df4-98cd-179aec30dc77-combined-ca-bundle\") pod \"placement-958c54db4-kctqr\" (UID: \"b4df40aa-26fa-4df4-98cd-179aec30dc77\") " pod="openstack/placement-958c54db4-kctqr" Oct 14 13:36:20.325614 master-2 kubenswrapper[4762]: I1014 13:36:20.325422 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4df40aa-26fa-4df4-98cd-179aec30dc77-public-tls-certs\") pod \"placement-958c54db4-kctqr\" (UID: \"b4df40aa-26fa-4df4-98cd-179aec30dc77\") " pod="openstack/placement-958c54db4-kctqr" Oct 14 13:36:20.325614 master-2 kubenswrapper[4762]: I1014 13:36:20.325446 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4df40aa-26fa-4df4-98cd-179aec30dc77-scripts\") pod \"placement-958c54db4-kctqr\" (UID: \"b4df40aa-26fa-4df4-98cd-179aec30dc77\") " pod="openstack/placement-958c54db4-kctqr" Oct 14 13:36:20.325614 master-2 kubenswrapper[4762]: I1014 13:36:20.325471 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4df40aa-26fa-4df4-98cd-179aec30dc77-logs\") pod 
\"placement-958c54db4-kctqr\" (UID: \"b4df40aa-26fa-4df4-98cd-179aec30dc77\") " pod="openstack/placement-958c54db4-kctqr" Oct 14 13:36:20.325614 master-2 kubenswrapper[4762]: I1014 13:36:20.325536 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4df40aa-26fa-4df4-98cd-179aec30dc77-internal-tls-certs\") pod \"placement-958c54db4-kctqr\" (UID: \"b4df40aa-26fa-4df4-98cd-179aec30dc77\") " pod="openstack/placement-958c54db4-kctqr" Oct 14 13:36:20.325614 master-2 kubenswrapper[4762]: I1014 13:36:20.325593 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4df40aa-26fa-4df4-98cd-179aec30dc77-config-data\") pod \"placement-958c54db4-kctqr\" (UID: \"b4df40aa-26fa-4df4-98cd-179aec30dc77\") " pod="openstack/placement-958c54db4-kctqr" Oct 14 13:36:20.325817 master-2 kubenswrapper[4762]: I1014 13:36:20.325618 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6wt9\" (UniqueName: \"kubernetes.io/projected/b4df40aa-26fa-4df4-98cd-179aec30dc77-kube-api-access-k6wt9\") pod \"placement-958c54db4-kctqr\" (UID: \"b4df40aa-26fa-4df4-98cd-179aec30dc77\") " pod="openstack/placement-958c54db4-kctqr" Oct 14 13:36:20.430190 master-2 kubenswrapper[4762]: I1014 13:36:20.427086 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4df40aa-26fa-4df4-98cd-179aec30dc77-config-data\") pod \"placement-958c54db4-kctqr\" (UID: \"b4df40aa-26fa-4df4-98cd-179aec30dc77\") " pod="openstack/placement-958c54db4-kctqr" Oct 14 13:36:20.430190 master-2 kubenswrapper[4762]: I1014 13:36:20.427144 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6wt9\" (UniqueName: \"kubernetes.io/projected/b4df40aa-26fa-4df4-98cd-179aec30dc77-kube-api-access-k6wt9\") pod \"placement-958c54db4-kctqr\" (UID: \"b4df40aa-26fa-4df4-98cd-179aec30dc77\") " pod="openstack/placement-958c54db4-kctqr" Oct 14 13:36:20.430190 master-2 kubenswrapper[4762]: I1014 13:36:20.427182 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4df40aa-26fa-4df4-98cd-179aec30dc77-combined-ca-bundle\") pod \"placement-958c54db4-kctqr\" (UID: \"b4df40aa-26fa-4df4-98cd-179aec30dc77\") " pod="openstack/placement-958c54db4-kctqr" Oct 14 13:36:20.430190 master-2 kubenswrapper[4762]: I1014 13:36:20.427238 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4df40aa-26fa-4df4-98cd-179aec30dc77-scripts\") pod \"placement-958c54db4-kctqr\" (UID: \"b4df40aa-26fa-4df4-98cd-179aec30dc77\") " pod="openstack/placement-958c54db4-kctqr" Oct 14 13:36:20.430190 master-2 kubenswrapper[4762]: I1014 13:36:20.427257 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4df40aa-26fa-4df4-98cd-179aec30dc77-public-tls-certs\") pod \"placement-958c54db4-kctqr\" (UID: \"b4df40aa-26fa-4df4-98cd-179aec30dc77\") " pod="openstack/placement-958c54db4-kctqr" Oct 14 13:36:20.430190 master-2 kubenswrapper[4762]: I1014 13:36:20.427278 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/b4df40aa-26fa-4df4-98cd-179aec30dc77-logs\") pod \"placement-958c54db4-kctqr\" (UID: \"b4df40aa-26fa-4df4-98cd-179aec30dc77\") " pod="openstack/placement-958c54db4-kctqr" Oct 14 13:36:20.430190 master-2 kubenswrapper[4762]: I1014 13:36:20.427333 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4df40aa-26fa-4df4-98cd-179aec30dc77-internal-tls-certs\") pod \"placement-958c54db4-kctqr\" (UID: \"b4df40aa-26fa-4df4-98cd-179aec30dc77\") " pod="openstack/placement-958c54db4-kctqr" Oct 14 13:36:20.430722 master-2 kubenswrapper[4762]: I1014 13:36:20.430292 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4df40aa-26fa-4df4-98cd-179aec30dc77-internal-tls-certs\") pod \"placement-958c54db4-kctqr\" (UID: \"b4df40aa-26fa-4df4-98cd-179aec30dc77\") " pod="openstack/placement-958c54db4-kctqr" Oct 14 13:36:20.432555 master-2 kubenswrapper[4762]: I1014 13:36:20.432504 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4df40aa-26fa-4df4-98cd-179aec30dc77-logs\") pod \"placement-958c54db4-kctqr\" (UID: \"b4df40aa-26fa-4df4-98cd-179aec30dc77\") " pod="openstack/placement-958c54db4-kctqr" Oct 14 13:36:20.433641 master-2 kubenswrapper[4762]: I1014 13:36:20.433576 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4df40aa-26fa-4df4-98cd-179aec30dc77-scripts\") pod \"placement-958c54db4-kctqr\" (UID: \"b4df40aa-26fa-4df4-98cd-179aec30dc77\") " pod="openstack/placement-958c54db4-kctqr" Oct 14 13:36:20.435440 master-2 kubenswrapper[4762]: I1014 13:36:20.435397 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4df40aa-26fa-4df4-98cd-179aec30dc77-config-data\") pod \"placement-958c54db4-kctqr\" (UID: \"b4df40aa-26fa-4df4-98cd-179aec30dc77\") " pod="openstack/placement-958c54db4-kctqr" Oct 14 13:36:20.436744 master-2 kubenswrapper[4762]: I1014 13:36:20.436708 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4df40aa-26fa-4df4-98cd-179aec30dc77-public-tls-certs\") pod \"placement-958c54db4-kctqr\" (UID: \"b4df40aa-26fa-4df4-98cd-179aec30dc77\") " pod="openstack/placement-958c54db4-kctqr" Oct 14 13:36:20.438011 master-2 kubenswrapper[4762]: I1014 13:36:20.437944 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4df40aa-26fa-4df4-98cd-179aec30dc77-combined-ca-bundle\") pod \"placement-958c54db4-kctqr\" (UID: \"b4df40aa-26fa-4df4-98cd-179aec30dc77\") " pod="openstack/placement-958c54db4-kctqr" Oct 14 13:36:20.446637 master-2 kubenswrapper[4762]: I1014 13:36:20.446597 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6wt9\" (UniqueName: \"kubernetes.io/projected/b4df40aa-26fa-4df4-98cd-179aec30dc77-kube-api-access-k6wt9\") pod \"placement-958c54db4-kctqr\" (UID: \"b4df40aa-26fa-4df4-98cd-179aec30dc77\") " pod="openstack/placement-958c54db4-kctqr" Oct 14 13:36:20.612131 master-2 kubenswrapper[4762]: I1014 13:36:20.610772 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-958c54db4-kctqr" Oct 14 13:36:20.635686 master-2 kubenswrapper[4762]: I1014 13:36:20.635551 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:36:20.636144 master-2 kubenswrapper[4762]: I1014 13:36:20.635987 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"520f5fd5-5c4b-4781-bc3f-363b18244e36","Type":"ContainerDied","Data":"34a9eea30f534c6dca6a675c4ae28f647461a219c6d471c0895a97d54fad7d53"} Oct 14 13:36:20.636144 master-2 kubenswrapper[4762]: I1014 13:36:20.636045 4762 scope.go:117] "RemoveContainer" containerID="bdffe4089ba2fb584e5e5776f0e72566742d32fe62bdce48e26910994d6e6a19" Oct 14 13:36:20.729196 master-2 kubenswrapper[4762]: I1014 13:36:20.729135 4762 scope.go:117] "RemoveContainer" containerID="bbfd93e2ee92e007c9a4e57ddc4c0b70020d7466986d1c3c5670d6b2d4304fb8" Oct 14 13:36:20.750959 master-2 kubenswrapper[4762]: I1014 13:36:20.750910 4762 scope.go:117] "RemoveContainer" containerID="15c265e51aed24697243505daa7e2d8b6b252c7cf9d56345c510ab502e3cda27" Oct 14 13:36:20.773500 master-2 kubenswrapper[4762]: I1014 13:36:20.773437 4762 scope.go:117] "RemoveContainer" containerID="638bce9e30d01056c9bf0af37a0bd392f3f2675ef3ca7ea5aed5ae76cf0225e4" Oct 14 13:36:21.322952 master-2 kubenswrapper[4762]: I1014 13:36:21.322851 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:36:21.323830 master-2 kubenswrapper[4762]: W1014 13:36:21.323794 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4df40aa_26fa_4df4_98cd_179aec30dc77.slice/crio-b2bd0531cfe2b2d8c191136e241711c07a324214070bea5af4d38e2459ac992f WatchSource:0}: Error finding container b2bd0531cfe2b2d8c191136e241711c07a324214070bea5af4d38e2459ac992f: Status 404 returned error can't find the container with id b2bd0531cfe2b2d8c191136e241711c07a324214070bea5af4d38e2459ac992f Oct 14 13:36:21.330657 master-2 kubenswrapper[4762]: I1014 13:36:21.330127 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-958c54db4-kctqr"] Oct 14 13:36:21.414538 master-2 kubenswrapper[4762]: I1014 13:36:21.414479 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:36:21.559002 master-2 kubenswrapper[4762]: I1014 13:36:21.558917 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="520f5fd5-5c4b-4781-bc3f-363b18244e36" path="/var/lib/kubelet/pods/520f5fd5-5c4b-4781-bc3f-363b18244e36/volumes" Oct 14 13:36:21.644561 master-2 kubenswrapper[4762]: I1014 13:36:21.644388 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-958c54db4-kctqr" event={"ID":"b4df40aa-26fa-4df4-98cd-179aec30dc77","Type":"ContainerStarted","Data":"b2bd0531cfe2b2d8c191136e241711c07a324214070bea5af4d38e2459ac992f"} Oct 14 13:36:21.660904 master-2 kubenswrapper[4762]: I1014 13:36:21.660839 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:36:21.665224 master-2 kubenswrapper[4762]: I1014 13:36:21.663884 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:36:21.666907 master-2 kubenswrapper[4762]: I1014 13:36:21.666858 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 14 13:36:21.667010 master-2 kubenswrapper[4762]: I1014 13:36:21.666942 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 14 13:36:21.781509 master-2 kubenswrapper[4762]: I1014 13:36:21.781443 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:36:21.852798 master-2 kubenswrapper[4762]: I1014 13:36:21.852715 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlr44\" (UniqueName: \"kubernetes.io/projected/cec9fba4-1a28-472b-976b-3fd3d9367772-kube-api-access-jlr44\") pod \"ceilometer-0\" (UID: \"cec9fba4-1a28-472b-976b-3fd3d9367772\") " pod="openstack/ceilometer-0" Oct 14 13:36:21.853070 master-2 kubenswrapper[4762]: I1014 13:36:21.852811 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cec9fba4-1a28-472b-976b-3fd3d9367772-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cec9fba4-1a28-472b-976b-3fd3d9367772\") " pod="openstack/ceilometer-0" Oct 14 13:36:21.853070 master-2 kubenswrapper[4762]: I1014 13:36:21.852852 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cec9fba4-1a28-472b-976b-3fd3d9367772-log-httpd\") pod \"ceilometer-0\" (UID: \"cec9fba4-1a28-472b-976b-3fd3d9367772\") " pod="openstack/ceilometer-0" Oct 14 13:36:21.853070 master-2 kubenswrapper[4762]: I1014 13:36:21.852910 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cec9fba4-1a28-472b-976b-3fd3d9367772-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cec9fba4-1a28-472b-976b-3fd3d9367772\") " pod="openstack/ceilometer-0" Oct 14 13:36:21.853070 master-2 kubenswrapper[4762]: I1014 13:36:21.852936 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cec9fba4-1a28-472b-976b-3fd3d9367772-run-httpd\") pod \"ceilometer-0\" (UID: \"cec9fba4-1a28-472b-976b-3fd3d9367772\") " pod="openstack/ceilometer-0" Oct 14 13:36:21.853070 master-2 kubenswrapper[4762]: I1014 13:36:21.852978 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cec9fba4-1a28-472b-976b-3fd3d9367772-scripts\") pod \"ceilometer-0\" (UID: \"cec9fba4-1a28-472b-976b-3fd3d9367772\") " pod="openstack/ceilometer-0" Oct 14 13:36:21.853312 master-2 kubenswrapper[4762]: I1014 13:36:21.853139 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cec9fba4-1a28-472b-976b-3fd3d9367772-config-data\") pod \"ceilometer-0\" (UID: \"cec9fba4-1a28-472b-976b-3fd3d9367772\") " pod="openstack/ceilometer-0" Oct 14 13:36:21.954760 master-2 kubenswrapper[4762]: I1014 13:36:21.954706 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cec9fba4-1a28-472b-976b-3fd3d9367772-log-httpd\") pod \"ceilometer-0\" (UID: 
\"cec9fba4-1a28-472b-976b-3fd3d9367772\") " pod="openstack/ceilometer-0" Oct 14 13:36:21.955025 master-2 kubenswrapper[4762]: I1014 13:36:21.954782 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cec9fba4-1a28-472b-976b-3fd3d9367772-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cec9fba4-1a28-472b-976b-3fd3d9367772\") " pod="openstack/ceilometer-0" Oct 14 13:36:21.955025 master-2 kubenswrapper[4762]: I1014 13:36:21.954807 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cec9fba4-1a28-472b-976b-3fd3d9367772-run-httpd\") pod \"ceilometer-0\" (UID: \"cec9fba4-1a28-472b-976b-3fd3d9367772\") " pod="openstack/ceilometer-0" Oct 14 13:36:21.955025 master-2 kubenswrapper[4762]: I1014 13:36:21.954833 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cec9fba4-1a28-472b-976b-3fd3d9367772-scripts\") pod \"ceilometer-0\" (UID: \"cec9fba4-1a28-472b-976b-3fd3d9367772\") " pod="openstack/ceilometer-0" Oct 14 13:36:21.955025 master-2 kubenswrapper[4762]: I1014 13:36:21.954863 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cec9fba4-1a28-472b-976b-3fd3d9367772-config-data\") pod \"ceilometer-0\" (UID: \"cec9fba4-1a28-472b-976b-3fd3d9367772\") " pod="openstack/ceilometer-0" Oct 14 13:36:21.955025 master-2 kubenswrapper[4762]: I1014 13:36:21.954934 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlr44\" (UniqueName: \"kubernetes.io/projected/cec9fba4-1a28-472b-976b-3fd3d9367772-kube-api-access-jlr44\") pod \"ceilometer-0\" (UID: \"cec9fba4-1a28-472b-976b-3fd3d9367772\") " pod="openstack/ceilometer-0" Oct 14 13:36:21.955025 master-2 kubenswrapper[4762]: I1014 13:36:21.954957 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cec9fba4-1a28-472b-976b-3fd3d9367772-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cec9fba4-1a28-472b-976b-3fd3d9367772\") " pod="openstack/ceilometer-0" Oct 14 13:36:21.955329 master-2 kubenswrapper[4762]: I1014 13:36:21.955294 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cec9fba4-1a28-472b-976b-3fd3d9367772-log-httpd\") pod \"ceilometer-0\" (UID: \"cec9fba4-1a28-472b-976b-3fd3d9367772\") " pod="openstack/ceilometer-0" Oct 14 13:36:21.955496 master-2 kubenswrapper[4762]: I1014 13:36:21.955444 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cec9fba4-1a28-472b-976b-3fd3d9367772-run-httpd\") pod \"ceilometer-0\" (UID: \"cec9fba4-1a28-472b-976b-3fd3d9367772\") " pod="openstack/ceilometer-0" Oct 14 13:36:21.958657 master-2 kubenswrapper[4762]: I1014 13:36:21.958594 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cec9fba4-1a28-472b-976b-3fd3d9367772-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cec9fba4-1a28-472b-976b-3fd3d9367772\") " pod="openstack/ceilometer-0" Oct 14 13:36:21.958871 master-2 kubenswrapper[4762]: I1014 13:36:21.958813 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/cec9fba4-1a28-472b-976b-3fd3d9367772-config-data\") pod \"ceilometer-0\" (UID: \"cec9fba4-1a28-472b-976b-3fd3d9367772\") " pod="openstack/ceilometer-0" Oct 14 13:36:21.959087 master-2 kubenswrapper[4762]: I1014 13:36:21.959049 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cec9fba4-1a28-472b-976b-3fd3d9367772-scripts\") pod \"ceilometer-0\" (UID: \"cec9fba4-1a28-472b-976b-3fd3d9367772\") " pod="openstack/ceilometer-0" Oct 14 13:36:21.959134 master-2 kubenswrapper[4762]: I1014 13:36:21.959099 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cec9fba4-1a28-472b-976b-3fd3d9367772-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cec9fba4-1a28-472b-976b-3fd3d9367772\") " pod="openstack/ceilometer-0" Oct 14 13:36:22.000634 master-2 kubenswrapper[4762]: I1014 13:36:22.000539 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlr44\" (UniqueName: \"kubernetes.io/projected/cec9fba4-1a28-472b-976b-3fd3d9367772-kube-api-access-jlr44\") pod \"ceilometer-0\" (UID: \"cec9fba4-1a28-472b-976b-3fd3d9367772\") " pod="openstack/ceilometer-0" Oct 14 13:36:22.278891 master-2 kubenswrapper[4762]: I1014 13:36:22.278827 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:36:22.901170 master-2 kubenswrapper[4762]: I1014 13:36:22.900932 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:36:22.993464 master-2 kubenswrapper[4762]: I1014 13:36:22.993397 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-787cbbf4dc-r2lgm" Oct 14 13:36:23.825379 master-2 kubenswrapper[4762]: I1014 13:36:23.825254 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68f9c67449-jgsnj"] Oct 14 13:36:23.825557 master-2 kubenswrapper[4762]: I1014 13:36:23.825536 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-68f9c67449-jgsnj" podUID="90052432-150d-4d93-b058-513adf55e099" containerName="dnsmasq-dns" containerID="cri-o://f5f90c5018bb72c8e911764ba8f30e8b3c88ad35e1979a44493253c47e06b4fd" gracePeriod=10 Oct 14 13:36:24.277949 master-2 kubenswrapper[4762]: W1014 13:36:24.277872 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcec9fba4_1a28_472b_976b_3fd3d9367772.slice/crio-de0575b318761da189ae063cc60ae01f477a71cc86d4e0826bcbdf927520f730 WatchSource:0}: Error finding container de0575b318761da189ae063cc60ae01f477a71cc86d4e0826bcbdf927520f730: Status 404 returned error can't find the container with id de0575b318761da189ae063cc60ae01f477a71cc86d4e0826bcbdf927520f730 Oct 14 13:36:24.620289 master-2 kubenswrapper[4762]: I1014 13:36:24.620100 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68f9c67449-jgsnj" Oct 14 13:36:24.670458 master-2 kubenswrapper[4762]: I1014 13:36:24.670395 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cec9fba4-1a28-472b-976b-3fd3d9367772","Type":"ContainerStarted","Data":"de0575b318761da189ae063cc60ae01f477a71cc86d4e0826bcbdf927520f730"} Oct 14 13:36:24.672984 master-2 kubenswrapper[4762]: I1014 13:36:24.672929 4762 generic.go:334] "Generic (PLEG): container finished" podID="90052432-150d-4d93-b058-513adf55e099" containerID="f5f90c5018bb72c8e911764ba8f30e8b3c88ad35e1979a44493253c47e06b4fd" exitCode=0 Oct 14 13:36:24.673061 master-2 kubenswrapper[4762]: I1014 13:36:24.672989 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68f9c67449-jgsnj" Oct 14 13:36:24.673061 master-2 kubenswrapper[4762]: I1014 13:36:24.672982 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68f9c67449-jgsnj" event={"ID":"90052432-150d-4d93-b058-513adf55e099","Type":"ContainerDied","Data":"f5f90c5018bb72c8e911764ba8f30e8b3c88ad35e1979a44493253c47e06b4fd"} Oct 14 13:36:24.673061 master-2 kubenswrapper[4762]: I1014 13:36:24.673049 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68f9c67449-jgsnj" event={"ID":"90052432-150d-4d93-b058-513adf55e099","Type":"ContainerDied","Data":"403a1761b43668a95715d9e22a51cc31bd9ed8190825b29922ab14e99825ee55"} Oct 14 13:36:24.673189 master-2 kubenswrapper[4762]: I1014 13:36:24.673075 4762 scope.go:117] "RemoveContainer" containerID="f5f90c5018bb72c8e911764ba8f30e8b3c88ad35e1979a44493253c47e06b4fd" Oct 14 13:36:24.682082 master-2 kubenswrapper[4762]: I1014 13:36:24.682025 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-958c54db4-kctqr" event={"ID":"b4df40aa-26fa-4df4-98cd-179aec30dc77","Type":"ContainerStarted","Data":"2b5346724f000656d920288bb0d0934674c61edea6e6de38a47d719bf70dda7d"} Oct 14 13:36:24.716213 master-2 kubenswrapper[4762]: I1014 13:36:24.716168 4762 scope.go:117] "RemoveContainer" containerID="f934ea67f5f9bf5d51232d0c052ae96553f988006487c8803463027d3e5f3773" Oct 14 13:36:24.724887 master-2 kubenswrapper[4762]: I1014 13:36:24.724843 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90052432-150d-4d93-b058-513adf55e099-ovsdbserver-nb\") pod \"90052432-150d-4d93-b058-513adf55e099\" (UID: \"90052432-150d-4d93-b058-513adf55e099\") " Oct 14 13:36:24.725165 master-2 kubenswrapper[4762]: I1014 13:36:24.724916 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90052432-150d-4d93-b058-513adf55e099-ovsdbserver-sb\") pod \"90052432-150d-4d93-b058-513adf55e099\" (UID: \"90052432-150d-4d93-b058-513adf55e099\") " Oct 14 13:36:24.725165 master-2 kubenswrapper[4762]: I1014 13:36:24.724979 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90052432-150d-4d93-b058-513adf55e099-config\") pod \"90052432-150d-4d93-b058-513adf55e099\" (UID: \"90052432-150d-4d93-b058-513adf55e099\") " Oct 14 13:36:24.725165 master-2 kubenswrapper[4762]: I1014 13:36:24.725046 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90052432-150d-4d93-b058-513adf55e099-dns-svc\") pod 
\"90052432-150d-4d93-b058-513adf55e099\" (UID: \"90052432-150d-4d93-b058-513adf55e099\") " Oct 14 13:36:24.725165 master-2 kubenswrapper[4762]: I1014 13:36:24.725111 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swzvd\" (UniqueName: \"kubernetes.io/projected/90052432-150d-4d93-b058-513adf55e099-kube-api-access-swzvd\") pod \"90052432-150d-4d93-b058-513adf55e099\" (UID: \"90052432-150d-4d93-b058-513adf55e099\") " Oct 14 13:36:24.728721 master-2 kubenswrapper[4762]: I1014 13:36:24.728669 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90052432-150d-4d93-b058-513adf55e099-kube-api-access-swzvd" (OuterVolumeSpecName: "kube-api-access-swzvd") pod "90052432-150d-4d93-b058-513adf55e099" (UID: "90052432-150d-4d93-b058-513adf55e099"). InnerVolumeSpecName "kube-api-access-swzvd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:36:24.751890 master-2 kubenswrapper[4762]: I1014 13:36:24.751853 4762 scope.go:117] "RemoveContainer" containerID="f5f90c5018bb72c8e911764ba8f30e8b3c88ad35e1979a44493253c47e06b4fd" Oct 14 13:36:24.752412 master-2 kubenswrapper[4762]: E1014 13:36:24.752362 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5f90c5018bb72c8e911764ba8f30e8b3c88ad35e1979a44493253c47e06b4fd\": container with ID starting with f5f90c5018bb72c8e911764ba8f30e8b3c88ad35e1979a44493253c47e06b4fd not found: ID does not exist" containerID="f5f90c5018bb72c8e911764ba8f30e8b3c88ad35e1979a44493253c47e06b4fd" Oct 14 13:36:24.752487 master-2 kubenswrapper[4762]: I1014 13:36:24.752417 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5f90c5018bb72c8e911764ba8f30e8b3c88ad35e1979a44493253c47e06b4fd"} err="failed to get container status \"f5f90c5018bb72c8e911764ba8f30e8b3c88ad35e1979a44493253c47e06b4fd\": rpc error: code = NotFound desc = could not find container \"f5f90c5018bb72c8e911764ba8f30e8b3c88ad35e1979a44493253c47e06b4fd\": container with ID starting with f5f90c5018bb72c8e911764ba8f30e8b3c88ad35e1979a44493253c47e06b4fd not found: ID does not exist" Oct 14 13:36:24.752487 master-2 kubenswrapper[4762]: I1014 13:36:24.752455 4762 scope.go:117] "RemoveContainer" containerID="f934ea67f5f9bf5d51232d0c052ae96553f988006487c8803463027d3e5f3773" Oct 14 13:36:24.752909 master-2 kubenswrapper[4762]: E1014 13:36:24.752859 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f934ea67f5f9bf5d51232d0c052ae96553f988006487c8803463027d3e5f3773\": container with ID starting with f934ea67f5f9bf5d51232d0c052ae96553f988006487c8803463027d3e5f3773 not found: ID does not exist" containerID="f934ea67f5f9bf5d51232d0c052ae96553f988006487c8803463027d3e5f3773" Oct 14 13:36:24.752909 master-2 kubenswrapper[4762]: I1014 13:36:24.752888 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f934ea67f5f9bf5d51232d0c052ae96553f988006487c8803463027d3e5f3773"} err="failed to get container status \"f934ea67f5f9bf5d51232d0c052ae96553f988006487c8803463027d3e5f3773\": rpc error: code = NotFound desc = could not find container \"f934ea67f5f9bf5d51232d0c052ae96553f988006487c8803463027d3e5f3773\": container with ID starting with f934ea67f5f9bf5d51232d0c052ae96553f988006487c8803463027d3e5f3773 not found: ID does not exist" Oct 14 13:36:24.759586 master-2 kubenswrapper[4762]: I1014 
13:36:24.759539 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90052432-150d-4d93-b058-513adf55e099-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "90052432-150d-4d93-b058-513adf55e099" (UID: "90052432-150d-4d93-b058-513adf55e099"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:36:24.759789 master-2 kubenswrapper[4762]: I1014 13:36:24.759717 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90052432-150d-4d93-b058-513adf55e099-config" (OuterVolumeSpecName: "config") pod "90052432-150d-4d93-b058-513adf55e099" (UID: "90052432-150d-4d93-b058-513adf55e099"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:36:24.763635 master-2 kubenswrapper[4762]: I1014 13:36:24.763560 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90052432-150d-4d93-b058-513adf55e099-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "90052432-150d-4d93-b058-513adf55e099" (UID: "90052432-150d-4d93-b058-513adf55e099"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:36:24.783857 master-2 kubenswrapper[4762]: I1014 13:36:24.783797 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90052432-150d-4d93-b058-513adf55e099-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "90052432-150d-4d93-b058-513adf55e099" (UID: "90052432-150d-4d93-b058-513adf55e099"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:36:24.828062 master-2 kubenswrapper[4762]: I1014 13:36:24.827940 4762 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90052432-150d-4d93-b058-513adf55e099-ovsdbserver-sb\") on node \"master-2\" DevicePath \"\"" Oct 14 13:36:24.828062 master-2 kubenswrapper[4762]: I1014 13:36:24.828017 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90052432-150d-4d93-b058-513adf55e099-config\") on node \"master-2\" DevicePath \"\"" Oct 14 13:36:24.828062 master-2 kubenswrapper[4762]: I1014 13:36:24.828030 4762 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90052432-150d-4d93-b058-513adf55e099-dns-svc\") on node \"master-2\" DevicePath \"\"" Oct 14 13:36:24.828062 master-2 kubenswrapper[4762]: I1014 13:36:24.828043 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swzvd\" (UniqueName: \"kubernetes.io/projected/90052432-150d-4d93-b058-513adf55e099-kube-api-access-swzvd\") on node \"master-2\" DevicePath \"\"" Oct 14 13:36:24.828416 master-2 kubenswrapper[4762]: I1014 13:36:24.828084 4762 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90052432-150d-4d93-b058-513adf55e099-ovsdbserver-nb\") on node \"master-2\" DevicePath \"\"" Oct 14 13:36:25.442351 master-2 kubenswrapper[4762]: I1014 13:36:25.442297 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68f9c67449-jgsnj"] Oct 14 13:36:25.528935 master-2 kubenswrapper[4762]: I1014 13:36:25.528857 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-68f9c67449-jgsnj"] Oct 14 13:36:25.557123 master-2 kubenswrapper[4762]: I1014 13:36:25.556870 4762 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90052432-150d-4d93-b058-513adf55e099" path="/var/lib/kubelet/pods/90052432-150d-4d93-b058-513adf55e099/volumes" Oct 14 13:36:25.691505 master-2 kubenswrapper[4762]: I1014 13:36:25.691446 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cec9fba4-1a28-472b-976b-3fd3d9367772","Type":"ContainerStarted","Data":"3efc717259ce9ea1d1dd607ab8c6145fce7f07f59ec75ecc3276645b81ae5372"} Oct 14 13:36:25.691738 master-2 kubenswrapper[4762]: I1014 13:36:25.691520 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cec9fba4-1a28-472b-976b-3fd3d9367772","Type":"ContainerStarted","Data":"47a8e63e4893125c500a5bc20abcb6fe8e2232eb088778319cd60a01e795d603"} Oct 14 13:36:25.694934 master-2 kubenswrapper[4762]: I1014 13:36:25.694882 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-958c54db4-kctqr" event={"ID":"b4df40aa-26fa-4df4-98cd-179aec30dc77","Type":"ContainerStarted","Data":"bd8668829dd9c47d746e42d2eba2b291761c4acb9f364fb5cb1d443972845671"} Oct 14 13:36:25.695093 master-2 kubenswrapper[4762]: I1014 13:36:25.695065 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-958c54db4-kctqr" Oct 14 13:36:25.972522 master-2 kubenswrapper[4762]: I1014 13:36:25.972410 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-958c54db4-kctqr" podStartSLOduration=2.965671712 podStartE2EDuration="5.97238773s" podCreationTimestamp="2025-10-14 13:36:20 +0000 UTC" firstStartedPulling="2025-10-14 13:36:21.326405785 +0000 UTC m=+1810.570564954" lastFinishedPulling="2025-10-14 13:36:24.333121813 +0000 UTC m=+1813.577280972" observedRunningTime="2025-10-14 13:36:25.965517165 +0000 UTC m=+1815.209676324" watchObservedRunningTime="2025-10-14 13:36:25.97238773 +0000 UTC m=+1815.216546899" Oct 14 13:36:26.198251 master-2 kubenswrapper[4762]: I1014 13:36:26.198197 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-55c4fcb4cb-kwz75"] Oct 14 13:36:26.198493 master-2 kubenswrapper[4762]: E1014 13:36:26.198474 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90052432-150d-4d93-b058-513adf55e099" containerName="dnsmasq-dns" Oct 14 13:36:26.198493 master-2 kubenswrapper[4762]: I1014 13:36:26.198490 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="90052432-150d-4d93-b058-513adf55e099" containerName="dnsmasq-dns" Oct 14 13:36:26.198624 master-2 kubenswrapper[4762]: E1014 13:36:26.198511 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90052432-150d-4d93-b058-513adf55e099" containerName="init" Oct 14 13:36:26.198624 master-2 kubenswrapper[4762]: I1014 13:36:26.198517 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="90052432-150d-4d93-b058-513adf55e099" containerName="init" Oct 14 13:36:26.198695 master-2 kubenswrapper[4762]: I1014 13:36:26.198647 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="90052432-150d-4d93-b058-513adf55e099" containerName="dnsmasq-dns" Oct 14 13:36:26.199442 master-2 kubenswrapper[4762]: I1014 13:36:26.199409 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-55c4fcb4cb-kwz75" Oct 14 13:36:26.202096 master-2 kubenswrapper[4762]: I1014 13:36:26.202038 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 14 13:36:26.202096 master-2 kubenswrapper[4762]: I1014 13:36:26.202074 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 14 13:36:26.202300 master-2 kubenswrapper[4762]: I1014 13:36:26.202041 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 14 13:36:26.260223 master-2 kubenswrapper[4762]: I1014 13:36:26.260171 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/72401abc-aeab-47ed-98d0-15a765c5fb91-httpd-config\") pod \"neutron-55c4fcb4cb-kwz75\" (UID: \"72401abc-aeab-47ed-98d0-15a765c5fb91\") " pod="openstack/neutron-55c4fcb4cb-kwz75" Oct 14 13:36:26.260330 master-2 kubenswrapper[4762]: I1014 13:36:26.260237 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wrmk\" (UniqueName: \"kubernetes.io/projected/72401abc-aeab-47ed-98d0-15a765c5fb91-kube-api-access-2wrmk\") pod \"neutron-55c4fcb4cb-kwz75\" (UID: \"72401abc-aeab-47ed-98d0-15a765c5fb91\") " pod="openstack/neutron-55c4fcb4cb-kwz75" Oct 14 13:36:26.260330 master-2 kubenswrapper[4762]: I1014 13:36:26.260262 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/72401abc-aeab-47ed-98d0-15a765c5fb91-ovndb-tls-certs\") pod \"neutron-55c4fcb4cb-kwz75\" (UID: \"72401abc-aeab-47ed-98d0-15a765c5fb91\") " pod="openstack/neutron-55c4fcb4cb-kwz75" Oct 14 13:36:26.260330 master-2 kubenswrapper[4762]: I1014 13:36:26.260292 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/72401abc-aeab-47ed-98d0-15a765c5fb91-config\") pod \"neutron-55c4fcb4cb-kwz75\" (UID: \"72401abc-aeab-47ed-98d0-15a765c5fb91\") " pod="openstack/neutron-55c4fcb4cb-kwz75" Oct 14 13:36:26.260428 master-2 kubenswrapper[4762]: I1014 13:36:26.260359 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72401abc-aeab-47ed-98d0-15a765c5fb91-combined-ca-bundle\") pod \"neutron-55c4fcb4cb-kwz75\" (UID: \"72401abc-aeab-47ed-98d0-15a765c5fb91\") " pod="openstack/neutron-55c4fcb4cb-kwz75" Oct 14 13:36:26.335308 master-2 kubenswrapper[4762]: I1014 13:36:26.335229 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-55c4fcb4cb-kwz75"] Oct 14 13:36:26.362415 master-2 kubenswrapper[4762]: I1014 13:36:26.362026 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/72401abc-aeab-47ed-98d0-15a765c5fb91-httpd-config\") pod \"neutron-55c4fcb4cb-kwz75\" (UID: \"72401abc-aeab-47ed-98d0-15a765c5fb91\") " pod="openstack/neutron-55c4fcb4cb-kwz75" Oct 14 13:36:26.362415 master-2 kubenswrapper[4762]: I1014 13:36:26.362101 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wrmk\" (UniqueName: \"kubernetes.io/projected/72401abc-aeab-47ed-98d0-15a765c5fb91-kube-api-access-2wrmk\") pod \"neutron-55c4fcb4cb-kwz75\" (UID: 
\"72401abc-aeab-47ed-98d0-15a765c5fb91\") " pod="openstack/neutron-55c4fcb4cb-kwz75" Oct 14 13:36:26.362415 master-2 kubenswrapper[4762]: I1014 13:36:26.362122 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/72401abc-aeab-47ed-98d0-15a765c5fb91-ovndb-tls-certs\") pod \"neutron-55c4fcb4cb-kwz75\" (UID: \"72401abc-aeab-47ed-98d0-15a765c5fb91\") " pod="openstack/neutron-55c4fcb4cb-kwz75" Oct 14 13:36:26.362415 master-2 kubenswrapper[4762]: I1014 13:36:26.362210 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/72401abc-aeab-47ed-98d0-15a765c5fb91-config\") pod \"neutron-55c4fcb4cb-kwz75\" (UID: \"72401abc-aeab-47ed-98d0-15a765c5fb91\") " pod="openstack/neutron-55c4fcb4cb-kwz75" Oct 14 13:36:26.362415 master-2 kubenswrapper[4762]: I1014 13:36:26.362286 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72401abc-aeab-47ed-98d0-15a765c5fb91-combined-ca-bundle\") pod \"neutron-55c4fcb4cb-kwz75\" (UID: \"72401abc-aeab-47ed-98d0-15a765c5fb91\") " pod="openstack/neutron-55c4fcb4cb-kwz75" Oct 14 13:36:26.367044 master-2 kubenswrapper[4762]: I1014 13:36:26.365976 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72401abc-aeab-47ed-98d0-15a765c5fb91-combined-ca-bundle\") pod \"neutron-55c4fcb4cb-kwz75\" (UID: \"72401abc-aeab-47ed-98d0-15a765c5fb91\") " pod="openstack/neutron-55c4fcb4cb-kwz75" Oct 14 13:36:26.367573 master-2 kubenswrapper[4762]: I1014 13:36:26.367475 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/72401abc-aeab-47ed-98d0-15a765c5fb91-httpd-config\") pod \"neutron-55c4fcb4cb-kwz75\" (UID: \"72401abc-aeab-47ed-98d0-15a765c5fb91\") " pod="openstack/neutron-55c4fcb4cb-kwz75" Oct 14 13:36:26.378528 master-2 kubenswrapper[4762]: I1014 13:36:26.368351 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/72401abc-aeab-47ed-98d0-15a765c5fb91-config\") pod \"neutron-55c4fcb4cb-kwz75\" (UID: \"72401abc-aeab-47ed-98d0-15a765c5fb91\") " pod="openstack/neutron-55c4fcb4cb-kwz75" Oct 14 13:36:26.378528 master-2 kubenswrapper[4762]: I1014 13:36:26.369349 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/72401abc-aeab-47ed-98d0-15a765c5fb91-ovndb-tls-certs\") pod \"neutron-55c4fcb4cb-kwz75\" (UID: \"72401abc-aeab-47ed-98d0-15a765c5fb91\") " pod="openstack/neutron-55c4fcb4cb-kwz75" Oct 14 13:36:26.487469 master-2 kubenswrapper[4762]: I1014 13:36:26.487332 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wrmk\" (UniqueName: \"kubernetes.io/projected/72401abc-aeab-47ed-98d0-15a765c5fb91-kube-api-access-2wrmk\") pod \"neutron-55c4fcb4cb-kwz75\" (UID: \"72401abc-aeab-47ed-98d0-15a765c5fb91\") " pod="openstack/neutron-55c4fcb4cb-kwz75" Oct 14 13:36:26.517937 master-2 kubenswrapper[4762]: I1014 13:36:26.517876 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-55c4fcb4cb-kwz75" Oct 14 13:36:26.710819 master-2 kubenswrapper[4762]: I1014 13:36:26.710776 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cec9fba4-1a28-472b-976b-3fd3d9367772","Type":"ContainerStarted","Data":"0ab705a863170920c4fd266182f50ebd265e98f57ca33ed14326881c3a32c5e1"} Oct 14 13:36:26.711022 master-2 kubenswrapper[4762]: I1014 13:36:26.710829 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-958c54db4-kctqr" Oct 14 13:36:27.976403 master-2 kubenswrapper[4762]: I1014 13:36:27.976360 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-55c4fcb4cb-kwz75"] Oct 14 13:36:27.980691 master-2 kubenswrapper[4762]: W1014 13:36:27.980637 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72401abc_aeab_47ed_98d0_15a765c5fb91.slice/crio-bbedd81147364d4b5f695d3a3a7f359f7004558927cd86cd501d6152757fc822 WatchSource:0}: Error finding container bbedd81147364d4b5f695d3a3a7f359f7004558927cd86cd501d6152757fc822: Status 404 returned error can't find the container with id bbedd81147364d4b5f695d3a3a7f359f7004558927cd86cd501d6152757fc822 Oct 14 13:36:28.724353 master-2 kubenswrapper[4762]: I1014 13:36:28.724274 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-55c4fcb4cb-kwz75" event={"ID":"72401abc-aeab-47ed-98d0-15a765c5fb91","Type":"ContainerStarted","Data":"9c0744ca71f8d07f85bd04be3920b608ce1b232f7d9fe4f045b0a94ad2f7307d"} Oct 14 13:36:28.724353 master-2 kubenswrapper[4762]: I1014 13:36:28.724338 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-55c4fcb4cb-kwz75" event={"ID":"72401abc-aeab-47ed-98d0-15a765c5fb91","Type":"ContainerStarted","Data":"8374fba6f74a3d2edc7fb9870774183af6268ff6a8cc668f20b43a7d5a867325"} Oct 14 13:36:28.724353 master-2 kubenswrapper[4762]: I1014 13:36:28.724349 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-55c4fcb4cb-kwz75" event={"ID":"72401abc-aeab-47ed-98d0-15a765c5fb91","Type":"ContainerStarted","Data":"bbedd81147364d4b5f695d3a3a7f359f7004558927cd86cd501d6152757fc822"} Oct 14 13:36:28.798826 master-2 kubenswrapper[4762]: I1014 13:36:28.724765 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-55c4fcb4cb-kwz75" Oct 14 13:36:28.798826 master-2 kubenswrapper[4762]: I1014 13:36:28.727576 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cec9fba4-1a28-472b-976b-3fd3d9367772","Type":"ContainerStarted","Data":"3dacb932f68868cabc25518a739adf2c105cb3c0cebad766785c017f886a5c76"} Oct 14 13:36:28.798826 master-2 kubenswrapper[4762]: I1014 13:36:28.727791 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 14 13:36:29.072598 master-2 kubenswrapper[4762]: I1014 13:36:29.072409 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-55c4fcb4cb-kwz75" podStartSLOduration=3.072382565 podStartE2EDuration="3.072382565s" podCreationTimestamp="2025-10-14 13:36:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:36:28.908168386 +0000 UTC m=+1818.152327555" watchObservedRunningTime="2025-10-14 13:36:29.072382565 +0000 UTC m=+1818.316541774" Oct 14 13:36:30.224745 master-2 
kubenswrapper[4762]: I1014 13:36:30.224604 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=5.6931492089999995 podStartE2EDuration="9.224573316s" podCreationTimestamp="2025-10-14 13:36:21 +0000 UTC" firstStartedPulling="2025-10-14 13:36:24.279980536 +0000 UTC m=+1813.524139695" lastFinishedPulling="2025-10-14 13:36:27.811404643 +0000 UTC m=+1817.055563802" observedRunningTime="2025-10-14 13:36:29.869902038 +0000 UTC m=+1819.114061207" watchObservedRunningTime="2025-10-14 13:36:30.224573316 +0000 UTC m=+1819.468732475" Oct 14 13:36:31.761313 master-2 kubenswrapper[4762]: I1014 13:36:31.761214 4762 generic.go:334] "Generic (PLEG): container finished" podID="799545f6-d5b9-428c-96a7-96aa931ed940" containerID="a74426201541067f0f49cc6a155bebc5ae88678dbd1e141645381389a47e7563" exitCode=0 Oct 14 13:36:31.761313 master-2 kubenswrapper[4762]: I1014 13:36:31.761316 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-g5vgq" event={"ID":"799545f6-d5b9-428c-96a7-96aa931ed940","Type":"ContainerDied","Data":"a74426201541067f0f49cc6a155bebc5ae88678dbd1e141645381389a47e7563"} Oct 14 13:36:33.190438 master-2 kubenswrapper[4762]: I1014 13:36:33.190364 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-sync-g5vgq" Oct 14 13:36:33.404656 master-2 kubenswrapper[4762]: I1014 13:36:33.404534 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/799545f6-d5b9-428c-96a7-96aa931ed940-scripts\") pod \"799545f6-d5b9-428c-96a7-96aa931ed940\" (UID: \"799545f6-d5b9-428c-96a7-96aa931ed940\") " Oct 14 13:36:33.404656 master-2 kubenswrapper[4762]: I1014 13:36:33.404583 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/799545f6-d5b9-428c-96a7-96aa931ed940-etc-podinfo\") pod \"799545f6-d5b9-428c-96a7-96aa931ed940\" (UID: \"799545f6-d5b9-428c-96a7-96aa931ed940\") " Oct 14 13:36:33.404940 master-2 kubenswrapper[4762]: I1014 13:36:33.404663 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/799545f6-d5b9-428c-96a7-96aa931ed940-config-data\") pod \"799545f6-d5b9-428c-96a7-96aa931ed940\" (UID: \"799545f6-d5b9-428c-96a7-96aa931ed940\") " Oct 14 13:36:33.404940 master-2 kubenswrapper[4762]: I1014 13:36:33.404703 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/799545f6-d5b9-428c-96a7-96aa931ed940-combined-ca-bundle\") pod \"799545f6-d5b9-428c-96a7-96aa931ed940\" (UID: \"799545f6-d5b9-428c-96a7-96aa931ed940\") " Oct 14 13:36:33.404940 master-2 kubenswrapper[4762]: I1014 13:36:33.404776 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tq92w\" (UniqueName: \"kubernetes.io/projected/799545f6-d5b9-428c-96a7-96aa931ed940-kube-api-access-tq92w\") pod \"799545f6-d5b9-428c-96a7-96aa931ed940\" (UID: \"799545f6-d5b9-428c-96a7-96aa931ed940\") " Oct 14 13:36:33.404940 master-2 kubenswrapper[4762]: I1014 13:36:33.404873 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/799545f6-d5b9-428c-96a7-96aa931ed940-config-data-merged\") pod \"799545f6-d5b9-428c-96a7-96aa931ed940\" (UID: 
\"799545f6-d5b9-428c-96a7-96aa931ed940\") " Oct 14 13:36:33.405782 master-2 kubenswrapper[4762]: I1014 13:36:33.405676 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/799545f6-d5b9-428c-96a7-96aa931ed940-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "799545f6-d5b9-428c-96a7-96aa931ed940" (UID: "799545f6-d5b9-428c-96a7-96aa931ed940"). InnerVolumeSpecName "config-data-merged". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:36:33.407623 master-2 kubenswrapper[4762]: I1014 13:36:33.407594 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/799545f6-d5b9-428c-96a7-96aa931ed940-scripts" (OuterVolumeSpecName: "scripts") pod "799545f6-d5b9-428c-96a7-96aa931ed940" (UID: "799545f6-d5b9-428c-96a7-96aa931ed940"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:36:33.408326 master-2 kubenswrapper[4762]: I1014 13:36:33.408281 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/799545f6-d5b9-428c-96a7-96aa931ed940-kube-api-access-tq92w" (OuterVolumeSpecName: "kube-api-access-tq92w") pod "799545f6-d5b9-428c-96a7-96aa931ed940" (UID: "799545f6-d5b9-428c-96a7-96aa931ed940"). InnerVolumeSpecName "kube-api-access-tq92w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:36:33.409720 master-2 kubenswrapper[4762]: I1014 13:36:33.409659 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/799545f6-d5b9-428c-96a7-96aa931ed940-etc-podinfo" (OuterVolumeSpecName: "etc-podinfo") pod "799545f6-d5b9-428c-96a7-96aa931ed940" (UID: "799545f6-d5b9-428c-96a7-96aa931ed940"). InnerVolumeSpecName "etc-podinfo". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 14 13:36:33.441383 master-2 kubenswrapper[4762]: I1014 13:36:33.441319 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/799545f6-d5b9-428c-96a7-96aa931ed940-config-data" (OuterVolumeSpecName: "config-data") pod "799545f6-d5b9-428c-96a7-96aa931ed940" (UID: "799545f6-d5b9-428c-96a7-96aa931ed940"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:36:33.447324 master-2 kubenswrapper[4762]: I1014 13:36:33.447270 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/799545f6-d5b9-428c-96a7-96aa931ed940-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "799545f6-d5b9-428c-96a7-96aa931ed940" (UID: "799545f6-d5b9-428c-96a7-96aa931ed940"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:36:33.507290 master-2 kubenswrapper[4762]: I1014 13:36:33.507243 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/799545f6-d5b9-428c-96a7-96aa931ed940-config-data\") on node \"master-2\" DevicePath \"\"" Oct 14 13:36:33.507290 master-2 kubenswrapper[4762]: I1014 13:36:33.507286 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/799545f6-d5b9-428c-96a7-96aa931ed940-combined-ca-bundle\") on node \"master-2\" DevicePath \"\"" Oct 14 13:36:33.507489 master-2 kubenswrapper[4762]: I1014 13:36:33.507302 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tq92w\" (UniqueName: \"kubernetes.io/projected/799545f6-d5b9-428c-96a7-96aa931ed940-kube-api-access-tq92w\") on node \"master-2\" DevicePath \"\"" Oct 14 13:36:33.507489 master-2 kubenswrapper[4762]: I1014 13:36:33.507318 4762 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/799545f6-d5b9-428c-96a7-96aa931ed940-config-data-merged\") on node \"master-2\" DevicePath \"\"" Oct 14 13:36:33.507489 master-2 kubenswrapper[4762]: I1014 13:36:33.507330 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/799545f6-d5b9-428c-96a7-96aa931ed940-scripts\") on node \"master-2\" DevicePath \"\"" Oct 14 13:36:33.507489 master-2 kubenswrapper[4762]: I1014 13:36:33.507342 4762 reconciler_common.go:293] "Volume detached for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/799545f6-d5b9-428c-96a7-96aa931ed940-etc-podinfo\") on node \"master-2\" DevicePath \"\"" Oct 14 13:36:33.777012 master-2 kubenswrapper[4762]: I1014 13:36:33.776933 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-g5vgq" event={"ID":"799545f6-d5b9-428c-96a7-96aa931ed940","Type":"ContainerDied","Data":"d836c124649e13f260150a58704ce46399ae53356ed37800be691f98dcd75942"} Oct 14 13:36:33.777012 master-2 kubenswrapper[4762]: I1014 13:36:33.777011 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d836c124649e13f260150a58704ce46399ae53356ed37800be691f98dcd75942" Oct 14 13:36:33.777563 master-2 kubenswrapper[4762]: I1014 13:36:33.777532 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-db-sync-g5vgq" Oct 14 13:36:36.078662 master-2 kubenswrapper[4762]: I1014 13:36:36.078604 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-46645-api-0"] Oct 14 13:36:36.079774 master-2 kubenswrapper[4762]: E1014 13:36:36.079749 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="799545f6-d5b9-428c-96a7-96aa931ed940" containerName="init" Oct 14 13:36:36.079886 master-2 kubenswrapper[4762]: I1014 13:36:36.079868 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="799545f6-d5b9-428c-96a7-96aa931ed940" containerName="init" Oct 14 13:36:36.079993 master-2 kubenswrapper[4762]: E1014 13:36:36.079979 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="799545f6-d5b9-428c-96a7-96aa931ed940" containerName="ironic-db-sync" Oct 14 13:36:36.080073 master-2 kubenswrapper[4762]: I1014 13:36:36.080060 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="799545f6-d5b9-428c-96a7-96aa931ed940" containerName="ironic-db-sync" Oct 14 13:36:36.080382 master-2 kubenswrapper[4762]: I1014 13:36:36.080359 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="799545f6-d5b9-428c-96a7-96aa931ed940" containerName="ironic-db-sync" Oct 14 13:36:36.082625 master-2 kubenswrapper[4762]: I1014 13:36:36.082593 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-46645-api-0" Oct 14 13:36:36.088611 master-2 kubenswrapper[4762]: I1014 13:36:36.088512 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-46645-scripts" Oct 14 13:36:36.088792 master-2 kubenswrapper[4762]: I1014 13:36:36.088762 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-46645-config-data" Oct 14 13:36:36.088855 master-2 kubenswrapper[4762]: I1014 13:36:36.088803 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-46645-api-config-data" Oct 14 13:36:36.118549 master-2 kubenswrapper[4762]: I1014 13:36:36.118486 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-46645-api-0"] Oct 14 13:36:36.162547 master-2 kubenswrapper[4762]: I1014 13:36:36.162499 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/79e251e7-c9ad-4c0f-b502-5b39d7168ee2-etc-machine-id\") pod \"cinder-46645-api-0\" (UID: \"79e251e7-c9ad-4c0f-b502-5b39d7168ee2\") " pod="openstack/cinder-46645-api-0" Oct 14 13:36:36.162820 master-2 kubenswrapper[4762]: I1014 13:36:36.162803 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79e251e7-c9ad-4c0f-b502-5b39d7168ee2-logs\") pod \"cinder-46645-api-0\" (UID: \"79e251e7-c9ad-4c0f-b502-5b39d7168ee2\") " pod="openstack/cinder-46645-api-0" Oct 14 13:36:36.162920 master-2 kubenswrapper[4762]: I1014 13:36:36.162907 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79e251e7-c9ad-4c0f-b502-5b39d7168ee2-combined-ca-bundle\") pod \"cinder-46645-api-0\" (UID: \"79e251e7-c9ad-4c0f-b502-5b39d7168ee2\") " pod="openstack/cinder-46645-api-0" Oct 14 13:36:36.163037 master-2 kubenswrapper[4762]: I1014 13:36:36.163025 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/79e251e7-c9ad-4c0f-b502-5b39d7168ee2-scripts\") pod \"cinder-46645-api-0\" (UID: \"79e251e7-c9ad-4c0f-b502-5b39d7168ee2\") " pod="openstack/cinder-46645-api-0" Oct 14 13:36:36.163176 master-2 kubenswrapper[4762]: I1014 13:36:36.163144 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8pxx\" (UniqueName: \"kubernetes.io/projected/79e251e7-c9ad-4c0f-b502-5b39d7168ee2-kube-api-access-q8pxx\") pod \"cinder-46645-api-0\" (UID: \"79e251e7-c9ad-4c0f-b502-5b39d7168ee2\") " pod="openstack/cinder-46645-api-0" Oct 14 13:36:36.163275 master-2 kubenswrapper[4762]: I1014 13:36:36.163262 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79e251e7-c9ad-4c0f-b502-5b39d7168ee2-config-data\") pod \"cinder-46645-api-0\" (UID: \"79e251e7-c9ad-4c0f-b502-5b39d7168ee2\") " pod="openstack/cinder-46645-api-0" Oct 14 13:36:36.163388 master-2 kubenswrapper[4762]: I1014 13:36:36.163371 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79e251e7-c9ad-4c0f-b502-5b39d7168ee2-config-data-custom\") pod \"cinder-46645-api-0\" (UID: \"79e251e7-c9ad-4c0f-b502-5b39d7168ee2\") " pod="openstack/cinder-46645-api-0" Oct 14 13:36:36.265431 master-2 kubenswrapper[4762]: I1014 13:36:36.265059 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8pxx\" (UniqueName: \"kubernetes.io/projected/79e251e7-c9ad-4c0f-b502-5b39d7168ee2-kube-api-access-q8pxx\") pod \"cinder-46645-api-0\" (UID: \"79e251e7-c9ad-4c0f-b502-5b39d7168ee2\") " pod="openstack/cinder-46645-api-0" Oct 14 13:36:36.265849 master-2 kubenswrapper[4762]: I1014 13:36:36.265831 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79e251e7-c9ad-4c0f-b502-5b39d7168ee2-config-data\") pod \"cinder-46645-api-0\" (UID: \"79e251e7-c9ad-4c0f-b502-5b39d7168ee2\") " pod="openstack/cinder-46645-api-0" Oct 14 13:36:36.266074 master-2 kubenswrapper[4762]: I1014 13:36:36.266053 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79e251e7-c9ad-4c0f-b502-5b39d7168ee2-config-data-custom\") pod \"cinder-46645-api-0\" (UID: \"79e251e7-c9ad-4c0f-b502-5b39d7168ee2\") " pod="openstack/cinder-46645-api-0" Oct 14 13:36:36.266315 master-2 kubenswrapper[4762]: I1014 13:36:36.266292 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/79e251e7-c9ad-4c0f-b502-5b39d7168ee2-etc-machine-id\") pod \"cinder-46645-api-0\" (UID: \"79e251e7-c9ad-4c0f-b502-5b39d7168ee2\") " pod="openstack/cinder-46645-api-0" Oct 14 13:36:36.266493 master-2 kubenswrapper[4762]: I1014 13:36:36.266387 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/79e251e7-c9ad-4c0f-b502-5b39d7168ee2-etc-machine-id\") pod \"cinder-46645-api-0\" (UID: \"79e251e7-c9ad-4c0f-b502-5b39d7168ee2\") " pod="openstack/cinder-46645-api-0" Oct 14 13:36:36.266561 master-2 kubenswrapper[4762]: I1014 13:36:36.266453 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79e251e7-c9ad-4c0f-b502-5b39d7168ee2-logs\") pod 
\"cinder-46645-api-0\" (UID: \"79e251e7-c9ad-4c0f-b502-5b39d7168ee2\") " pod="openstack/cinder-46645-api-0" Oct 14 13:36:36.266723 master-2 kubenswrapper[4762]: I1014 13:36:36.266676 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79e251e7-c9ad-4c0f-b502-5b39d7168ee2-combined-ca-bundle\") pod \"cinder-46645-api-0\" (UID: \"79e251e7-c9ad-4c0f-b502-5b39d7168ee2\") " pod="openstack/cinder-46645-api-0" Oct 14 13:36:36.266878 master-2 kubenswrapper[4762]: I1014 13:36:36.266850 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79e251e7-c9ad-4c0f-b502-5b39d7168ee2-scripts\") pod \"cinder-46645-api-0\" (UID: \"79e251e7-c9ad-4c0f-b502-5b39d7168ee2\") " pod="openstack/cinder-46645-api-0" Oct 14 13:36:36.267297 master-2 kubenswrapper[4762]: I1014 13:36:36.267255 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79e251e7-c9ad-4c0f-b502-5b39d7168ee2-logs\") pod \"cinder-46645-api-0\" (UID: \"79e251e7-c9ad-4c0f-b502-5b39d7168ee2\") " pod="openstack/cinder-46645-api-0" Oct 14 13:36:36.270636 master-2 kubenswrapper[4762]: I1014 13:36:36.270020 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79e251e7-c9ad-4c0f-b502-5b39d7168ee2-scripts\") pod \"cinder-46645-api-0\" (UID: \"79e251e7-c9ad-4c0f-b502-5b39d7168ee2\") " pod="openstack/cinder-46645-api-0" Oct 14 13:36:36.274911 master-2 kubenswrapper[4762]: I1014 13:36:36.274859 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79e251e7-c9ad-4c0f-b502-5b39d7168ee2-config-data-custom\") pod \"cinder-46645-api-0\" (UID: \"79e251e7-c9ad-4c0f-b502-5b39d7168ee2\") " pod="openstack/cinder-46645-api-0" Oct 14 13:36:36.275550 master-2 kubenswrapper[4762]: I1014 13:36:36.275506 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79e251e7-c9ad-4c0f-b502-5b39d7168ee2-combined-ca-bundle\") pod \"cinder-46645-api-0\" (UID: \"79e251e7-c9ad-4c0f-b502-5b39d7168ee2\") " pod="openstack/cinder-46645-api-0" Oct 14 13:36:36.276253 master-2 kubenswrapper[4762]: I1014 13:36:36.276216 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-neutron-agent-c7795fc9c-45w5w"] Oct 14 13:36:36.277299 master-2 kubenswrapper[4762]: I1014 13:36:36.277278 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-neutron-agent-c7795fc9c-45w5w" Oct 14 13:36:36.277976 master-2 kubenswrapper[4762]: I1014 13:36:36.277939 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79e251e7-c9ad-4c0f-b502-5b39d7168ee2-config-data\") pod \"cinder-46645-api-0\" (UID: \"79e251e7-c9ad-4c0f-b502-5b39d7168ee2\") " pod="openstack/cinder-46645-api-0" Oct 14 13:36:36.286141 master-2 kubenswrapper[4762]: I1014 13:36:36.286093 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-ironic-neutron-agent-config-data" Oct 14 13:36:36.311424 master-2 kubenswrapper[4762]: I1014 13:36:36.311328 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-neutron-agent-c7795fc9c-45w5w"] Oct 14 13:36:36.330228 master-2 kubenswrapper[4762]: I1014 13:36:36.328790 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8pxx\" (UniqueName: \"kubernetes.io/projected/79e251e7-c9ad-4c0f-b502-5b39d7168ee2-kube-api-access-q8pxx\") pod \"cinder-46645-api-0\" (UID: \"79e251e7-c9ad-4c0f-b502-5b39d7168ee2\") " pod="openstack/cinder-46645-api-0" Oct 14 13:36:36.369266 master-2 kubenswrapper[4762]: I1014 13:36:36.369102 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3664024d-9ed9-48d5-9943-260774564949-config\") pod \"ironic-neutron-agent-c7795fc9c-45w5w\" (UID: \"3664024d-9ed9-48d5-9943-260774564949\") " pod="openstack/ironic-neutron-agent-c7795fc9c-45w5w" Oct 14 13:36:36.369266 master-2 kubenswrapper[4762]: I1014 13:36:36.369201 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8w2j8\" (UniqueName: \"kubernetes.io/projected/3664024d-9ed9-48d5-9943-260774564949-kube-api-access-8w2j8\") pod \"ironic-neutron-agent-c7795fc9c-45w5w\" (UID: \"3664024d-9ed9-48d5-9943-260774564949\") " pod="openstack/ironic-neutron-agent-c7795fc9c-45w5w" Oct 14 13:36:36.369583 master-2 kubenswrapper[4762]: I1014 13:36:36.369278 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3664024d-9ed9-48d5-9943-260774564949-combined-ca-bundle\") pod \"ironic-neutron-agent-c7795fc9c-45w5w\" (UID: \"3664024d-9ed9-48d5-9943-260774564949\") " pod="openstack/ironic-neutron-agent-c7795fc9c-45w5w" Oct 14 13:36:36.411693 master-2 kubenswrapper[4762]: I1014 13:36:36.411617 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-46645-api-0" Oct 14 13:36:36.482237 master-2 kubenswrapper[4762]: I1014 13:36:36.481750 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3664024d-9ed9-48d5-9943-260774564949-config\") pod \"ironic-neutron-agent-c7795fc9c-45w5w\" (UID: \"3664024d-9ed9-48d5-9943-260774564949\") " pod="openstack/ironic-neutron-agent-c7795fc9c-45w5w" Oct 14 13:36:36.482237 master-2 kubenswrapper[4762]: I1014 13:36:36.481823 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8w2j8\" (UniqueName: \"kubernetes.io/projected/3664024d-9ed9-48d5-9943-260774564949-kube-api-access-8w2j8\") pod \"ironic-neutron-agent-c7795fc9c-45w5w\" (UID: \"3664024d-9ed9-48d5-9943-260774564949\") " pod="openstack/ironic-neutron-agent-c7795fc9c-45w5w" Oct 14 13:36:36.482237 master-2 kubenswrapper[4762]: I1014 13:36:36.481900 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3664024d-9ed9-48d5-9943-260774564949-combined-ca-bundle\") pod \"ironic-neutron-agent-c7795fc9c-45w5w\" (UID: \"3664024d-9ed9-48d5-9943-260774564949\") " pod="openstack/ironic-neutron-agent-c7795fc9c-45w5w" Oct 14 13:36:36.487728 master-2 kubenswrapper[4762]: I1014 13:36:36.487684 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/3664024d-9ed9-48d5-9943-260774564949-config\") pod \"ironic-neutron-agent-c7795fc9c-45w5w\" (UID: \"3664024d-9ed9-48d5-9943-260774564949\") " pod="openstack/ironic-neutron-agent-c7795fc9c-45w5w" Oct 14 13:36:36.487728 master-2 kubenswrapper[4762]: I1014 13:36:36.487704 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3664024d-9ed9-48d5-9943-260774564949-combined-ca-bundle\") pod \"ironic-neutron-agent-c7795fc9c-45w5w\" (UID: \"3664024d-9ed9-48d5-9943-260774564949\") " pod="openstack/ironic-neutron-agent-c7795fc9c-45w5w" Oct 14 13:36:36.518854 master-2 kubenswrapper[4762]: I1014 13:36:36.518815 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8w2j8\" (UniqueName: \"kubernetes.io/projected/3664024d-9ed9-48d5-9943-260774564949-kube-api-access-8w2j8\") pod \"ironic-neutron-agent-c7795fc9c-45w5w\" (UID: \"3664024d-9ed9-48d5-9943-260774564949\") " pod="openstack/ironic-neutron-agent-c7795fc9c-45w5w" Oct 14 13:36:36.706797 master-2 kubenswrapper[4762]: I1014 13:36:36.706735 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-neutron-agent-c7795fc9c-45w5w" Oct 14 13:36:36.902346 master-2 kubenswrapper[4762]: I1014 13:36:36.902264 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-46645-api-0"] Oct 14 13:36:37.197196 master-2 kubenswrapper[4762]: I1014 13:36:37.190223 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-46645-default-external-api-1"] Oct 14 13:36:37.197196 master-2 kubenswrapper[4762]: I1014 13:36:37.192805 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-46645-default-external-api-1" Oct 14 13:36:37.197196 master-2 kubenswrapper[4762]: I1014 13:36:37.196172 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-46645-default-external-config-data" Oct 14 13:36:37.197196 master-2 kubenswrapper[4762]: I1014 13:36:37.196852 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 14 13:36:37.344706 master-2 kubenswrapper[4762]: W1014 13:36:37.344633 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3664024d_9ed9_48d5_9943_260774564949.slice/crio-cf88689661b6f19fd77281bf96c961033af5906bd4827c86324b028f202f3d7d WatchSource:0}: Error finding container cf88689661b6f19fd77281bf96c961033af5906bd4827c86324b028f202f3d7d: Status 404 returned error can't find the container with id cf88689661b6f19fd77281bf96c961033af5906bd4827c86324b028f202f3d7d Oct 14 13:36:37.346970 master-2 kubenswrapper[4762]: I1014 13:36:37.346932 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-neutron-agent-c7795fc9c-45w5w"] Oct 14 13:36:37.355381 master-2 kubenswrapper[4762]: I1014 13:36:37.355316 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-46645-default-external-api-1"] Oct 14 13:36:37.402406 master-2 kubenswrapper[4762]: I1014 13:36:37.402315 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd1a658e-0a8c-416a-9624-f80c6fbacde7-logs\") pod \"glance-46645-default-external-api-1\" (UID: \"bd1a658e-0a8c-416a-9624-f80c6fbacde7\") " pod="openstack/glance-46645-default-external-api-1" Oct 14 13:36:37.402406 master-2 kubenswrapper[4762]: I1014 13:36:37.402381 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a7796025-99aa-4d01-b91a-fcce6dec3c73\" (UniqueName: \"kubernetes.io/csi/topolvm.io^ae0639de-ce3d-4a5d-9e93-e76a0a1363d1\") pod \"glance-46645-default-external-api-1\" (UID: \"bd1a658e-0a8c-416a-9624-f80c6fbacde7\") " pod="openstack/glance-46645-default-external-api-1" Oct 14 13:36:37.402829 master-2 kubenswrapper[4762]: I1014 13:36:37.402455 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6lmt\" (UniqueName: \"kubernetes.io/projected/bd1a658e-0a8c-416a-9624-f80c6fbacde7-kube-api-access-b6lmt\") pod \"glance-46645-default-external-api-1\" (UID: \"bd1a658e-0a8c-416a-9624-f80c6fbacde7\") " pod="openstack/glance-46645-default-external-api-1" Oct 14 13:36:37.402829 master-2 kubenswrapper[4762]: I1014 13:36:37.402535 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd1a658e-0a8c-416a-9624-f80c6fbacde7-config-data\") pod \"glance-46645-default-external-api-1\" (UID: \"bd1a658e-0a8c-416a-9624-f80c6fbacde7\") " pod="openstack/glance-46645-default-external-api-1" Oct 14 13:36:37.402829 master-2 kubenswrapper[4762]: I1014 13:36:37.402578 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bd1a658e-0a8c-416a-9624-f80c6fbacde7-httpd-run\") pod \"glance-46645-default-external-api-1\" (UID: \"bd1a658e-0a8c-416a-9624-f80c6fbacde7\") " pod="openstack/glance-46645-default-external-api-1" Oct 14 13:36:37.402829 
master-2 kubenswrapper[4762]: I1014 13:36:37.402603 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd1a658e-0a8c-416a-9624-f80c6fbacde7-combined-ca-bundle\") pod \"glance-46645-default-external-api-1\" (UID: \"bd1a658e-0a8c-416a-9624-f80c6fbacde7\") " pod="openstack/glance-46645-default-external-api-1" Oct 14 13:36:37.402829 master-2 kubenswrapper[4762]: I1014 13:36:37.402630 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd1a658e-0a8c-416a-9624-f80c6fbacde7-scripts\") pod \"glance-46645-default-external-api-1\" (UID: \"bd1a658e-0a8c-416a-9624-f80c6fbacde7\") " pod="openstack/glance-46645-default-external-api-1" Oct 14 13:36:37.506900 master-2 kubenswrapper[4762]: I1014 13:36:37.505732 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd1a658e-0a8c-416a-9624-f80c6fbacde7-logs\") pod \"glance-46645-default-external-api-1\" (UID: \"bd1a658e-0a8c-416a-9624-f80c6fbacde7\") " pod="openstack/glance-46645-default-external-api-1" Oct 14 13:36:37.506900 master-2 kubenswrapper[4762]: I1014 13:36:37.505907 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a7796025-99aa-4d01-b91a-fcce6dec3c73\" (UniqueName: \"kubernetes.io/csi/topolvm.io^ae0639de-ce3d-4a5d-9e93-e76a0a1363d1\") pod \"glance-46645-default-external-api-1\" (UID: \"bd1a658e-0a8c-416a-9624-f80c6fbacde7\") " pod="openstack/glance-46645-default-external-api-1" Oct 14 13:36:37.506900 master-2 kubenswrapper[4762]: I1014 13:36:37.506079 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6lmt\" (UniqueName: \"kubernetes.io/projected/bd1a658e-0a8c-416a-9624-f80c6fbacde7-kube-api-access-b6lmt\") pod \"glance-46645-default-external-api-1\" (UID: \"bd1a658e-0a8c-416a-9624-f80c6fbacde7\") " pod="openstack/glance-46645-default-external-api-1" Oct 14 13:36:37.506900 master-2 kubenswrapper[4762]: I1014 13:36:37.506286 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd1a658e-0a8c-416a-9624-f80c6fbacde7-config-data\") pod \"glance-46645-default-external-api-1\" (UID: \"bd1a658e-0a8c-416a-9624-f80c6fbacde7\") " pod="openstack/glance-46645-default-external-api-1" Oct 14 13:36:37.506900 master-2 kubenswrapper[4762]: I1014 13:36:37.506383 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bd1a658e-0a8c-416a-9624-f80c6fbacde7-httpd-run\") pod \"glance-46645-default-external-api-1\" (UID: \"bd1a658e-0a8c-416a-9624-f80c6fbacde7\") " pod="openstack/glance-46645-default-external-api-1" Oct 14 13:36:37.506900 master-2 kubenswrapper[4762]: I1014 13:36:37.506458 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd1a658e-0a8c-416a-9624-f80c6fbacde7-combined-ca-bundle\") pod \"glance-46645-default-external-api-1\" (UID: \"bd1a658e-0a8c-416a-9624-f80c6fbacde7\") " pod="openstack/glance-46645-default-external-api-1" Oct 14 13:36:37.506900 master-2 kubenswrapper[4762]: I1014 13:36:37.506531 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/bd1a658e-0a8c-416a-9624-f80c6fbacde7-logs\") pod \"glance-46645-default-external-api-1\" (UID: \"bd1a658e-0a8c-416a-9624-f80c6fbacde7\") " pod="openstack/glance-46645-default-external-api-1" Oct 14 13:36:37.506900 master-2 kubenswrapper[4762]: I1014 13:36:37.506582 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd1a658e-0a8c-416a-9624-f80c6fbacde7-scripts\") pod \"glance-46645-default-external-api-1\" (UID: \"bd1a658e-0a8c-416a-9624-f80c6fbacde7\") " pod="openstack/glance-46645-default-external-api-1" Oct 14 13:36:37.506900 master-2 kubenswrapper[4762]: I1014 13:36:37.506848 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bd1a658e-0a8c-416a-9624-f80c6fbacde7-httpd-run\") pod \"glance-46645-default-external-api-1\" (UID: \"bd1a658e-0a8c-416a-9624-f80c6fbacde7\") " pod="openstack/glance-46645-default-external-api-1" Oct 14 13:36:37.514319 master-2 kubenswrapper[4762]: I1014 13:36:37.514275 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd1a658e-0a8c-416a-9624-f80c6fbacde7-scripts\") pod \"glance-46645-default-external-api-1\" (UID: \"bd1a658e-0a8c-416a-9624-f80c6fbacde7\") " pod="openstack/glance-46645-default-external-api-1" Oct 14 13:36:37.514671 master-2 kubenswrapper[4762]: I1014 13:36:37.514589 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd1a658e-0a8c-416a-9624-f80c6fbacde7-combined-ca-bundle\") pod \"glance-46645-default-external-api-1\" (UID: \"bd1a658e-0a8c-416a-9624-f80c6fbacde7\") " pod="openstack/glance-46645-default-external-api-1" Oct 14 13:36:37.514671 master-2 kubenswrapper[4762]: I1014 13:36:37.514604 4762 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 14 13:36:37.514960 master-2 kubenswrapper[4762]: I1014 13:36:37.514700 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a7796025-99aa-4d01-b91a-fcce6dec3c73\" (UniqueName: \"kubernetes.io/csi/topolvm.io^ae0639de-ce3d-4a5d-9e93-e76a0a1363d1\") pod \"glance-46645-default-external-api-1\" (UID: \"bd1a658e-0a8c-416a-9624-f80c6fbacde7\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/5685f65b7fddc01b30e41430caf46764761af798608234f6a998350abdb8ec95/globalmount\"" pod="openstack/glance-46645-default-external-api-1" Oct 14 13:36:37.517888 master-2 kubenswrapper[4762]: I1014 13:36:37.517853 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd1a658e-0a8c-416a-9624-f80c6fbacde7-config-data\") pod \"glance-46645-default-external-api-1\" (UID: \"bd1a658e-0a8c-416a-9624-f80c6fbacde7\") " pod="openstack/glance-46645-default-external-api-1" Oct 14 13:36:37.529325 master-2 kubenswrapper[4762]: I1014 13:36:37.529282 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6lmt\" (UniqueName: \"kubernetes.io/projected/bd1a658e-0a8c-416a-9624-f80c6fbacde7-kube-api-access-b6lmt\") pod \"glance-46645-default-external-api-1\" (UID: \"bd1a658e-0a8c-416a-9624-f80c6fbacde7\") " pod="openstack/glance-46645-default-external-api-1" Oct 14 13:36:37.625027 master-2 kubenswrapper[4762]: I1014 13:36:37.621216 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-46645-default-internal-api-0"] Oct 14 13:36:37.625027 master-2 kubenswrapper[4762]: I1014 13:36:37.623323 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:36:37.629354 master-2 kubenswrapper[4762]: I1014 13:36:37.627375 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-46645-default-internal-config-data" Oct 14 13:36:37.657915 master-2 kubenswrapper[4762]: I1014 13:36:37.657833 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-46645-default-internal-api-0"] Oct 14 13:36:37.718016 master-2 kubenswrapper[4762]: I1014 13:36:37.717950 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hckq9\" (UniqueName: \"kubernetes.io/projected/cbc23c12-5074-4846-8975-0ce11de825bc-kube-api-access-hckq9\") pod \"glance-46645-default-internal-api-0\" (UID: \"cbc23c12-5074-4846-8975-0ce11de825bc\") " pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:36:37.718257 master-2 kubenswrapper[4762]: I1014 13:36:37.718047 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbc23c12-5074-4846-8975-0ce11de825bc-scripts\") pod \"glance-46645-default-internal-api-0\" (UID: \"cbc23c12-5074-4846-8975-0ce11de825bc\") " pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:36:37.718257 master-2 kubenswrapper[4762]: I1014 13:36:37.718116 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ebefd099-2104-4e7d-ab98-cff57fdf6cf2\" (UniqueName: \"kubernetes.io/csi/topolvm.io^a7e8f86a-245f-40eb-b95a-9113ac1b399e\") pod \"glance-46645-default-internal-api-0\" (UID: \"cbc23c12-5074-4846-8975-0ce11de825bc\") " pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:36:37.718257 master-2 kubenswrapper[4762]: I1014 13:36:37.718205 
4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cbc23c12-5074-4846-8975-0ce11de825bc-httpd-run\") pod \"glance-46645-default-internal-api-0\" (UID: \"cbc23c12-5074-4846-8975-0ce11de825bc\") " pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:36:37.718418 master-2 kubenswrapper[4762]: I1014 13:36:37.718294 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbc23c12-5074-4846-8975-0ce11de825bc-logs\") pod \"glance-46645-default-internal-api-0\" (UID: \"cbc23c12-5074-4846-8975-0ce11de825bc\") " pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:36:37.718418 master-2 kubenswrapper[4762]: I1014 13:36:37.718339 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbc23c12-5074-4846-8975-0ce11de825bc-config-data\") pod \"glance-46645-default-internal-api-0\" (UID: \"cbc23c12-5074-4846-8975-0ce11de825bc\") " pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:36:37.718418 master-2 kubenswrapper[4762]: I1014 13:36:37.718382 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc23c12-5074-4846-8975-0ce11de825bc-combined-ca-bundle\") pod \"glance-46645-default-internal-api-0\" (UID: \"cbc23c12-5074-4846-8975-0ce11de825bc\") " pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:36:37.820253 master-2 kubenswrapper[4762]: I1014 13:36:37.820095 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hckq9\" (UniqueName: \"kubernetes.io/projected/cbc23c12-5074-4846-8975-0ce11de825bc-kube-api-access-hckq9\") pod \"glance-46645-default-internal-api-0\" (UID: \"cbc23c12-5074-4846-8975-0ce11de825bc\") " pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:36:37.820253 master-2 kubenswrapper[4762]: I1014 13:36:37.820233 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbc23c12-5074-4846-8975-0ce11de825bc-scripts\") pod \"glance-46645-default-internal-api-0\" (UID: \"cbc23c12-5074-4846-8975-0ce11de825bc\") " pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:36:37.820488 master-2 kubenswrapper[4762]: I1014 13:36:37.820301 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ebefd099-2104-4e7d-ab98-cff57fdf6cf2\" (UniqueName: \"kubernetes.io/csi/topolvm.io^a7e8f86a-245f-40eb-b95a-9113ac1b399e\") pod \"glance-46645-default-internal-api-0\" (UID: \"cbc23c12-5074-4846-8975-0ce11de825bc\") " pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:36:37.820488 master-2 kubenswrapper[4762]: I1014 13:36:37.820352 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cbc23c12-5074-4846-8975-0ce11de825bc-httpd-run\") pod \"glance-46645-default-internal-api-0\" (UID: \"cbc23c12-5074-4846-8975-0ce11de825bc\") " pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:36:37.820488 master-2 kubenswrapper[4762]: I1014 13:36:37.820387 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbc23c12-5074-4846-8975-0ce11de825bc-logs\") pod 
\"glance-46645-default-internal-api-0\" (UID: \"cbc23c12-5074-4846-8975-0ce11de825bc\") " pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:36:37.820488 master-2 kubenswrapper[4762]: I1014 13:36:37.820413 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbc23c12-5074-4846-8975-0ce11de825bc-config-data\") pod \"glance-46645-default-internal-api-0\" (UID: \"cbc23c12-5074-4846-8975-0ce11de825bc\") " pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:36:37.820488 master-2 kubenswrapper[4762]: I1014 13:36:37.820444 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc23c12-5074-4846-8975-0ce11de825bc-combined-ca-bundle\") pod \"glance-46645-default-internal-api-0\" (UID: \"cbc23c12-5074-4846-8975-0ce11de825bc\") " pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:36:37.821730 master-2 kubenswrapper[4762]: I1014 13:36:37.821682 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbc23c12-5074-4846-8975-0ce11de825bc-logs\") pod \"glance-46645-default-internal-api-0\" (UID: \"cbc23c12-5074-4846-8975-0ce11de825bc\") " pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:36:37.821974 master-2 kubenswrapper[4762]: I1014 13:36:37.821938 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cbc23c12-5074-4846-8975-0ce11de825bc-httpd-run\") pod \"glance-46645-default-internal-api-0\" (UID: \"cbc23c12-5074-4846-8975-0ce11de825bc\") " pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:36:37.822662 master-2 kubenswrapper[4762]: I1014 13:36:37.822642 4762 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 14 13:36:37.822725 master-2 kubenswrapper[4762]: I1014 13:36:37.822671 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ebefd099-2104-4e7d-ab98-cff57fdf6cf2\" (UniqueName: \"kubernetes.io/csi/topolvm.io^a7e8f86a-245f-40eb-b95a-9113ac1b399e\") pod \"glance-46645-default-internal-api-0\" (UID: \"cbc23c12-5074-4846-8975-0ce11de825bc\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/85c0fdc6533d16267c0e554f30a4834f2259832a15c3a4c0788a47abd2eca85c/globalmount\"" pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:36:37.825718 master-2 kubenswrapper[4762]: I1014 13:36:37.825112 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbc23c12-5074-4846-8975-0ce11de825bc-scripts\") pod \"glance-46645-default-internal-api-0\" (UID: \"cbc23c12-5074-4846-8975-0ce11de825bc\") " pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:36:37.825969 master-2 kubenswrapper[4762]: I1014 13:36:37.825719 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc23c12-5074-4846-8975-0ce11de825bc-combined-ca-bundle\") pod \"glance-46645-default-internal-api-0\" (UID: \"cbc23c12-5074-4846-8975-0ce11de825bc\") " pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:36:37.837934 master-2 kubenswrapper[4762]: I1014 13:36:37.837857 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-c7795fc9c-45w5w" event={"ID":"3664024d-9ed9-48d5-9943-260774564949","Type":"ContainerStarted","Data":"cf88689661b6f19fd77281bf96c961033af5906bd4827c86324b028f202f3d7d"} Oct 14 13:36:37.839232 master-2 kubenswrapper[4762]: I1014 13:36:37.839168 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-46645-api-0" event={"ID":"79e251e7-c9ad-4c0f-b502-5b39d7168ee2","Type":"ContainerStarted","Data":"1f2a1cc825c0bc9d817d14278d1d450d401ff127d85874f61dfb2b91a65a0688"} Oct 14 13:36:37.840532 master-2 kubenswrapper[4762]: I1014 13:36:37.840482 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbc23c12-5074-4846-8975-0ce11de825bc-config-data\") pod \"glance-46645-default-internal-api-0\" (UID: \"cbc23c12-5074-4846-8975-0ce11de825bc\") " pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:36:37.843230 master-2 kubenswrapper[4762]: I1014 13:36:37.842272 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hckq9\" (UniqueName: \"kubernetes.io/projected/cbc23c12-5074-4846-8975-0ce11de825bc-kube-api-access-hckq9\") pod \"glance-46645-default-internal-api-0\" (UID: \"cbc23c12-5074-4846-8975-0ce11de825bc\") " pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:36:38.904407 master-2 kubenswrapper[4762]: I1014 13:36:38.904068 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a7796025-99aa-4d01-b91a-fcce6dec3c73\" (UniqueName: \"kubernetes.io/csi/topolvm.io^ae0639de-ce3d-4a5d-9e93-e76a0a1363d1\") pod \"glance-46645-default-external-api-1\" (UID: \"bd1a658e-0a8c-416a-9624-f80c6fbacde7\") " pod="openstack/glance-46645-default-external-api-1" Oct 14 13:36:39.015307 master-2 kubenswrapper[4762]: I1014 13:36:39.015251 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-46645-default-external-api-1" Oct 14 13:36:39.707350 master-2 kubenswrapper[4762]: W1014 13:36:39.706049 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd1a658e_0a8c_416a_9624_f80c6fbacde7.slice/crio-a370eba9411b5d8bb312cfd8f3a4801fff3be55042bf68f488bf338a5164dcd0 WatchSource:0}: Error finding container a370eba9411b5d8bb312cfd8f3a4801fff3be55042bf68f488bf338a5164dcd0: Status 404 returned error can't find the container with id a370eba9411b5d8bb312cfd8f3a4801fff3be55042bf68f488bf338a5164dcd0 Oct 14 13:36:39.710301 master-2 kubenswrapper[4762]: I1014 13:36:39.709026 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-46645-default-external-api-1"] Oct 14 13:36:39.862421 master-2 kubenswrapper[4762]: I1014 13:36:39.862373 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-46645-default-external-api-1" event={"ID":"bd1a658e-0a8c-416a-9624-f80c6fbacde7","Type":"ContainerStarted","Data":"a370eba9411b5d8bb312cfd8f3a4801fff3be55042bf68f488bf338a5164dcd0"} Oct 14 13:36:39.864596 master-2 kubenswrapper[4762]: I1014 13:36:39.864540 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-c7795fc9c-45w5w" event={"ID":"3664024d-9ed9-48d5-9943-260774564949","Type":"ContainerStarted","Data":"ef6294ece758f7b8b5a4ae38e7be2f7112a3e48f68792da1121d24c275902445"} Oct 14 13:36:39.864752 master-2 kubenswrapper[4762]: I1014 13:36:39.864694 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-neutron-agent-c7795fc9c-45w5w" Oct 14 13:36:39.898188 master-2 kubenswrapper[4762]: I1014 13:36:39.897753 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-neutron-agent-c7795fc9c-45w5w" podStartSLOduration=2.041902178 podStartE2EDuration="3.897722536s" podCreationTimestamp="2025-10-14 13:36:36 +0000 UTC" firstStartedPulling="2025-10-14 13:36:37.34750084 +0000 UTC m=+1826.591660019" lastFinishedPulling="2025-10-14 13:36:39.203321208 +0000 UTC m=+1828.447480377" observedRunningTime="2025-10-14 13:36:39.885103104 +0000 UTC m=+1829.129262263" watchObservedRunningTime="2025-10-14 13:36:39.897722536 +0000 UTC m=+1829.141881695" Oct 14 13:36:40.345625 master-2 kubenswrapper[4762]: I1014 13:36:40.345531 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ebefd099-2104-4e7d-ab98-cff57fdf6cf2\" (UniqueName: \"kubernetes.io/csi/topolvm.io^a7e8f86a-245f-40eb-b95a-9113ac1b399e\") pod \"glance-46645-default-internal-api-0\" (UID: \"cbc23c12-5074-4846-8975-0ce11de825bc\") " pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:36:40.377058 master-2 kubenswrapper[4762]: I1014 13:36:40.376979 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:36:40.986354 master-2 kubenswrapper[4762]: I1014 13:36:40.986308 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-46645-default-internal-api-0"] Oct 14 13:36:41.465735 master-2 kubenswrapper[4762]: I1014 13:36:41.465675 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-conductor-0"] Oct 14 13:36:41.471931 master-2 kubenswrapper[4762]: I1014 13:36:41.471877 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-conductor-0" Oct 14 13:36:41.474886 master-2 kubenswrapper[4762]: I1014 13:36:41.474847 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-transport-url-ironic-transport" Oct 14 13:36:41.475012 master-2 kubenswrapper[4762]: I1014 13:36:41.474901 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-config-data" Oct 14 13:36:41.475247 master-2 kubenswrapper[4762]: I1014 13:36:41.475207 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-conductor-scripts" Oct 14 13:36:41.480508 master-2 kubenswrapper[4762]: I1014 13:36:41.476785 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-conductor-config-data" Oct 14 13:36:41.546807 master-2 kubenswrapper[4762]: I1014 13:36:41.546741 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-conductor-0"] Oct 14 13:36:41.688948 master-2 kubenswrapper[4762]: I1014 13:36:41.688801 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0d4da85-9147-43ff-b96c-81e9d3fffd69-scripts\") pod \"ironic-conductor-0\" (UID: \"d0d4da85-9147-43ff-b96c-81e9d3fffd69\") " pod="openstack/ironic-conductor-0" Oct 14 13:36:41.688948 master-2 kubenswrapper[4762]: I1014 13:36:41.688862 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0d4da85-9147-43ff-b96c-81e9d3fffd69-config-data\") pod \"ironic-conductor-0\" (UID: \"d0d4da85-9147-43ff-b96c-81e9d3fffd69\") " pod="openstack/ironic-conductor-0" Oct 14 13:36:41.688948 master-2 kubenswrapper[4762]: I1014 13:36:41.688920 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/d0d4da85-9147-43ff-b96c-81e9d3fffd69-etc-podinfo\") pod \"ironic-conductor-0\" (UID: \"d0d4da85-9147-43ff-b96c-81e9d3fffd69\") " pod="openstack/ironic-conductor-0" Oct 14 13:36:41.695666 master-2 kubenswrapper[4762]: I1014 13:36:41.688976 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d0d4da85-9147-43ff-b96c-81e9d3fffd69-config-data-custom\") pod \"ironic-conductor-0\" (UID: \"d0d4da85-9147-43ff-b96c-81e9d3fffd69\") " pod="openstack/ironic-conductor-0" Oct 14 13:36:41.695666 master-2 kubenswrapper[4762]: I1014 13:36:41.689011 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ab245682-ed9b-4f3c-83d5-9fabfb0d3d77\" (UniqueName: \"kubernetes.io/csi/topolvm.io^596832ad-f00f-4c35-8d36-175945dc88d4\") pod \"ironic-conductor-0\" (UID: \"d0d4da85-9147-43ff-b96c-81e9d3fffd69\") " pod="openstack/ironic-conductor-0" Oct 14 13:36:41.695666 master-2 kubenswrapper[4762]: I1014 13:36:41.689038 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6m4kk\" (UniqueName: \"kubernetes.io/projected/d0d4da85-9147-43ff-b96c-81e9d3fffd69-kube-api-access-6m4kk\") pod \"ironic-conductor-0\" (UID: \"d0d4da85-9147-43ff-b96c-81e9d3fffd69\") " pod="openstack/ironic-conductor-0" Oct 14 13:36:41.695666 master-2 kubenswrapper[4762]: I1014 13:36:41.689075 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0d4da85-9147-43ff-b96c-81e9d3fffd69-combined-ca-bundle\") pod \"ironic-conductor-0\" (UID: \"d0d4da85-9147-43ff-b96c-81e9d3fffd69\") " pod="openstack/ironic-conductor-0" Oct 14 13:36:41.695666 master-2 kubenswrapper[4762]: I1014 13:36:41.689137 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/d0d4da85-9147-43ff-b96c-81e9d3fffd69-config-data-merged\") pod \"ironic-conductor-0\" (UID: \"d0d4da85-9147-43ff-b96c-81e9d3fffd69\") " pod="openstack/ironic-conductor-0" Oct 14 13:36:41.791123 master-2 kubenswrapper[4762]: I1014 13:36:41.791015 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d0d4da85-9147-43ff-b96c-81e9d3fffd69-config-data-custom\") pod \"ironic-conductor-0\" (UID: \"d0d4da85-9147-43ff-b96c-81e9d3fffd69\") " pod="openstack/ironic-conductor-0" Oct 14 13:36:41.791123 master-2 kubenswrapper[4762]: I1014 13:36:41.791079 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ab245682-ed9b-4f3c-83d5-9fabfb0d3d77\" (UniqueName: \"kubernetes.io/csi/topolvm.io^596832ad-f00f-4c35-8d36-175945dc88d4\") pod \"ironic-conductor-0\" (UID: \"d0d4da85-9147-43ff-b96c-81e9d3fffd69\") " pod="openstack/ironic-conductor-0" Oct 14 13:36:41.791123 master-2 kubenswrapper[4762]: I1014 13:36:41.791111 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6m4kk\" (UniqueName: \"kubernetes.io/projected/d0d4da85-9147-43ff-b96c-81e9d3fffd69-kube-api-access-6m4kk\") pod \"ironic-conductor-0\" (UID: \"d0d4da85-9147-43ff-b96c-81e9d3fffd69\") " pod="openstack/ironic-conductor-0" Oct 14 13:36:41.791383 master-2 kubenswrapper[4762]: I1014 13:36:41.791138 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0d4da85-9147-43ff-b96c-81e9d3fffd69-combined-ca-bundle\") pod \"ironic-conductor-0\" (UID: \"d0d4da85-9147-43ff-b96c-81e9d3fffd69\") " pod="openstack/ironic-conductor-0" Oct 14 13:36:41.791383 master-2 kubenswrapper[4762]: I1014 13:36:41.791191 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/d0d4da85-9147-43ff-b96c-81e9d3fffd69-config-data-merged\") pod \"ironic-conductor-0\" (UID: \"d0d4da85-9147-43ff-b96c-81e9d3fffd69\") " pod="openstack/ironic-conductor-0" Oct 14 13:36:41.791383 master-2 kubenswrapper[4762]: I1014 13:36:41.791218 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0d4da85-9147-43ff-b96c-81e9d3fffd69-scripts\") pod \"ironic-conductor-0\" (UID: \"d0d4da85-9147-43ff-b96c-81e9d3fffd69\") " pod="openstack/ironic-conductor-0" Oct 14 13:36:41.791383 master-2 kubenswrapper[4762]: I1014 13:36:41.791250 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0d4da85-9147-43ff-b96c-81e9d3fffd69-config-data\") pod \"ironic-conductor-0\" (UID: \"d0d4da85-9147-43ff-b96c-81e9d3fffd69\") " pod="openstack/ironic-conductor-0" Oct 14 13:36:41.791383 master-2 kubenswrapper[4762]: I1014 13:36:41.791310 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: 
\"kubernetes.io/downward-api/d0d4da85-9147-43ff-b96c-81e9d3fffd69-etc-podinfo\") pod \"ironic-conductor-0\" (UID: \"d0d4da85-9147-43ff-b96c-81e9d3fffd69\") " pod="openstack/ironic-conductor-0" Oct 14 13:36:41.792202 master-2 kubenswrapper[4762]: I1014 13:36:41.792140 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/d0d4da85-9147-43ff-b96c-81e9d3fffd69-config-data-merged\") pod \"ironic-conductor-0\" (UID: \"d0d4da85-9147-43ff-b96c-81e9d3fffd69\") " pod="openstack/ironic-conductor-0" Oct 14 13:36:41.793374 master-2 kubenswrapper[4762]: I1014 13:36:41.793350 4762 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 14 13:36:41.793446 master-2 kubenswrapper[4762]: I1014 13:36:41.793388 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ab245682-ed9b-4f3c-83d5-9fabfb0d3d77\" (UniqueName: \"kubernetes.io/csi/topolvm.io^596832ad-f00f-4c35-8d36-175945dc88d4\") pod \"ironic-conductor-0\" (UID: \"d0d4da85-9147-43ff-b96c-81e9d3fffd69\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/6ba3a24a4d44235843e1010da89085bc8f9f1d04a1eb8e2d08ea59946bcbf186/globalmount\"" pod="openstack/ironic-conductor-0" Oct 14 13:36:41.794929 master-2 kubenswrapper[4762]: I1014 13:36:41.794889 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/d0d4da85-9147-43ff-b96c-81e9d3fffd69-etc-podinfo\") pod \"ironic-conductor-0\" (UID: \"d0d4da85-9147-43ff-b96c-81e9d3fffd69\") " pod="openstack/ironic-conductor-0" Oct 14 13:36:41.795807 master-2 kubenswrapper[4762]: I1014 13:36:41.795053 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0d4da85-9147-43ff-b96c-81e9d3fffd69-combined-ca-bundle\") pod \"ironic-conductor-0\" (UID: \"d0d4da85-9147-43ff-b96c-81e9d3fffd69\") " pod="openstack/ironic-conductor-0" Oct 14 13:36:41.795892 master-2 kubenswrapper[4762]: I1014 13:36:41.795834 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d0d4da85-9147-43ff-b96c-81e9d3fffd69-config-data-custom\") pod \"ironic-conductor-0\" (UID: \"d0d4da85-9147-43ff-b96c-81e9d3fffd69\") " pod="openstack/ironic-conductor-0" Oct 14 13:36:41.796719 master-2 kubenswrapper[4762]: I1014 13:36:41.796688 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0d4da85-9147-43ff-b96c-81e9d3fffd69-scripts\") pod \"ironic-conductor-0\" (UID: \"d0d4da85-9147-43ff-b96c-81e9d3fffd69\") " pod="openstack/ironic-conductor-0" Oct 14 13:36:41.809853 master-2 kubenswrapper[4762]: I1014 13:36:41.801711 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0d4da85-9147-43ff-b96c-81e9d3fffd69-config-data\") pod \"ironic-conductor-0\" (UID: \"d0d4da85-9147-43ff-b96c-81e9d3fffd69\") " pod="openstack/ironic-conductor-0" Oct 14 13:36:41.815176 master-2 kubenswrapper[4762]: I1014 13:36:41.815072 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6m4kk\" (UniqueName: \"kubernetes.io/projected/d0d4da85-9147-43ff-b96c-81e9d3fffd69-kube-api-access-6m4kk\") pod \"ironic-conductor-0\" (UID: \"d0d4da85-9147-43ff-b96c-81e9d3fffd69\") " 
pod="openstack/ironic-conductor-0" Oct 14 13:36:41.887728 master-2 kubenswrapper[4762]: I1014 13:36:41.887661 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-46645-default-internal-api-0" event={"ID":"cbc23c12-5074-4846-8975-0ce11de825bc","Type":"ContainerStarted","Data":"a119af22578a27f48f0e5e69fbb27f307abba8bfbe24e1d079868c88df48bc05"} Oct 14 13:36:42.193190 master-2 kubenswrapper[4762]: I1014 13:36:42.193036 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-57bb6bd49-t2th5"] Oct 14 13:36:42.195177 master-2 kubenswrapper[4762]: I1014 13:36:42.195123 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-57bb6bd49-t2th5" Oct 14 13:36:42.198023 master-2 kubenswrapper[4762]: I1014 13:36:42.197980 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Oct 14 13:36:42.198116 master-2 kubenswrapper[4762]: I1014 13:36:42.198049 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Oct 14 13:36:42.227545 master-2 kubenswrapper[4762]: I1014 13:36:42.224036 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-57bb6bd49-t2th5"] Oct 14 13:36:42.301574 master-2 kubenswrapper[4762]: I1014 13:36:42.301501 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/db5a2370-3cba-4680-ba3d-4243717cf1f4-config\") pod \"neutron-57bb6bd49-t2th5\" (UID: \"db5a2370-3cba-4680-ba3d-4243717cf1f4\") " pod="openstack/neutron-57bb6bd49-t2th5" Oct 14 13:36:42.301810 master-2 kubenswrapper[4762]: I1014 13:36:42.301660 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/db5a2370-3cba-4680-ba3d-4243717cf1f4-internal-tls-certs\") pod \"neutron-57bb6bd49-t2th5\" (UID: \"db5a2370-3cba-4680-ba3d-4243717cf1f4\") " pod="openstack/neutron-57bb6bd49-t2th5" Oct 14 13:36:42.301810 master-2 kubenswrapper[4762]: I1014 13:36:42.301719 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/db5a2370-3cba-4680-ba3d-4243717cf1f4-ovndb-tls-certs\") pod \"neutron-57bb6bd49-t2th5\" (UID: \"db5a2370-3cba-4680-ba3d-4243717cf1f4\") " pod="openstack/neutron-57bb6bd49-t2th5" Oct 14 13:36:42.301877 master-2 kubenswrapper[4762]: I1014 13:36:42.301834 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/db5a2370-3cba-4680-ba3d-4243717cf1f4-public-tls-certs\") pod \"neutron-57bb6bd49-t2th5\" (UID: \"db5a2370-3cba-4680-ba3d-4243717cf1f4\") " pod="openstack/neutron-57bb6bd49-t2th5" Oct 14 13:36:42.302018 master-2 kubenswrapper[4762]: I1014 13:36:42.301970 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86hlt\" (UniqueName: \"kubernetes.io/projected/db5a2370-3cba-4680-ba3d-4243717cf1f4-kube-api-access-86hlt\") pod \"neutron-57bb6bd49-t2th5\" (UID: \"db5a2370-3cba-4680-ba3d-4243717cf1f4\") " pod="openstack/neutron-57bb6bd49-t2th5" Oct 14 13:36:42.302088 master-2 kubenswrapper[4762]: I1014 13:36:42.302060 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/db5a2370-3cba-4680-ba3d-4243717cf1f4-combined-ca-bundle\") pod \"neutron-57bb6bd49-t2th5\" (UID: \"db5a2370-3cba-4680-ba3d-4243717cf1f4\") " pod="openstack/neutron-57bb6bd49-t2th5" Oct 14 13:36:42.302132 master-2 kubenswrapper[4762]: I1014 13:36:42.302099 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/db5a2370-3cba-4680-ba3d-4243717cf1f4-httpd-config\") pod \"neutron-57bb6bd49-t2th5\" (UID: \"db5a2370-3cba-4680-ba3d-4243717cf1f4\") " pod="openstack/neutron-57bb6bd49-t2th5" Oct 14 13:36:42.404793 master-2 kubenswrapper[4762]: I1014 13:36:42.404666 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86hlt\" (UniqueName: \"kubernetes.io/projected/db5a2370-3cba-4680-ba3d-4243717cf1f4-kube-api-access-86hlt\") pod \"neutron-57bb6bd49-t2th5\" (UID: \"db5a2370-3cba-4680-ba3d-4243717cf1f4\") " pod="openstack/neutron-57bb6bd49-t2th5" Oct 14 13:36:42.404793 master-2 kubenswrapper[4762]: I1014 13:36:42.404736 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db5a2370-3cba-4680-ba3d-4243717cf1f4-combined-ca-bundle\") pod \"neutron-57bb6bd49-t2th5\" (UID: \"db5a2370-3cba-4680-ba3d-4243717cf1f4\") " pod="openstack/neutron-57bb6bd49-t2th5" Oct 14 13:36:42.404793 master-2 kubenswrapper[4762]: I1014 13:36:42.404754 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/db5a2370-3cba-4680-ba3d-4243717cf1f4-httpd-config\") pod \"neutron-57bb6bd49-t2th5\" (UID: \"db5a2370-3cba-4680-ba3d-4243717cf1f4\") " pod="openstack/neutron-57bb6bd49-t2th5" Oct 14 13:36:42.405028 master-2 kubenswrapper[4762]: I1014 13:36:42.404804 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/db5a2370-3cba-4680-ba3d-4243717cf1f4-config\") pod \"neutron-57bb6bd49-t2th5\" (UID: \"db5a2370-3cba-4680-ba3d-4243717cf1f4\") " pod="openstack/neutron-57bb6bd49-t2th5" Oct 14 13:36:42.405028 master-2 kubenswrapper[4762]: I1014 13:36:42.404864 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/db5a2370-3cba-4680-ba3d-4243717cf1f4-internal-tls-certs\") pod \"neutron-57bb6bd49-t2th5\" (UID: \"db5a2370-3cba-4680-ba3d-4243717cf1f4\") " pod="openstack/neutron-57bb6bd49-t2th5" Oct 14 13:36:42.405028 master-2 kubenswrapper[4762]: I1014 13:36:42.404893 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/db5a2370-3cba-4680-ba3d-4243717cf1f4-ovndb-tls-certs\") pod \"neutron-57bb6bd49-t2th5\" (UID: \"db5a2370-3cba-4680-ba3d-4243717cf1f4\") " pod="openstack/neutron-57bb6bd49-t2th5" Oct 14 13:36:42.405028 master-2 kubenswrapper[4762]: I1014 13:36:42.404928 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/db5a2370-3cba-4680-ba3d-4243717cf1f4-public-tls-certs\") pod \"neutron-57bb6bd49-t2th5\" (UID: \"db5a2370-3cba-4680-ba3d-4243717cf1f4\") " pod="openstack/neutron-57bb6bd49-t2th5" Oct 14 13:36:42.412046 master-2 kubenswrapper[4762]: I1014 13:36:42.411991 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/db5a2370-3cba-4680-ba3d-4243717cf1f4-public-tls-certs\") pod \"neutron-57bb6bd49-t2th5\" (UID: \"db5a2370-3cba-4680-ba3d-4243717cf1f4\") " pod="openstack/neutron-57bb6bd49-t2th5" Oct 14 13:36:42.412046 master-2 kubenswrapper[4762]: I1014 13:36:42.412013 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/db5a2370-3cba-4680-ba3d-4243717cf1f4-httpd-config\") pod \"neutron-57bb6bd49-t2th5\" (UID: \"db5a2370-3cba-4680-ba3d-4243717cf1f4\") " pod="openstack/neutron-57bb6bd49-t2th5" Oct 14 13:36:42.415139 master-2 kubenswrapper[4762]: I1014 13:36:42.413819 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/db5a2370-3cba-4680-ba3d-4243717cf1f4-internal-tls-certs\") pod \"neutron-57bb6bd49-t2th5\" (UID: \"db5a2370-3cba-4680-ba3d-4243717cf1f4\") " pod="openstack/neutron-57bb6bd49-t2th5" Oct 14 13:36:42.415139 master-2 kubenswrapper[4762]: I1014 13:36:42.414273 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db5a2370-3cba-4680-ba3d-4243717cf1f4-combined-ca-bundle\") pod \"neutron-57bb6bd49-t2th5\" (UID: \"db5a2370-3cba-4680-ba3d-4243717cf1f4\") " pod="openstack/neutron-57bb6bd49-t2th5" Oct 14 13:36:42.416690 master-2 kubenswrapper[4762]: I1014 13:36:42.416522 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/db5a2370-3cba-4680-ba3d-4243717cf1f4-config\") pod \"neutron-57bb6bd49-t2th5\" (UID: \"db5a2370-3cba-4680-ba3d-4243717cf1f4\") " pod="openstack/neutron-57bb6bd49-t2th5" Oct 14 13:36:42.418068 master-2 kubenswrapper[4762]: I1014 13:36:42.417892 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/db5a2370-3cba-4680-ba3d-4243717cf1f4-ovndb-tls-certs\") pod \"neutron-57bb6bd49-t2th5\" (UID: \"db5a2370-3cba-4680-ba3d-4243717cf1f4\") " pod="openstack/neutron-57bb6bd49-t2th5" Oct 14 13:36:42.434193 master-2 kubenswrapper[4762]: I1014 13:36:42.434124 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86hlt\" (UniqueName: \"kubernetes.io/projected/db5a2370-3cba-4680-ba3d-4243717cf1f4-kube-api-access-86hlt\") pod \"neutron-57bb6bd49-t2th5\" (UID: \"db5a2370-3cba-4680-ba3d-4243717cf1f4\") " pod="openstack/neutron-57bb6bd49-t2th5" Oct 14 13:36:42.531437 master-2 kubenswrapper[4762]: I1014 13:36:42.531237 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-57bb6bd49-t2th5" Oct 14 13:36:42.897763 master-2 kubenswrapper[4762]: I1014 13:36:42.897717 4762 generic.go:334] "Generic (PLEG): container finished" podID="3664024d-9ed9-48d5-9943-260774564949" containerID="ef6294ece758f7b8b5a4ae38e7be2f7112a3e48f68792da1121d24c275902445" exitCode=1 Oct 14 13:36:42.897763 master-2 kubenswrapper[4762]: I1014 13:36:42.897762 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-c7795fc9c-45w5w" event={"ID":"3664024d-9ed9-48d5-9943-260774564949","Type":"ContainerDied","Data":"ef6294ece758f7b8b5a4ae38e7be2f7112a3e48f68792da1121d24c275902445"} Oct 14 13:36:42.898509 master-2 kubenswrapper[4762]: I1014 13:36:42.898477 4762 scope.go:117] "RemoveContainer" containerID="ef6294ece758f7b8b5a4ae38e7be2f7112a3e48f68792da1121d24c275902445" Oct 14 13:36:43.139912 master-2 kubenswrapper[4762]: I1014 13:36:43.139806 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-57bb6bd49-t2th5"] Oct 14 13:36:43.244276 master-2 kubenswrapper[4762]: I1014 13:36:43.244165 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ab245682-ed9b-4f3c-83d5-9fabfb0d3d77\" (UniqueName: \"kubernetes.io/csi/topolvm.io^596832ad-f00f-4c35-8d36-175945dc88d4\") pod \"ironic-conductor-0\" (UID: \"d0d4da85-9147-43ff-b96c-81e9d3fffd69\") " pod="openstack/ironic-conductor-0" Oct 14 13:36:43.296048 master-2 kubenswrapper[4762]: I1014 13:36:43.296002 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-conductor-0" Oct 14 13:36:43.911192 master-2 kubenswrapper[4762]: I1014 13:36:43.907339 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57bb6bd49-t2th5" event={"ID":"db5a2370-3cba-4680-ba3d-4243717cf1f4","Type":"ContainerStarted","Data":"6f2095d633552451155a56fe7996ef3b16f3464bab1d5d787d80d89a02a8d243"} Oct 14 13:36:43.911192 master-2 kubenswrapper[4762]: I1014 13:36:43.907390 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57bb6bd49-t2th5" event={"ID":"db5a2370-3cba-4680-ba3d-4243717cf1f4","Type":"ContainerStarted","Data":"43bf241bbb97ac5602571a57dea2ea1ea9136374b583650dc09b1688c2d13646"} Oct 14 13:36:43.911192 master-2 kubenswrapper[4762]: I1014 13:36:43.907399 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57bb6bd49-t2th5" event={"ID":"db5a2370-3cba-4680-ba3d-4243717cf1f4","Type":"ContainerStarted","Data":"4f99305e9c53d81782bc38fb0d33c22df4eecda71b1db4fd34387f5bbe2a3d52"} Oct 14 13:36:43.911192 master-2 kubenswrapper[4762]: I1014 13:36:43.907670 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-57bb6bd49-t2th5" Oct 14 13:36:43.911192 master-2 kubenswrapper[4762]: I1014 13:36:43.911147 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-c7795fc9c-45w5w" event={"ID":"3664024d-9ed9-48d5-9943-260774564949","Type":"ContainerStarted","Data":"e03cd8140915d9dd6591a72cd9571c5380834f5481a28bb4f9632b27d3c92f1e"} Oct 14 13:36:43.916181 master-2 kubenswrapper[4762]: I1014 13:36:43.911814 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-neutron-agent-c7795fc9c-45w5w" Oct 14 13:36:43.926214 master-2 kubenswrapper[4762]: I1014 13:36:43.925586 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-conductor-0"] Oct 14 13:36:43.936291 master-2 kubenswrapper[4762]: W1014 13:36:43.934771 4762 manager.go:1169] Failed 
to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0d4da85_9147_43ff_b96c_81e9d3fffd69.slice/crio-21680cfdf905087f1a3b226695ca8bf132af73a3b7e2f86d60a1d9f3ed0a1e19 WatchSource:0}: Error finding container 21680cfdf905087f1a3b226695ca8bf132af73a3b7e2f86d60a1d9f3ed0a1e19: Status 404 returned error can't find the container with id 21680cfdf905087f1a3b226695ca8bf132af73a3b7e2f86d60a1d9f3ed0a1e19 Oct 14 13:36:43.941468 master-2 kubenswrapper[4762]: I1014 13:36:43.941171 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-57bb6bd49-t2th5" podStartSLOduration=1.941137725 podStartE2EDuration="1.941137725s" podCreationTimestamp="2025-10-14 13:36:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:36:43.940134583 +0000 UTC m=+1833.184293742" watchObservedRunningTime="2025-10-14 13:36:43.941137725 +0000 UTC m=+1833.185296884" Oct 14 13:36:44.932886 master-2 kubenswrapper[4762]: I1014 13:36:44.932817 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"d0d4da85-9147-43ff-b96c-81e9d3fffd69","Type":"ContainerStarted","Data":"9ed8675a8580f3218b073cfde07b8219c2bff6612026e36dc92f1c1902d3f133"} Oct 14 13:36:44.932886 master-2 kubenswrapper[4762]: I1014 13:36:44.932870 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"d0d4da85-9147-43ff-b96c-81e9d3fffd69","Type":"ContainerStarted","Data":"21680cfdf905087f1a3b226695ca8bf132af73a3b7e2f86d60a1d9f3ed0a1e19"} Oct 14 13:36:45.952784 master-2 kubenswrapper[4762]: I1014 13:36:45.952675 4762 generic.go:334] "Generic (PLEG): container finished" podID="3664024d-9ed9-48d5-9943-260774564949" containerID="e03cd8140915d9dd6591a72cd9571c5380834f5481a28bb4f9632b27d3c92f1e" exitCode=1 Oct 14 13:36:45.956285 master-2 kubenswrapper[4762]: I1014 13:36:45.953577 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-c7795fc9c-45w5w" event={"ID":"3664024d-9ed9-48d5-9943-260774564949","Type":"ContainerDied","Data":"e03cd8140915d9dd6591a72cd9571c5380834f5481a28bb4f9632b27d3c92f1e"} Oct 14 13:36:45.956285 master-2 kubenswrapper[4762]: I1014 13:36:45.953613 4762 scope.go:117] "RemoveContainer" containerID="ef6294ece758f7b8b5a4ae38e7be2f7112a3e48f68792da1121d24c275902445" Oct 14 13:36:45.956285 master-2 kubenswrapper[4762]: I1014 13:36:45.953895 4762 scope.go:117] "RemoveContainer" containerID="e03cd8140915d9dd6591a72cd9571c5380834f5481a28bb4f9632b27d3c92f1e" Oct 14 13:36:45.956285 master-2 kubenswrapper[4762]: E1014 13:36:45.954142 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-neutron-agent\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-neutron-agent pod=ironic-neutron-agent-c7795fc9c-45w5w_openstack(3664024d-9ed9-48d5-9943-260774564949)\"" pod="openstack/ironic-neutron-agent-c7795fc9c-45w5w" podUID="3664024d-9ed9-48d5-9943-260774564949" Oct 14 13:36:46.707400 master-2 kubenswrapper[4762]: I1014 13:36:46.707323 4762 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ironic-neutron-agent-c7795fc9c-45w5w" Oct 14 13:36:46.986338 master-2 kubenswrapper[4762]: I1014 13:36:46.986281 4762 scope.go:117] "RemoveContainer" containerID="e03cd8140915d9dd6591a72cd9571c5380834f5481a28bb4f9632b27d3c92f1e" Oct 14 13:36:46.987451 master-2 
kubenswrapper[4762]: E1014 13:36:46.986607 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-neutron-agent\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-neutron-agent pod=ironic-neutron-agent-c7795fc9c-45w5w_openstack(3664024d-9ed9-48d5-9943-260774564949)\"" pod="openstack/ironic-neutron-agent-c7795fc9c-45w5w" podUID="3664024d-9ed9-48d5-9943-260774564949" Oct 14 13:36:47.798522 master-2 kubenswrapper[4762]: I1014 13:36:47.798458 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-5755976884-tp9vz" Oct 14 13:36:49.025658 master-2 kubenswrapper[4762]: I1014 13:36:49.025610 4762 generic.go:334] "Generic (PLEG): container finished" podID="d0d4da85-9147-43ff-b96c-81e9d3fffd69" containerID="9ed8675a8580f3218b073cfde07b8219c2bff6612026e36dc92f1c1902d3f133" exitCode=0 Oct 14 13:36:49.026407 master-2 kubenswrapper[4762]: I1014 13:36:49.025663 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"d0d4da85-9147-43ff-b96c-81e9d3fffd69","Type":"ContainerDied","Data":"9ed8675a8580f3218b073cfde07b8219c2bff6612026e36dc92f1c1902d3f133"} Oct 14 13:36:51.648766 master-2 kubenswrapper[4762]: I1014 13:36:51.648661 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-958c54db4-kctqr" Oct 14 13:36:51.681876 master-2 kubenswrapper[4762]: I1014 13:36:51.681813 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-958c54db4-kctqr" Oct 14 13:36:52.286119 master-2 kubenswrapper[4762]: I1014 13:36:52.286030 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 14 13:36:56.517136 master-2 kubenswrapper[4762]: I1014 13:36:56.517067 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:36:56.517982 master-2 kubenswrapper[4762]: I1014 13:36:56.517400 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cec9fba4-1a28-472b-976b-3fd3d9367772" containerName="ceilometer-central-agent" containerID="cri-o://47a8e63e4893125c500a5bc20abcb6fe8e2232eb088778319cd60a01e795d603" gracePeriod=30 Oct 14 13:36:56.517982 master-2 kubenswrapper[4762]: I1014 13:36:56.517494 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cec9fba4-1a28-472b-976b-3fd3d9367772" containerName="sg-core" containerID="cri-o://0ab705a863170920c4fd266182f50ebd265e98f57ca33ed14326881c3a32c5e1" gracePeriod=30 Oct 14 13:36:56.517982 master-2 kubenswrapper[4762]: I1014 13:36:56.517529 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cec9fba4-1a28-472b-976b-3fd3d9367772" containerName="ceilometer-notification-agent" containerID="cri-o://3efc717259ce9ea1d1dd607ab8c6145fce7f07f59ec75ecc3276645b81ae5372" gracePeriod=30 Oct 14 13:36:56.517982 master-2 kubenswrapper[4762]: I1014 13:36:56.517561 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cec9fba4-1a28-472b-976b-3fd3d9367772" containerName="proxy-httpd" containerID="cri-o://3dacb932f68868cabc25518a739adf2c105cb3c0cebad766785c017f886a5c76" gracePeriod=30 Oct 14 13:36:56.529755 master-2 kubenswrapper[4762]: I1014 13:36:56.529697 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/neutron-55c4fcb4cb-kwz75" Oct 14 13:36:57.150906 master-2 kubenswrapper[4762]: I1014 13:36:57.149536 4762 generic.go:334] "Generic (PLEG): container finished" podID="cec9fba4-1a28-472b-976b-3fd3d9367772" containerID="3dacb932f68868cabc25518a739adf2c105cb3c0cebad766785c017f886a5c76" exitCode=0 Oct 14 13:36:57.150906 master-2 kubenswrapper[4762]: I1014 13:36:57.149580 4762 generic.go:334] "Generic (PLEG): container finished" podID="cec9fba4-1a28-472b-976b-3fd3d9367772" containerID="0ab705a863170920c4fd266182f50ebd265e98f57ca33ed14326881c3a32c5e1" exitCode=2 Oct 14 13:36:57.150906 master-2 kubenswrapper[4762]: I1014 13:36:57.149594 4762 generic.go:334] "Generic (PLEG): container finished" podID="cec9fba4-1a28-472b-976b-3fd3d9367772" containerID="47a8e63e4893125c500a5bc20abcb6fe8e2232eb088778319cd60a01e795d603" exitCode=0 Oct 14 13:36:57.150906 master-2 kubenswrapper[4762]: I1014 13:36:57.149618 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cec9fba4-1a28-472b-976b-3fd3d9367772","Type":"ContainerDied","Data":"3dacb932f68868cabc25518a739adf2c105cb3c0cebad766785c017f886a5c76"} Oct 14 13:36:57.150906 master-2 kubenswrapper[4762]: I1014 13:36:57.149651 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cec9fba4-1a28-472b-976b-3fd3d9367772","Type":"ContainerDied","Data":"0ab705a863170920c4fd266182f50ebd265e98f57ca33ed14326881c3a32c5e1"} Oct 14 13:36:57.150906 master-2 kubenswrapper[4762]: I1014 13:36:57.149666 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cec9fba4-1a28-472b-976b-3fd3d9367772","Type":"ContainerDied","Data":"47a8e63e4893125c500a5bc20abcb6fe8e2232eb088778319cd60a01e795d603"} Oct 14 13:36:57.548961 master-2 kubenswrapper[4762]: I1014 13:36:57.548913 4762 scope.go:117] "RemoveContainer" containerID="e03cd8140915d9dd6591a72cd9571c5380834f5481a28bb4f9632b27d3c92f1e" Oct 14 13:36:58.098479 master-2 kubenswrapper[4762]: I1014 13:36:58.098417 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-588df4cd6f-vxqld"] Oct 14 13:36:58.099732 master-2 kubenswrapper[4762]: I1014 13:36:58.099691 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-588df4cd6f-vxqld" Oct 14 13:36:58.102367 master-2 kubenswrapper[4762]: I1014 13:36:58.102179 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Oct 14 13:36:58.102980 master-2 kubenswrapper[4762]: I1014 13:36:58.102854 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Oct 14 13:36:58.140250 master-2 kubenswrapper[4762]: I1014 13:36:58.140124 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-588df4cd6f-vxqld"] Oct 14 13:36:58.212182 master-2 kubenswrapper[4762]: I1014 13:36:58.212095 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzgbm\" (UniqueName: \"kubernetes.io/projected/b08db4eb-a86d-436c-9251-0326ff980d24-kube-api-access-fzgbm\") pod \"heat-cfnapi-588df4cd6f-vxqld\" (UID: \"b08db4eb-a86d-436c-9251-0326ff980d24\") " pod="openstack/heat-cfnapi-588df4cd6f-vxqld" Oct 14 13:36:58.212660 master-2 kubenswrapper[4762]: I1014 13:36:58.212635 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b08db4eb-a86d-436c-9251-0326ff980d24-config-data\") pod \"heat-cfnapi-588df4cd6f-vxqld\" (UID: \"b08db4eb-a86d-436c-9251-0326ff980d24\") " pod="openstack/heat-cfnapi-588df4cd6f-vxqld" Oct 14 13:36:58.213720 master-2 kubenswrapper[4762]: I1014 13:36:58.212863 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b08db4eb-a86d-436c-9251-0326ff980d24-combined-ca-bundle\") pod \"heat-cfnapi-588df4cd6f-vxqld\" (UID: \"b08db4eb-a86d-436c-9251-0326ff980d24\") " pod="openstack/heat-cfnapi-588df4cd6f-vxqld" Oct 14 13:36:58.213929 master-2 kubenswrapper[4762]: I1014 13:36:58.213906 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b08db4eb-a86d-436c-9251-0326ff980d24-config-data-custom\") pod \"heat-cfnapi-588df4cd6f-vxqld\" (UID: \"b08db4eb-a86d-436c-9251-0326ff980d24\") " pod="openstack/heat-cfnapi-588df4cd6f-vxqld" Oct 14 13:36:58.315845 master-2 kubenswrapper[4762]: I1014 13:36:58.315787 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzgbm\" (UniqueName: \"kubernetes.io/projected/b08db4eb-a86d-436c-9251-0326ff980d24-kube-api-access-fzgbm\") pod \"heat-cfnapi-588df4cd6f-vxqld\" (UID: \"b08db4eb-a86d-436c-9251-0326ff980d24\") " pod="openstack/heat-cfnapi-588df4cd6f-vxqld" Oct 14 13:36:58.315845 master-2 kubenswrapper[4762]: I1014 13:36:58.315855 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b08db4eb-a86d-436c-9251-0326ff980d24-config-data\") pod \"heat-cfnapi-588df4cd6f-vxqld\" (UID: \"b08db4eb-a86d-436c-9251-0326ff980d24\") " pod="openstack/heat-cfnapi-588df4cd6f-vxqld" Oct 14 13:36:58.316177 master-2 kubenswrapper[4762]: I1014 13:36:58.315917 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b08db4eb-a86d-436c-9251-0326ff980d24-combined-ca-bundle\") pod \"heat-cfnapi-588df4cd6f-vxqld\" (UID: \"b08db4eb-a86d-436c-9251-0326ff980d24\") " pod="openstack/heat-cfnapi-588df4cd6f-vxqld" Oct 14 13:36:58.316177 master-2 
kubenswrapper[4762]: I1014 13:36:58.315956 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b08db4eb-a86d-436c-9251-0326ff980d24-config-data-custom\") pod \"heat-cfnapi-588df4cd6f-vxqld\" (UID: \"b08db4eb-a86d-436c-9251-0326ff980d24\") " pod="openstack/heat-cfnapi-588df4cd6f-vxqld" Oct 14 13:36:58.320915 master-2 kubenswrapper[4762]: I1014 13:36:58.320872 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b08db4eb-a86d-436c-9251-0326ff980d24-combined-ca-bundle\") pod \"heat-cfnapi-588df4cd6f-vxqld\" (UID: \"b08db4eb-a86d-436c-9251-0326ff980d24\") " pod="openstack/heat-cfnapi-588df4cd6f-vxqld" Oct 14 13:36:58.321711 master-2 kubenswrapper[4762]: I1014 13:36:58.321642 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b08db4eb-a86d-436c-9251-0326ff980d24-config-data-custom\") pod \"heat-cfnapi-588df4cd6f-vxqld\" (UID: \"b08db4eb-a86d-436c-9251-0326ff980d24\") " pod="openstack/heat-cfnapi-588df4cd6f-vxqld" Oct 14 13:36:58.321945 master-2 kubenswrapper[4762]: I1014 13:36:58.321916 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b08db4eb-a86d-436c-9251-0326ff980d24-config-data\") pod \"heat-cfnapi-588df4cd6f-vxqld\" (UID: \"b08db4eb-a86d-436c-9251-0326ff980d24\") " pod="openstack/heat-cfnapi-588df4cd6f-vxqld" Oct 14 13:36:58.343927 master-2 kubenswrapper[4762]: I1014 13:36:58.343885 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzgbm\" (UniqueName: \"kubernetes.io/projected/b08db4eb-a86d-436c-9251-0326ff980d24-kube-api-access-fzgbm\") pod \"heat-cfnapi-588df4cd6f-vxqld\" (UID: \"b08db4eb-a86d-436c-9251-0326ff980d24\") " pod="openstack/heat-cfnapi-588df4cd6f-vxqld" Oct 14 13:36:58.423412 master-2 kubenswrapper[4762]: I1014 13:36:58.423356 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-588df4cd6f-vxqld" Oct 14 13:37:02.205261 master-2 kubenswrapper[4762]: I1014 13:37:02.204204 4762 generic.go:334] "Generic (PLEG): container finished" podID="cec9fba4-1a28-472b-976b-3fd3d9367772" containerID="3efc717259ce9ea1d1dd607ab8c6145fce7f07f59ec75ecc3276645b81ae5372" exitCode=0 Oct 14 13:37:02.205261 master-2 kubenswrapper[4762]: I1014 13:37:02.204235 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cec9fba4-1a28-472b-976b-3fd3d9367772","Type":"ContainerDied","Data":"3efc717259ce9ea1d1dd607ab8c6145fce7f07f59ec75ecc3276645b81ae5372"} Oct 14 13:37:02.437122 master-2 kubenswrapper[4762]: I1014 13:37:02.437075 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:37:02.523506 master-2 kubenswrapper[4762]: I1014 13:37:02.523446 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cec9fba4-1a28-472b-976b-3fd3d9367772-config-data\") pod \"cec9fba4-1a28-472b-976b-3fd3d9367772\" (UID: \"cec9fba4-1a28-472b-976b-3fd3d9367772\") " Oct 14 13:37:02.523725 master-2 kubenswrapper[4762]: I1014 13:37:02.523515 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cec9fba4-1a28-472b-976b-3fd3d9367772-scripts\") pod \"cec9fba4-1a28-472b-976b-3fd3d9367772\" (UID: \"cec9fba4-1a28-472b-976b-3fd3d9367772\") " Oct 14 13:37:02.523725 master-2 kubenswrapper[4762]: I1014 13:37:02.523579 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cec9fba4-1a28-472b-976b-3fd3d9367772-log-httpd\") pod \"cec9fba4-1a28-472b-976b-3fd3d9367772\" (UID: \"cec9fba4-1a28-472b-976b-3fd3d9367772\") " Oct 14 13:37:02.523725 master-2 kubenswrapper[4762]: I1014 13:37:02.523639 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cec9fba4-1a28-472b-976b-3fd3d9367772-sg-core-conf-yaml\") pod \"cec9fba4-1a28-472b-976b-3fd3d9367772\" (UID: \"cec9fba4-1a28-472b-976b-3fd3d9367772\") " Oct 14 13:37:02.523725 master-2 kubenswrapper[4762]: I1014 13:37:02.523679 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cec9fba4-1a28-472b-976b-3fd3d9367772-combined-ca-bundle\") pod \"cec9fba4-1a28-472b-976b-3fd3d9367772\" (UID: \"cec9fba4-1a28-472b-976b-3fd3d9367772\") " Oct 14 13:37:02.523892 master-2 kubenswrapper[4762]: I1014 13:37:02.523759 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cec9fba4-1a28-472b-976b-3fd3d9367772-run-httpd\") pod \"cec9fba4-1a28-472b-976b-3fd3d9367772\" (UID: \"cec9fba4-1a28-472b-976b-3fd3d9367772\") " Oct 14 13:37:02.523892 master-2 kubenswrapper[4762]: I1014 13:37:02.523824 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlr44\" (UniqueName: \"kubernetes.io/projected/cec9fba4-1a28-472b-976b-3fd3d9367772-kube-api-access-jlr44\") pod \"cec9fba4-1a28-472b-976b-3fd3d9367772\" (UID: \"cec9fba4-1a28-472b-976b-3fd3d9367772\") " Oct 14 13:37:02.525780 master-2 kubenswrapper[4762]: I1014 13:37:02.525441 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cec9fba4-1a28-472b-976b-3fd3d9367772-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "cec9fba4-1a28-472b-976b-3fd3d9367772" (UID: "cec9fba4-1a28-472b-976b-3fd3d9367772"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:37:02.526914 master-2 kubenswrapper[4762]: I1014 13:37:02.526757 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cec9fba4-1a28-472b-976b-3fd3d9367772-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "cec9fba4-1a28-472b-976b-3fd3d9367772" (UID: "cec9fba4-1a28-472b-976b-3fd3d9367772"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:37:02.527627 master-2 kubenswrapper[4762]: I1014 13:37:02.527583 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cec9fba4-1a28-472b-976b-3fd3d9367772-kube-api-access-jlr44" (OuterVolumeSpecName: "kube-api-access-jlr44") pod "cec9fba4-1a28-472b-976b-3fd3d9367772" (UID: "cec9fba4-1a28-472b-976b-3fd3d9367772"). InnerVolumeSpecName "kube-api-access-jlr44". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:37:02.530548 master-2 kubenswrapper[4762]: I1014 13:37:02.530499 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cec9fba4-1a28-472b-976b-3fd3d9367772-scripts" (OuterVolumeSpecName: "scripts") pod "cec9fba4-1a28-472b-976b-3fd3d9367772" (UID: "cec9fba4-1a28-472b-976b-3fd3d9367772"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:37:02.561110 master-2 kubenswrapper[4762]: I1014 13:37:02.556203 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cec9fba4-1a28-472b-976b-3fd3d9367772-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "cec9fba4-1a28-472b-976b-3fd3d9367772" (UID: "cec9fba4-1a28-472b-976b-3fd3d9367772"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:37:02.590511 master-2 kubenswrapper[4762]: I1014 13:37:02.590458 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-588df4cd6f-vxqld"] Oct 14 13:37:02.599079 master-2 kubenswrapper[4762]: I1014 13:37:02.599015 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cec9fba4-1a28-472b-976b-3fd3d9367772-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cec9fba4-1a28-472b-976b-3fd3d9367772" (UID: "cec9fba4-1a28-472b-976b-3fd3d9367772"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:37:02.626072 master-2 kubenswrapper[4762]: I1014 13:37:02.626022 4762 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cec9fba4-1a28-472b-976b-3fd3d9367772-run-httpd\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:02.626183 master-2 kubenswrapper[4762]: I1014 13:37:02.626076 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jlr44\" (UniqueName: \"kubernetes.io/projected/cec9fba4-1a28-472b-976b-3fd3d9367772-kube-api-access-jlr44\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:02.626183 master-2 kubenswrapper[4762]: I1014 13:37:02.626087 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cec9fba4-1a28-472b-976b-3fd3d9367772-scripts\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:02.626183 master-2 kubenswrapper[4762]: I1014 13:37:02.626095 4762 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cec9fba4-1a28-472b-976b-3fd3d9367772-log-httpd\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:02.626183 master-2 kubenswrapper[4762]: I1014 13:37:02.626103 4762 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cec9fba4-1a28-472b-976b-3fd3d9367772-sg-core-conf-yaml\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:02.626183 master-2 kubenswrapper[4762]: I1014 13:37:02.626132 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cec9fba4-1a28-472b-976b-3fd3d9367772-combined-ca-bundle\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:02.672083 master-2 kubenswrapper[4762]: I1014 13:37:02.672031 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cec9fba4-1a28-472b-976b-3fd3d9367772-config-data" (OuterVolumeSpecName: "config-data") pod "cec9fba4-1a28-472b-976b-3fd3d9367772" (UID: "cec9fba4-1a28-472b-976b-3fd3d9367772"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:37:02.727787 master-2 kubenswrapper[4762]: I1014 13:37:02.727734 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cec9fba4-1a28-472b-976b-3fd3d9367772-config-data\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:03.223185 master-2 kubenswrapper[4762]: I1014 13:37:03.223096 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-588df4cd6f-vxqld" event={"ID":"b08db4eb-a86d-436c-9251-0326ff980d24","Type":"ContainerStarted","Data":"71f3ea437892612dbab6b76632320fc3c40478a6f3c9b5e8ffdd505e916227d1"} Oct 14 13:37:03.233272 master-2 kubenswrapper[4762]: I1014 13:37:03.233194 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-46645-api-0" event={"ID":"79e251e7-c9ad-4c0f-b502-5b39d7168ee2","Type":"ContainerStarted","Data":"a8e56f2e7f8536d8c11eae78a35b2d13020574a622d3418aabf1fc48199d280c"} Oct 14 13:37:03.241305 master-2 kubenswrapper[4762]: I1014 13:37:03.241256 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:37:03.241305 master-2 kubenswrapper[4762]: I1014 13:37:03.241278 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cec9fba4-1a28-472b-976b-3fd3d9367772","Type":"ContainerDied","Data":"de0575b318761da189ae063cc60ae01f477a71cc86d4e0826bcbdf927520f730"} Oct 14 13:37:03.241519 master-2 kubenswrapper[4762]: I1014 13:37:03.241344 4762 scope.go:117] "RemoveContainer" containerID="3dacb932f68868cabc25518a739adf2c105cb3c0cebad766785c017f886a5c76" Oct 14 13:37:03.250208 master-2 kubenswrapper[4762]: I1014 13:37:03.250127 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-46645-default-external-api-1" event={"ID":"bd1a658e-0a8c-416a-9624-f80c6fbacde7","Type":"ContainerStarted","Data":"4ba58aeb6edcffa7e6218d4ba3fcc7dbaa369b3d5caea858ea267cd28a0f198e"} Oct 14 13:37:03.253343 master-2 kubenswrapper[4762]: I1014 13:37:03.252503 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-c7795fc9c-45w5w" event={"ID":"3664024d-9ed9-48d5-9943-260774564949","Type":"ContainerStarted","Data":"8993a79cfedb844ceff8ab779453b078ad7c7447950e766745aeda08206b595f"} Oct 14 13:37:03.253343 master-2 kubenswrapper[4762]: I1014 13:37:03.253300 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-neutron-agent-c7795fc9c-45w5w" Oct 14 13:37:03.256815 master-2 kubenswrapper[4762]: I1014 13:37:03.256670 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-46645-default-internal-api-0" event={"ID":"cbc23c12-5074-4846-8975-0ce11de825bc","Type":"ContainerStarted","Data":"86ab3cecfb7678393acadd9dd9fad69de6aa001061b1d2c229088112bafb8fb3"} Oct 14 13:37:03.277894 master-2 kubenswrapper[4762]: I1014 13:37:03.277855 4762 scope.go:117] "RemoveContainer" containerID="0ab705a863170920c4fd266182f50ebd265e98f57ca33ed14326881c3a32c5e1" Oct 14 13:37:03.302301 master-2 kubenswrapper[4762]: I1014 13:37:03.302212 4762 scope.go:117] "RemoveContainer" containerID="3efc717259ce9ea1d1dd607ab8c6145fce7f07f59ec75ecc3276645b81ae5372" Oct 14 13:37:03.342772 master-2 kubenswrapper[4762]: I1014 13:37:03.342709 4762 scope.go:117] "RemoveContainer" containerID="47a8e63e4893125c500a5bc20abcb6fe8e2232eb088778319cd60a01e795d603" Oct 14 13:37:03.345927 master-2 kubenswrapper[4762]: I1014 13:37:03.345894 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:37:03.356627 master-2 kubenswrapper[4762]: I1014 13:37:03.356511 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:37:03.397696 master-2 kubenswrapper[4762]: I1014 13:37:03.397645 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:37:03.398055 master-2 kubenswrapper[4762]: E1014 13:37:03.397974 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cec9fba4-1a28-472b-976b-3fd3d9367772" containerName="ceilometer-notification-agent" Oct 14 13:37:03.398055 master-2 kubenswrapper[4762]: I1014 13:37:03.398000 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="cec9fba4-1a28-472b-976b-3fd3d9367772" containerName="ceilometer-notification-agent" Oct 14 13:37:03.398055 master-2 kubenswrapper[4762]: E1014 13:37:03.398020 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cec9fba4-1a28-472b-976b-3fd3d9367772" containerName="proxy-httpd" Oct 14 13:37:03.398055 master-2 kubenswrapper[4762]: I1014 13:37:03.398027 4762 
state_mem.go:107] "Deleted CPUSet assignment" podUID="cec9fba4-1a28-472b-976b-3fd3d9367772" containerName="proxy-httpd" Oct 14 13:37:03.398055 master-2 kubenswrapper[4762]: E1014 13:37:03.398054 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cec9fba4-1a28-472b-976b-3fd3d9367772" containerName="sg-core" Oct 14 13:37:03.398055 master-2 kubenswrapper[4762]: I1014 13:37:03.398061 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="cec9fba4-1a28-472b-976b-3fd3d9367772" containerName="sg-core" Oct 14 13:37:03.398292 master-2 kubenswrapper[4762]: E1014 13:37:03.398076 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cec9fba4-1a28-472b-976b-3fd3d9367772" containerName="ceilometer-central-agent" Oct 14 13:37:03.398292 master-2 kubenswrapper[4762]: I1014 13:37:03.398082 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="cec9fba4-1a28-472b-976b-3fd3d9367772" containerName="ceilometer-central-agent" Oct 14 13:37:03.399416 master-2 kubenswrapper[4762]: I1014 13:37:03.399346 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="cec9fba4-1a28-472b-976b-3fd3d9367772" containerName="ceilometer-central-agent" Oct 14 13:37:03.399476 master-2 kubenswrapper[4762]: I1014 13:37:03.399465 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="cec9fba4-1a28-472b-976b-3fd3d9367772" containerName="proxy-httpd" Oct 14 13:37:03.399511 master-2 kubenswrapper[4762]: I1014 13:37:03.399488 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="cec9fba4-1a28-472b-976b-3fd3d9367772" containerName="sg-core" Oct 14 13:37:03.399511 master-2 kubenswrapper[4762]: I1014 13:37:03.399503 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="cec9fba4-1a28-472b-976b-3fd3d9367772" containerName="ceilometer-notification-agent" Oct 14 13:37:03.402058 master-2 kubenswrapper[4762]: I1014 13:37:03.401968 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:37:03.409331 master-2 kubenswrapper[4762]: I1014 13:37:03.404655 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 14 13:37:03.409331 master-2 kubenswrapper[4762]: I1014 13:37:03.405085 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 14 13:37:03.446703 master-2 kubenswrapper[4762]: I1014 13:37:03.445585 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c4cd0ac-29b3-4797-b65f-d720828842b9-config-data\") pod \"ceilometer-0\" (UID: \"3c4cd0ac-29b3-4797-b65f-d720828842b9\") " pod="openstack/ceilometer-0" Oct 14 13:37:03.446892 master-2 kubenswrapper[4762]: I1014 13:37:03.446817 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3c4cd0ac-29b3-4797-b65f-d720828842b9-log-httpd\") pod \"ceilometer-0\" (UID: \"3c4cd0ac-29b3-4797-b65f-d720828842b9\") " pod="openstack/ceilometer-0" Oct 14 13:37:03.447754 master-2 kubenswrapper[4762]: I1014 13:37:03.447715 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c4cd0ac-29b3-4797-b65f-d720828842b9-scripts\") pod \"ceilometer-0\" (UID: \"3c4cd0ac-29b3-4797-b65f-d720828842b9\") " pod="openstack/ceilometer-0" Oct 14 13:37:03.447901 master-2 kubenswrapper[4762]: I1014 13:37:03.447868 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dftsg\" (UniqueName: \"kubernetes.io/projected/3c4cd0ac-29b3-4797-b65f-d720828842b9-kube-api-access-dftsg\") pod \"ceilometer-0\" (UID: \"3c4cd0ac-29b3-4797-b65f-d720828842b9\") " pod="openstack/ceilometer-0" Oct 14 13:37:03.447983 master-2 kubenswrapper[4762]: I1014 13:37:03.447957 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3c4cd0ac-29b3-4797-b65f-d720828842b9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3c4cd0ac-29b3-4797-b65f-d720828842b9\") " pod="openstack/ceilometer-0" Oct 14 13:37:03.448144 master-2 kubenswrapper[4762]: I1014 13:37:03.448112 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3c4cd0ac-29b3-4797-b65f-d720828842b9-run-httpd\") pod \"ceilometer-0\" (UID: \"3c4cd0ac-29b3-4797-b65f-d720828842b9\") " pod="openstack/ceilometer-0" Oct 14 13:37:03.448223 master-2 kubenswrapper[4762]: I1014 13:37:03.448153 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c4cd0ac-29b3-4797-b65f-d720828842b9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3c4cd0ac-29b3-4797-b65f-d720828842b9\") " pod="openstack/ceilometer-0" Oct 14 13:37:03.450367 master-2 kubenswrapper[4762]: I1014 13:37:03.450332 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:37:03.550795 master-2 kubenswrapper[4762]: I1014 13:37:03.550738 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3c4cd0ac-29b3-4797-b65f-d720828842b9-log-httpd\") pod \"ceilometer-0\" (UID: 
\"3c4cd0ac-29b3-4797-b65f-d720828842b9\") " pod="openstack/ceilometer-0" Oct 14 13:37:03.550989 master-2 kubenswrapper[4762]: I1014 13:37:03.550862 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c4cd0ac-29b3-4797-b65f-d720828842b9-scripts\") pod \"ceilometer-0\" (UID: \"3c4cd0ac-29b3-4797-b65f-d720828842b9\") " pod="openstack/ceilometer-0" Oct 14 13:37:03.550989 master-2 kubenswrapper[4762]: I1014 13:37:03.550903 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dftsg\" (UniqueName: \"kubernetes.io/projected/3c4cd0ac-29b3-4797-b65f-d720828842b9-kube-api-access-dftsg\") pod \"ceilometer-0\" (UID: \"3c4cd0ac-29b3-4797-b65f-d720828842b9\") " pod="openstack/ceilometer-0" Oct 14 13:37:03.550989 master-2 kubenswrapper[4762]: I1014 13:37:03.550930 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3c4cd0ac-29b3-4797-b65f-d720828842b9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3c4cd0ac-29b3-4797-b65f-d720828842b9\") " pod="openstack/ceilometer-0" Oct 14 13:37:03.550989 master-2 kubenswrapper[4762]: I1014 13:37:03.550960 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3c4cd0ac-29b3-4797-b65f-d720828842b9-run-httpd\") pod \"ceilometer-0\" (UID: \"3c4cd0ac-29b3-4797-b65f-d720828842b9\") " pod="openstack/ceilometer-0" Oct 14 13:37:03.550989 master-2 kubenswrapper[4762]: I1014 13:37:03.550979 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c4cd0ac-29b3-4797-b65f-d720828842b9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3c4cd0ac-29b3-4797-b65f-d720828842b9\") " pod="openstack/ceilometer-0" Oct 14 13:37:03.551385 master-2 kubenswrapper[4762]: I1014 13:37:03.551028 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c4cd0ac-29b3-4797-b65f-d720828842b9-config-data\") pod \"ceilometer-0\" (UID: \"3c4cd0ac-29b3-4797-b65f-d720828842b9\") " pod="openstack/ceilometer-0" Oct 14 13:37:03.551804 master-2 kubenswrapper[4762]: I1014 13:37:03.551757 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3c4cd0ac-29b3-4797-b65f-d720828842b9-run-httpd\") pod \"ceilometer-0\" (UID: \"3c4cd0ac-29b3-4797-b65f-d720828842b9\") " pod="openstack/ceilometer-0" Oct 14 13:37:03.551891 master-2 kubenswrapper[4762]: I1014 13:37:03.551848 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3c4cd0ac-29b3-4797-b65f-d720828842b9-log-httpd\") pod \"ceilometer-0\" (UID: \"3c4cd0ac-29b3-4797-b65f-d720828842b9\") " pod="openstack/ceilometer-0" Oct 14 13:37:03.557442 master-2 kubenswrapper[4762]: I1014 13:37:03.557386 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3c4cd0ac-29b3-4797-b65f-d720828842b9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3c4cd0ac-29b3-4797-b65f-d720828842b9\") " pod="openstack/ceilometer-0" Oct 14 13:37:03.559017 master-2 kubenswrapper[4762]: I1014 13:37:03.558971 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3c4cd0ac-29b3-4797-b65f-d720828842b9-config-data\") pod \"ceilometer-0\" (UID: \"3c4cd0ac-29b3-4797-b65f-d720828842b9\") " pod="openstack/ceilometer-0" Oct 14 13:37:03.559490 master-2 kubenswrapper[4762]: I1014 13:37:03.559420 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c4cd0ac-29b3-4797-b65f-d720828842b9-scripts\") pod \"ceilometer-0\" (UID: \"3c4cd0ac-29b3-4797-b65f-d720828842b9\") " pod="openstack/ceilometer-0" Oct 14 13:37:03.561399 master-2 kubenswrapper[4762]: I1014 13:37:03.561364 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cec9fba4-1a28-472b-976b-3fd3d9367772" path="/var/lib/kubelet/pods/cec9fba4-1a28-472b-976b-3fd3d9367772/volumes" Oct 14 13:37:03.562346 master-2 kubenswrapper[4762]: I1014 13:37:03.562315 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c4cd0ac-29b3-4797-b65f-d720828842b9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3c4cd0ac-29b3-4797-b65f-d720828842b9\") " pod="openstack/ceilometer-0" Oct 14 13:37:03.568858 master-2 kubenswrapper[4762]: I1014 13:37:03.568804 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dftsg\" (UniqueName: \"kubernetes.io/projected/3c4cd0ac-29b3-4797-b65f-d720828842b9-kube-api-access-dftsg\") pod \"ceilometer-0\" (UID: \"3c4cd0ac-29b3-4797-b65f-d720828842b9\") " pod="openstack/ceilometer-0" Oct 14 13:37:03.723242 master-2 kubenswrapper[4762]: I1014 13:37:03.723123 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:37:04.267058 master-2 kubenswrapper[4762]: I1014 13:37:04.266911 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-46645-default-internal-api-0" event={"ID":"cbc23c12-5074-4846-8975-0ce11de825bc","Type":"ContainerStarted","Data":"7cb560a1b4d4a005a2217fdb0d36facc44578a3519e2d034432c2066e64b748a"} Oct 14 13:37:04.271382 master-2 kubenswrapper[4762]: I1014 13:37:04.271323 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-46645-api-0" event={"ID":"79e251e7-c9ad-4c0f-b502-5b39d7168ee2","Type":"ContainerStarted","Data":"84254745acf9fd64f6f181845e657957564db2cbf5ca52933f4618e0d0dd9383"} Oct 14 13:37:04.272292 master-2 kubenswrapper[4762]: I1014 13:37:04.272260 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-46645-api-0" Oct 14 13:37:04.287891 master-2 kubenswrapper[4762]: I1014 13:37:04.287824 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-46645-default-external-api-1" event={"ID":"bd1a658e-0a8c-416a-9624-f80c6fbacde7","Type":"ContainerStarted","Data":"7ed44c687a7980777262a5379a608149b92db276eaf5e68065ae60ccb661da2a"} Oct 14 13:37:04.323604 master-2 kubenswrapper[4762]: I1014 13:37:04.323458 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-46645-default-internal-api-0" podStartSLOduration=15.130557185 podStartE2EDuration="36.323439877s" podCreationTimestamp="2025-10-14 13:36:28 +0000 UTC" firstStartedPulling="2025-10-14 13:36:40.990996751 +0000 UTC m=+1830.235155910" lastFinishedPulling="2025-10-14 13:37:02.183879453 +0000 UTC m=+1851.428038602" observedRunningTime="2025-10-14 13:37:04.317008716 +0000 UTC m=+1853.561167875" watchObservedRunningTime="2025-10-14 13:37:04.323439877 +0000 UTC m=+1853.567599036" Oct 14 13:37:04.373714 master-2 
kubenswrapper[4762]: I1014 13:37:04.373607 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-46645-default-external-api-1" podStartSLOduration=14.959415563 podStartE2EDuration="37.37358907s" podCreationTimestamp="2025-10-14 13:36:27 +0000 UTC" firstStartedPulling="2025-10-14 13:36:39.707516027 +0000 UTC m=+1828.951675186" lastFinishedPulling="2025-10-14 13:37:02.121689534 +0000 UTC m=+1851.365848693" observedRunningTime="2025-10-14 13:37:04.368693677 +0000 UTC m=+1853.612852846" watchObservedRunningTime="2025-10-14 13:37:04.37358907 +0000 UTC m=+1853.617748249" Oct 14 13:37:04.404536 master-2 kubenswrapper[4762]: I1014 13:37:04.404444 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-46645-api-0" podStartSLOduration=3.212064641 podStartE2EDuration="28.404427252s" podCreationTimestamp="2025-10-14 13:36:36 +0000 UTC" firstStartedPulling="2025-10-14 13:36:36.904995845 +0000 UTC m=+1826.149155004" lastFinishedPulling="2025-10-14 13:37:02.097358456 +0000 UTC m=+1851.341517615" observedRunningTime="2025-10-14 13:37:04.401460799 +0000 UTC m=+1853.645619978" watchObservedRunningTime="2025-10-14 13:37:04.404427252 +0000 UTC m=+1853.648586411" Oct 14 13:37:04.445408 master-2 kubenswrapper[4762]: I1014 13:37:04.445287 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:37:04.499903 master-2 kubenswrapper[4762]: I1014 13:37:04.499842 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:37:05.297448 master-2 kubenswrapper[4762]: I1014 13:37:05.297378 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-6b955f9d4b-jsrg8"] Oct 14 13:37:05.298473 master-2 kubenswrapper[4762]: I1014 13:37:05.298453 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-6b955f9d4b-jsrg8" Oct 14 13:37:05.318931 master-2 kubenswrapper[4762]: I1014 13:37:05.318874 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3c4cd0ac-29b3-4797-b65f-d720828842b9","Type":"ContainerStarted","Data":"0984c84818c75f3808fd0e7926c4cf1db985111b51d46ec3860b1623af674422"} Oct 14 13:37:05.318931 master-2 kubenswrapper[4762]: I1014 13:37:05.318931 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3c4cd0ac-29b3-4797-b65f-d720828842b9","Type":"ContainerStarted","Data":"5633215fa77b6696d27ec7a683e6d1b35b1b621d2da3d5afe06485bfd5ea4f2d"} Oct 14 13:37:05.343445 master-2 kubenswrapper[4762]: I1014 13:37:05.343011 4762 generic.go:334] "Generic (PLEG): container finished" podID="3664024d-9ed9-48d5-9943-260774564949" containerID="8993a79cfedb844ceff8ab779453b078ad7c7447950e766745aeda08206b595f" exitCode=1 Oct 14 13:37:05.343445 master-2 kubenswrapper[4762]: I1014 13:37:05.343090 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-c7795fc9c-45w5w" event={"ID":"3664024d-9ed9-48d5-9943-260774564949","Type":"ContainerDied","Data":"8993a79cfedb844ceff8ab779453b078ad7c7447950e766745aeda08206b595f"} Oct 14 13:37:05.343445 master-2 kubenswrapper[4762]: I1014 13:37:05.343233 4762 scope.go:117] "RemoveContainer" containerID="e03cd8140915d9dd6591a72cd9571c5380834f5481a28bb4f9632b27d3c92f1e" Oct 14 13:37:05.345994 master-2 kubenswrapper[4762]: I1014 13:37:05.345907 4762 scope.go:117] "RemoveContainer" containerID="8993a79cfedb844ceff8ab779453b078ad7c7447950e766745aeda08206b595f" Oct 14 13:37:05.346519 master-2 kubenswrapper[4762]: E1014 13:37:05.346479 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-neutron-agent\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ironic-neutron-agent pod=ironic-neutron-agent-c7795fc9c-45w5w_openstack(3664024d-9ed9-48d5-9943-260774564949)\"" pod="openstack/ironic-neutron-agent-c7795fc9c-45w5w" podUID="3664024d-9ed9-48d5-9943-260774564949" Oct 14 13:37:05.358506 master-2 kubenswrapper[4762]: I1014 13:37:05.358437 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6b955f9d4b-jsrg8"] Oct 14 13:37:05.388102 master-2 kubenswrapper[4762]: I1014 13:37:05.388044 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f99144b-486d-4a4b-8b68-294fd70a0f49-combined-ca-bundle\") pod \"heat-cfnapi-6b955f9d4b-jsrg8\" (UID: \"3f99144b-486d-4a4b-8b68-294fd70a0f49\") " pod="openstack/heat-cfnapi-6b955f9d4b-jsrg8" Oct 14 13:37:05.388352 master-2 kubenswrapper[4762]: I1014 13:37:05.388164 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f99144b-486d-4a4b-8b68-294fd70a0f49-config-data\") pod \"heat-cfnapi-6b955f9d4b-jsrg8\" (UID: \"3f99144b-486d-4a4b-8b68-294fd70a0f49\") " pod="openstack/heat-cfnapi-6b955f9d4b-jsrg8" Oct 14 13:37:05.388352 master-2 kubenswrapper[4762]: I1014 13:37:05.388222 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phbr9\" (UniqueName: \"kubernetes.io/projected/3f99144b-486d-4a4b-8b68-294fd70a0f49-kube-api-access-phbr9\") pod \"heat-cfnapi-6b955f9d4b-jsrg8\" (UID: \"3f99144b-486d-4a4b-8b68-294fd70a0f49\") " 
pod="openstack/heat-cfnapi-6b955f9d4b-jsrg8" Oct 14 13:37:05.388434 master-2 kubenswrapper[4762]: I1014 13:37:05.388405 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3f99144b-486d-4a4b-8b68-294fd70a0f49-config-data-custom\") pod \"heat-cfnapi-6b955f9d4b-jsrg8\" (UID: \"3f99144b-486d-4a4b-8b68-294fd70a0f49\") " pod="openstack/heat-cfnapi-6b955f9d4b-jsrg8" Oct 14 13:37:05.496440 master-2 kubenswrapper[4762]: I1014 13:37:05.496377 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f99144b-486d-4a4b-8b68-294fd70a0f49-combined-ca-bundle\") pod \"heat-cfnapi-6b955f9d4b-jsrg8\" (UID: \"3f99144b-486d-4a4b-8b68-294fd70a0f49\") " pod="openstack/heat-cfnapi-6b955f9d4b-jsrg8" Oct 14 13:37:05.496658 master-2 kubenswrapper[4762]: I1014 13:37:05.496582 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f99144b-486d-4a4b-8b68-294fd70a0f49-config-data\") pod \"heat-cfnapi-6b955f9d4b-jsrg8\" (UID: \"3f99144b-486d-4a4b-8b68-294fd70a0f49\") " pod="openstack/heat-cfnapi-6b955f9d4b-jsrg8" Oct 14 13:37:05.496658 master-2 kubenswrapper[4762]: I1014 13:37:05.496642 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phbr9\" (UniqueName: \"kubernetes.io/projected/3f99144b-486d-4a4b-8b68-294fd70a0f49-kube-api-access-phbr9\") pod \"heat-cfnapi-6b955f9d4b-jsrg8\" (UID: \"3f99144b-486d-4a4b-8b68-294fd70a0f49\") " pod="openstack/heat-cfnapi-6b955f9d4b-jsrg8" Oct 14 13:37:05.498691 master-2 kubenswrapper[4762]: I1014 13:37:05.498670 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3f99144b-486d-4a4b-8b68-294fd70a0f49-config-data-custom\") pod \"heat-cfnapi-6b955f9d4b-jsrg8\" (UID: \"3f99144b-486d-4a4b-8b68-294fd70a0f49\") " pod="openstack/heat-cfnapi-6b955f9d4b-jsrg8" Oct 14 13:37:05.500745 master-2 kubenswrapper[4762]: I1014 13:37:05.499155 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f99144b-486d-4a4b-8b68-294fd70a0f49-combined-ca-bundle\") pod \"heat-cfnapi-6b955f9d4b-jsrg8\" (UID: \"3f99144b-486d-4a4b-8b68-294fd70a0f49\") " pod="openstack/heat-cfnapi-6b955f9d4b-jsrg8" Oct 14 13:37:05.504528 master-2 kubenswrapper[4762]: I1014 13:37:05.504489 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3f99144b-486d-4a4b-8b68-294fd70a0f49-config-data-custom\") pod \"heat-cfnapi-6b955f9d4b-jsrg8\" (UID: \"3f99144b-486d-4a4b-8b68-294fd70a0f49\") " pod="openstack/heat-cfnapi-6b955f9d4b-jsrg8" Oct 14 13:37:05.512759 master-2 kubenswrapper[4762]: I1014 13:37:05.512705 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f99144b-486d-4a4b-8b68-294fd70a0f49-config-data\") pod \"heat-cfnapi-6b955f9d4b-jsrg8\" (UID: \"3f99144b-486d-4a4b-8b68-294fd70a0f49\") " pod="openstack/heat-cfnapi-6b955f9d4b-jsrg8" Oct 14 13:37:05.529321 master-2 kubenswrapper[4762]: I1014 13:37:05.529167 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phbr9\" (UniqueName: \"kubernetes.io/projected/3f99144b-486d-4a4b-8b68-294fd70a0f49-kube-api-access-phbr9\") pod 
\"heat-cfnapi-6b955f9d4b-jsrg8\" (UID: \"3f99144b-486d-4a4b-8b68-294fd70a0f49\") " pod="openstack/heat-cfnapi-6b955f9d4b-jsrg8" Oct 14 13:37:05.631504 master-2 kubenswrapper[4762]: I1014 13:37:05.631440 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6b955f9d4b-jsrg8" Oct 14 13:37:06.128517 master-2 kubenswrapper[4762]: I1014 13:37:06.128465 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6b955f9d4b-jsrg8"] Oct 14 13:37:06.602250 master-2 kubenswrapper[4762]: I1014 13:37:06.602184 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-588df4cd6f-vxqld"] Oct 14 13:37:06.647969 master-2 kubenswrapper[4762]: I1014 13:37:06.645580 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-6c687bcb45-wwnml"] Oct 14 13:37:06.648508 master-2 kubenswrapper[4762]: I1014 13:37:06.648474 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6c687bcb45-wwnml" Oct 14 13:37:06.650964 master-2 kubenswrapper[4762]: I1014 13:37:06.650926 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-public-svc" Oct 14 13:37:06.651157 master-2 kubenswrapper[4762]: I1014 13:37:06.651104 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-internal-svc" Oct 14 13:37:06.674611 master-2 kubenswrapper[4762]: I1014 13:37:06.674533 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6c687bcb45-wwnml"] Oct 14 13:37:06.707970 master-2 kubenswrapper[4762]: I1014 13:37:06.707840 4762 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ironic-neutron-agent-c7795fc9c-45w5w" Oct 14 13:37:06.708630 master-2 kubenswrapper[4762]: I1014 13:37:06.708607 4762 scope.go:117] "RemoveContainer" containerID="8993a79cfedb844ceff8ab779453b078ad7c7447950e766745aeda08206b595f" Oct 14 13:37:06.708862 master-2 kubenswrapper[4762]: E1014 13:37:06.708831 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-neutron-agent\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ironic-neutron-agent pod=ironic-neutron-agent-c7795fc9c-45w5w_openstack(3664024d-9ed9-48d5-9943-260774564949)\"" pod="openstack/ironic-neutron-agent-c7795fc9c-45w5w" podUID="3664024d-9ed9-48d5-9943-260774564949" Oct 14 13:37:06.735474 master-2 kubenswrapper[4762]: I1014 13:37:06.735404 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7491603-0a38-4539-b476-ba9a70d5ff86-public-tls-certs\") pod \"heat-cfnapi-6c687bcb45-wwnml\" (UID: \"a7491603-0a38-4539-b476-ba9a70d5ff86\") " pod="openstack/heat-cfnapi-6c687bcb45-wwnml" Oct 14 13:37:06.735474 master-2 kubenswrapper[4762]: I1014 13:37:06.735478 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7491603-0a38-4539-b476-ba9a70d5ff86-internal-tls-certs\") pod \"heat-cfnapi-6c687bcb45-wwnml\" (UID: \"a7491603-0a38-4539-b476-ba9a70d5ff86\") " pod="openstack/heat-cfnapi-6c687bcb45-wwnml" Oct 14 13:37:06.735896 master-2 kubenswrapper[4762]: I1014 13:37:06.735509 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a7491603-0a38-4539-b476-ba9a70d5ff86-combined-ca-bundle\") pod \"heat-cfnapi-6c687bcb45-wwnml\" (UID: \"a7491603-0a38-4539-b476-ba9a70d5ff86\") " pod="openstack/heat-cfnapi-6c687bcb45-wwnml" Oct 14 13:37:06.735896 master-2 kubenswrapper[4762]: I1014 13:37:06.735533 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7491603-0a38-4539-b476-ba9a70d5ff86-config-data-custom\") pod \"heat-cfnapi-6c687bcb45-wwnml\" (UID: \"a7491603-0a38-4539-b476-ba9a70d5ff86\") " pod="openstack/heat-cfnapi-6c687bcb45-wwnml" Oct 14 13:37:06.735896 master-2 kubenswrapper[4762]: I1014 13:37:06.735573 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdhbp\" (UniqueName: \"kubernetes.io/projected/a7491603-0a38-4539-b476-ba9a70d5ff86-kube-api-access-hdhbp\") pod \"heat-cfnapi-6c687bcb45-wwnml\" (UID: \"a7491603-0a38-4539-b476-ba9a70d5ff86\") " pod="openstack/heat-cfnapi-6c687bcb45-wwnml" Oct 14 13:37:06.735896 master-2 kubenswrapper[4762]: I1014 13:37:06.735840 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7491603-0a38-4539-b476-ba9a70d5ff86-config-data\") pod \"heat-cfnapi-6c687bcb45-wwnml\" (UID: \"a7491603-0a38-4539-b476-ba9a70d5ff86\") " pod="openstack/heat-cfnapi-6c687bcb45-wwnml" Oct 14 13:37:06.837917 master-2 kubenswrapper[4762]: I1014 13:37:06.837831 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdhbp\" (UniqueName: \"kubernetes.io/projected/a7491603-0a38-4539-b476-ba9a70d5ff86-kube-api-access-hdhbp\") pod \"heat-cfnapi-6c687bcb45-wwnml\" (UID: \"a7491603-0a38-4539-b476-ba9a70d5ff86\") " pod="openstack/heat-cfnapi-6c687bcb45-wwnml" Oct 14 13:37:06.837917 master-2 kubenswrapper[4762]: I1014 13:37:06.837922 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7491603-0a38-4539-b476-ba9a70d5ff86-config-data\") pod \"heat-cfnapi-6c687bcb45-wwnml\" (UID: \"a7491603-0a38-4539-b476-ba9a70d5ff86\") " pod="openstack/heat-cfnapi-6c687bcb45-wwnml" Oct 14 13:37:06.839540 master-2 kubenswrapper[4762]: I1014 13:37:06.837969 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7491603-0a38-4539-b476-ba9a70d5ff86-public-tls-certs\") pod \"heat-cfnapi-6c687bcb45-wwnml\" (UID: \"a7491603-0a38-4539-b476-ba9a70d5ff86\") " pod="openstack/heat-cfnapi-6c687bcb45-wwnml" Oct 14 13:37:06.839540 master-2 kubenswrapper[4762]: I1014 13:37:06.838004 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7491603-0a38-4539-b476-ba9a70d5ff86-internal-tls-certs\") pod \"heat-cfnapi-6c687bcb45-wwnml\" (UID: \"a7491603-0a38-4539-b476-ba9a70d5ff86\") " pod="openstack/heat-cfnapi-6c687bcb45-wwnml" Oct 14 13:37:06.839540 master-2 kubenswrapper[4762]: I1014 13:37:06.838035 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7491603-0a38-4539-b476-ba9a70d5ff86-combined-ca-bundle\") pod \"heat-cfnapi-6c687bcb45-wwnml\" (UID: \"a7491603-0a38-4539-b476-ba9a70d5ff86\") " pod="openstack/heat-cfnapi-6c687bcb45-wwnml" Oct 14 13:37:06.839540 master-2 
kubenswrapper[4762]: I1014 13:37:06.838061 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7491603-0a38-4539-b476-ba9a70d5ff86-config-data-custom\") pod \"heat-cfnapi-6c687bcb45-wwnml\" (UID: \"a7491603-0a38-4539-b476-ba9a70d5ff86\") " pod="openstack/heat-cfnapi-6c687bcb45-wwnml" Oct 14 13:37:06.845094 master-2 kubenswrapper[4762]: I1014 13:37:06.845041 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7491603-0a38-4539-b476-ba9a70d5ff86-internal-tls-certs\") pod \"heat-cfnapi-6c687bcb45-wwnml\" (UID: \"a7491603-0a38-4539-b476-ba9a70d5ff86\") " pod="openstack/heat-cfnapi-6c687bcb45-wwnml" Oct 14 13:37:06.845570 master-2 kubenswrapper[4762]: I1014 13:37:06.845389 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7491603-0a38-4539-b476-ba9a70d5ff86-combined-ca-bundle\") pod \"heat-cfnapi-6c687bcb45-wwnml\" (UID: \"a7491603-0a38-4539-b476-ba9a70d5ff86\") " pod="openstack/heat-cfnapi-6c687bcb45-wwnml" Oct 14 13:37:06.846213 master-2 kubenswrapper[4762]: I1014 13:37:06.846169 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7491603-0a38-4539-b476-ba9a70d5ff86-config-data\") pod \"heat-cfnapi-6c687bcb45-wwnml\" (UID: \"a7491603-0a38-4539-b476-ba9a70d5ff86\") " pod="openstack/heat-cfnapi-6c687bcb45-wwnml" Oct 14 13:37:06.846650 master-2 kubenswrapper[4762]: I1014 13:37:06.846577 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7491603-0a38-4539-b476-ba9a70d5ff86-public-tls-certs\") pod \"heat-cfnapi-6c687bcb45-wwnml\" (UID: \"a7491603-0a38-4539-b476-ba9a70d5ff86\") " pod="openstack/heat-cfnapi-6c687bcb45-wwnml" Oct 14 13:37:06.850180 master-2 kubenswrapper[4762]: I1014 13:37:06.850117 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7491603-0a38-4539-b476-ba9a70d5ff86-config-data-custom\") pod \"heat-cfnapi-6c687bcb45-wwnml\" (UID: \"a7491603-0a38-4539-b476-ba9a70d5ff86\") " pod="openstack/heat-cfnapi-6c687bcb45-wwnml" Oct 14 13:37:06.870100 master-2 kubenswrapper[4762]: I1014 13:37:06.869997 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdhbp\" (UniqueName: \"kubernetes.io/projected/a7491603-0a38-4539-b476-ba9a70d5ff86-kube-api-access-hdhbp\") pod \"heat-cfnapi-6c687bcb45-wwnml\" (UID: \"a7491603-0a38-4539-b476-ba9a70d5ff86\") " pod="openstack/heat-cfnapi-6c687bcb45-wwnml" Oct 14 13:37:06.968265 master-2 kubenswrapper[4762]: I1014 13:37:06.968032 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-6c687bcb45-wwnml" Oct 14 13:37:08.861454 master-2 kubenswrapper[4762]: I1014 13:37:08.861402 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-787cbbf4dc-r2lgm"] Oct 14 13:37:08.862680 master-2 kubenswrapper[4762]: I1014 13:37:08.861700 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-787cbbf4dc-r2lgm" podUID="3b8400ee-6062-47fd-b45c-bc82fbdc92cd" containerName="dnsmasq-dns" containerID="cri-o://e0f1acf8e74c67b52d78ce1b9f7d41c464bee5c142ef00e5bb5cef8bcd42d9a5" gracePeriod=10 Oct 14 13:37:09.017356 master-2 kubenswrapper[4762]: I1014 13:37:09.017287 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-46645-default-external-api-1" Oct 14 13:37:09.017356 master-2 kubenswrapper[4762]: I1014 13:37:09.017341 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-46645-default-external-api-1" Oct 14 13:37:09.017356 master-2 kubenswrapper[4762]: I1014 13:37:09.017352 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-46645-default-external-api-1" Oct 14 13:37:09.017356 master-2 kubenswrapper[4762]: I1014 13:37:09.017363 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-46645-default-external-api-1" Oct 14 13:37:09.043685 master-2 kubenswrapper[4762]: I1014 13:37:09.043633 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-46645-default-external-api-1" Oct 14 13:37:09.064237 master-2 kubenswrapper[4762]: I1014 13:37:09.064200 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-46645-default-external-api-1" Oct 14 13:37:09.401761 master-2 kubenswrapper[4762]: I1014 13:37:09.401713 4762 generic.go:334] "Generic (PLEG): container finished" podID="3b8400ee-6062-47fd-b45c-bc82fbdc92cd" containerID="e0f1acf8e74c67b52d78ce1b9f7d41c464bee5c142ef00e5bb5cef8bcd42d9a5" exitCode=0 Oct 14 13:37:09.402046 master-2 kubenswrapper[4762]: I1014 13:37:09.402015 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-787cbbf4dc-r2lgm" event={"ID":"3b8400ee-6062-47fd-b45c-bc82fbdc92cd","Type":"ContainerDied","Data":"e0f1acf8e74c67b52d78ce1b9f7d41c464bee5c142ef00e5bb5cef8bcd42d9a5"} Oct 14 13:37:10.377789 master-2 kubenswrapper[4762]: I1014 13:37:10.377671 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:37:10.378373 master-2 kubenswrapper[4762]: I1014 13:37:10.377925 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:37:10.378373 master-2 kubenswrapper[4762]: I1014 13:37:10.377999 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:37:10.378843 master-2 kubenswrapper[4762]: I1014 13:37:10.378010 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:37:10.405342 master-2 kubenswrapper[4762]: I1014 13:37:10.405294 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:37:10.429750 master-2 kubenswrapper[4762]: I1014 13:37:10.429693 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:37:11.196440 master-2 kubenswrapper[4762]: W1014 13:37:11.196384 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f99144b_486d_4a4b_8b68_294fd70a0f49.slice/crio-bcc28ada1134263390475214aa27377647298bc742bb11b35c3be3cb6ca56023 WatchSource:0}: Error finding container bcc28ada1134263390475214aa27377647298bc742bb11b35c3be3cb6ca56023: Status 404 returned error can't find the container with id bcc28ada1134263390475214aa27377647298bc742bb11b35c3be3cb6ca56023 Oct 14 13:37:11.433210 master-2 kubenswrapper[4762]: I1014 13:37:11.433131 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6b955f9d4b-jsrg8" event={"ID":"3f99144b-486d-4a4b-8b68-294fd70a0f49","Type":"ContainerStarted","Data":"bcc28ada1134263390475214aa27377647298bc742bb11b35c3be3cb6ca56023"} Oct 14 13:37:11.511888 master-2 kubenswrapper[4762]: I1014 13:37:11.511726 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-46645-default-external-api-1" Oct 14 13:37:11.511888 master-2 kubenswrapper[4762]: I1014 13:37:11.511836 4762 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 14 13:37:11.544527 master-2 kubenswrapper[4762]: I1014 13:37:11.544477 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-787cbbf4dc-r2lgm" Oct 14 13:37:11.654249 master-2 kubenswrapper[4762]: I1014 13:37:11.654099 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-46645-default-external-api-1" Oct 14 13:37:11.679063 master-2 kubenswrapper[4762]: I1014 13:37:11.678935 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b8400ee-6062-47fd-b45c-bc82fbdc92cd-ovsdbserver-sb\") pod \"3b8400ee-6062-47fd-b45c-bc82fbdc92cd\" (UID: \"3b8400ee-6062-47fd-b45c-bc82fbdc92cd\") " Oct 14 13:37:11.679300 master-2 kubenswrapper[4762]: I1014 13:37:11.679185 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b8400ee-6062-47fd-b45c-bc82fbdc92cd-dns-svc\") pod \"3b8400ee-6062-47fd-b45c-bc82fbdc92cd\" (UID: \"3b8400ee-6062-47fd-b45c-bc82fbdc92cd\") " Oct 14 13:37:11.679339 master-2 kubenswrapper[4762]: I1014 13:37:11.679301 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b8400ee-6062-47fd-b45c-bc82fbdc92cd-ovsdbserver-nb\") pod \"3b8400ee-6062-47fd-b45c-bc82fbdc92cd\" (UID: \"3b8400ee-6062-47fd-b45c-bc82fbdc92cd\") " Oct 14 13:37:11.679374 master-2 kubenswrapper[4762]: I1014 13:37:11.679342 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b8400ee-6062-47fd-b45c-bc82fbdc92cd-config\") pod \"3b8400ee-6062-47fd-b45c-bc82fbdc92cd\" (UID: \"3b8400ee-6062-47fd-b45c-bc82fbdc92cd\") " Oct 14 13:37:11.679436 master-2 kubenswrapper[4762]: I1014 13:37:11.679404 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3b8400ee-6062-47fd-b45c-bc82fbdc92cd-dns-swift-storage-0\") pod \"3b8400ee-6062-47fd-b45c-bc82fbdc92cd\" (UID: \"3b8400ee-6062-47fd-b45c-bc82fbdc92cd\") " Oct 14 13:37:11.679541 master-2 
kubenswrapper[4762]: I1014 13:37:11.679510 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8z98\" (UniqueName: \"kubernetes.io/projected/3b8400ee-6062-47fd-b45c-bc82fbdc92cd-kube-api-access-k8z98\") pod \"3b8400ee-6062-47fd-b45c-bc82fbdc92cd\" (UID: \"3b8400ee-6062-47fd-b45c-bc82fbdc92cd\") " Oct 14 13:37:11.709102 master-2 kubenswrapper[4762]: I1014 13:37:11.708952 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b8400ee-6062-47fd-b45c-bc82fbdc92cd-kube-api-access-k8z98" (OuterVolumeSpecName: "kube-api-access-k8z98") pod "3b8400ee-6062-47fd-b45c-bc82fbdc92cd" (UID: "3b8400ee-6062-47fd-b45c-bc82fbdc92cd"). InnerVolumeSpecName "kube-api-access-k8z98". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:37:11.784722 master-2 kubenswrapper[4762]: I1014 13:37:11.784541 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8z98\" (UniqueName: \"kubernetes.io/projected/3b8400ee-6062-47fd-b45c-bc82fbdc92cd-kube-api-access-k8z98\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:11.789835 master-2 kubenswrapper[4762]: I1014 13:37:11.789776 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b8400ee-6062-47fd-b45c-bc82fbdc92cd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3b8400ee-6062-47fd-b45c-bc82fbdc92cd" (UID: "3b8400ee-6062-47fd-b45c-bc82fbdc92cd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:37:11.801647 master-2 kubenswrapper[4762]: I1014 13:37:11.801595 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b8400ee-6062-47fd-b45c-bc82fbdc92cd-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3b8400ee-6062-47fd-b45c-bc82fbdc92cd" (UID: "3b8400ee-6062-47fd-b45c-bc82fbdc92cd"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:37:11.816730 master-2 kubenswrapper[4762]: I1014 13:37:11.816673 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b8400ee-6062-47fd-b45c-bc82fbdc92cd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3b8400ee-6062-47fd-b45c-bc82fbdc92cd" (UID: "3b8400ee-6062-47fd-b45c-bc82fbdc92cd"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:37:11.820539 master-2 kubenswrapper[4762]: I1014 13:37:11.820312 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b8400ee-6062-47fd-b45c-bc82fbdc92cd-config" (OuterVolumeSpecName: "config") pod "3b8400ee-6062-47fd-b45c-bc82fbdc92cd" (UID: "3b8400ee-6062-47fd-b45c-bc82fbdc92cd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:37:11.820779 master-2 kubenswrapper[4762]: I1014 13:37:11.820627 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b8400ee-6062-47fd-b45c-bc82fbdc92cd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3b8400ee-6062-47fd-b45c-bc82fbdc92cd" (UID: "3b8400ee-6062-47fd-b45c-bc82fbdc92cd"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:37:11.833553 master-2 kubenswrapper[4762]: I1014 13:37:11.832990 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6c687bcb45-wwnml"] Oct 14 13:37:11.888213 master-2 kubenswrapper[4762]: I1014 13:37:11.887654 4762 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b8400ee-6062-47fd-b45c-bc82fbdc92cd-ovsdbserver-sb\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:11.888213 master-2 kubenswrapper[4762]: I1014 13:37:11.887695 4762 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b8400ee-6062-47fd-b45c-bc82fbdc92cd-dns-svc\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:11.888213 master-2 kubenswrapper[4762]: I1014 13:37:11.887718 4762 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b8400ee-6062-47fd-b45c-bc82fbdc92cd-ovsdbserver-nb\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:11.888213 master-2 kubenswrapper[4762]: I1014 13:37:11.887732 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b8400ee-6062-47fd-b45c-bc82fbdc92cd-config\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:11.888213 master-2 kubenswrapper[4762]: I1014 13:37:11.887777 4762 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3b8400ee-6062-47fd-b45c-bc82fbdc92cd-dns-swift-storage-0\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:12.443559 master-2 kubenswrapper[4762]: I1014 13:37:12.443339 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6c687bcb45-wwnml" event={"ID":"a7491603-0a38-4539-b476-ba9a70d5ff86","Type":"ContainerStarted","Data":"32a639f764c8eda6a8552e06ff8a450348b7b25759c80c4c3aa627c36e942f36"} Oct 14 13:37:12.443559 master-2 kubenswrapper[4762]: I1014 13:37:12.443411 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6c687bcb45-wwnml" event={"ID":"a7491603-0a38-4539-b476-ba9a70d5ff86","Type":"ContainerStarted","Data":"c726f0c58eea35d891fb2b45c33a93193e7679decf3cf80e1e95cbc620eadac0"} Oct 14 13:37:12.443559 master-2 kubenswrapper[4762]: I1014 13:37:12.443513 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-6c687bcb45-wwnml" Oct 14 13:37:12.445234 master-2 kubenswrapper[4762]: I1014 13:37:12.445200 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6b955f9d4b-jsrg8" event={"ID":"3f99144b-486d-4a4b-8b68-294fd70a0f49","Type":"ContainerStarted","Data":"befa664b765526889333b795d5abbd717fd8839173e44a9ce776b63dc8a4e8b1"} Oct 14 13:37:12.445915 master-2 kubenswrapper[4762]: I1014 13:37:12.445689 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-6b955f9d4b-jsrg8" Oct 14 13:37:12.448380 master-2 kubenswrapper[4762]: I1014 13:37:12.448080 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-787cbbf4dc-r2lgm" event={"ID":"3b8400ee-6062-47fd-b45c-bc82fbdc92cd","Type":"ContainerDied","Data":"fa5aa75e15b64a833142483e0933d218d662a135a96029d33b71d535304168e5"} Oct 14 13:37:12.448380 master-2 kubenswrapper[4762]: I1014 13:37:12.448125 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-787cbbf4dc-r2lgm" Oct 14 13:37:12.448380 master-2 kubenswrapper[4762]: I1014 13:37:12.448125 4762 scope.go:117] "RemoveContainer" containerID="e0f1acf8e74c67b52d78ce1b9f7d41c464bee5c142ef00e5bb5cef8bcd42d9a5" Oct 14 13:37:12.451384 master-2 kubenswrapper[4762]: I1014 13:37:12.451357 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-588df4cd6f-vxqld" event={"ID":"b08db4eb-a86d-436c-9251-0326ff980d24","Type":"ContainerStarted","Data":"b07c87a64f1c8282ad66e638726d108541926286d29d7a702fcfc3b135bbb347"} Oct 14 13:37:12.451543 master-2 kubenswrapper[4762]: I1014 13:37:12.451504 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-588df4cd6f-vxqld" podUID="b08db4eb-a86d-436c-9251-0326ff980d24" containerName="heat-cfnapi" containerID="cri-o://b07c87a64f1c8282ad66e638726d108541926286d29d7a702fcfc3b135bbb347" gracePeriod=60 Oct 14 13:37:12.451730 master-2 kubenswrapper[4762]: I1014 13:37:12.451712 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-588df4cd6f-vxqld" Oct 14 13:37:12.463361 master-2 kubenswrapper[4762]: I1014 13:37:12.463330 4762 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 14 13:37:12.464091 master-2 kubenswrapper[4762]: I1014 13:37:12.464027 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3c4cd0ac-29b3-4797-b65f-d720828842b9","Type":"ContainerStarted","Data":"50661cab5dbf050256a4ae991fd1529bbef062da60be5c528b06774f4d43f623"} Oct 14 13:37:12.485710 master-2 kubenswrapper[4762]: I1014 13:37:12.483984 4762 scope.go:117] "RemoveContainer" containerID="7f52e6b67cea1c91bd8d3f43ce53c9e240c33f16b685a14076364cdf4dca23da" Oct 14 13:37:12.551404 master-2 kubenswrapper[4762]: I1014 13:37:12.551329 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-57bb6bd49-t2th5" Oct 14 13:37:12.563413 master-2 kubenswrapper[4762]: I1014 13:37:12.563334 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-6c687bcb45-wwnml" podStartSLOduration=6.563305693 podStartE2EDuration="6.563305693s" podCreationTimestamp="2025-10-14 13:37:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:37:12.560491996 +0000 UTC m=+1861.804651155" watchObservedRunningTime="2025-10-14 13:37:12.563305693 +0000 UTC m=+1861.807464852" Oct 14 13:37:12.707691 master-2 kubenswrapper[4762]: I1014 13:37:12.707633 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:37:13.116802 master-2 kubenswrapper[4762]: I1014 13:37:13.116583 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-588df4cd6f-vxqld" podStartSLOduration=6.437818913 podStartE2EDuration="15.116530701s" podCreationTimestamp="2025-10-14 13:36:58 +0000 UTC" firstStartedPulling="2025-10-14 13:37:02.611179065 +0000 UTC m=+1851.855338224" lastFinishedPulling="2025-10-14 13:37:11.289890853 +0000 UTC m=+1860.534050012" observedRunningTime="2025-10-14 13:37:12.871097479 +0000 UTC m=+1862.115256658" watchObservedRunningTime="2025-10-14 13:37:13.116530701 +0000 UTC m=+1862.360689860" Oct 14 13:37:13.132821 master-2 kubenswrapper[4762]: I1014 13:37:13.132747 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/neutron-55c4fcb4cb-kwz75"] Oct 14 13:37:13.133098 master-2 kubenswrapper[4762]: I1014 13:37:13.133055 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-55c4fcb4cb-kwz75" podUID="72401abc-aeab-47ed-98d0-15a765c5fb91" containerName="neutron-api" containerID="cri-o://8374fba6f74a3d2edc7fb9870774183af6268ff6a8cc668f20b43a7d5a867325" gracePeriod=30 Oct 14 13:37:13.133237 master-2 kubenswrapper[4762]: I1014 13:37:13.133131 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-55c4fcb4cb-kwz75" podUID="72401abc-aeab-47ed-98d0-15a765c5fb91" containerName="neutron-httpd" containerID="cri-o://9c0744ca71f8d07f85bd04be3920b608ce1b232f7d9fe4f045b0a94ad2f7307d" gracePeriod=30 Oct 14 13:37:13.205675 master-2 kubenswrapper[4762]: I1014 13:37:13.204765 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-6b955f9d4b-jsrg8" podStartSLOduration=7.5891290179999995 podStartE2EDuration="8.204664268s" podCreationTimestamp="2025-10-14 13:37:05 +0000 UTC" firstStartedPulling="2025-10-14 13:37:11.239497112 +0000 UTC m=+1860.483656271" lastFinishedPulling="2025-10-14 13:37:11.855032362 +0000 UTC m=+1861.099191521" observedRunningTime="2025-10-14 13:37:13.179878806 +0000 UTC m=+1862.424037965" watchObservedRunningTime="2025-10-14 13:37:13.204664268 +0000 UTC m=+1862.448823427" Oct 14 13:37:13.476753 master-2 kubenswrapper[4762]: I1014 13:37:13.476681 4762 generic.go:334] "Generic (PLEG): container finished" podID="3f99144b-486d-4a4b-8b68-294fd70a0f49" containerID="befa664b765526889333b795d5abbd717fd8839173e44a9ce776b63dc8a4e8b1" exitCode=1 Oct 14 13:37:13.476753 master-2 kubenswrapper[4762]: I1014 13:37:13.476756 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6b955f9d4b-jsrg8" event={"ID":"3f99144b-486d-4a4b-8b68-294fd70a0f49","Type":"ContainerDied","Data":"befa664b765526889333b795d5abbd717fd8839173e44a9ce776b63dc8a4e8b1"} Oct 14 13:37:13.477534 master-2 kubenswrapper[4762]: I1014 13:37:13.477467 4762 scope.go:117] "RemoveContainer" containerID="befa664b765526889333b795d5abbd717fd8839173e44a9ce776b63dc8a4e8b1" Oct 14 13:37:13.482596 master-2 kubenswrapper[4762]: I1014 13:37:13.482541 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3c4cd0ac-29b3-4797-b65f-d720828842b9","Type":"ContainerStarted","Data":"4483e2ff394db472ccdfdd94b6e6db85355cb447fd2dd7f12b72b93cf3e7f0ca"} Oct 14 13:37:13.484768 master-2 kubenswrapper[4762]: I1014 13:37:13.484724 4762 generic.go:334] "Generic (PLEG): container finished" podID="72401abc-aeab-47ed-98d0-15a765c5fb91" containerID="9c0744ca71f8d07f85bd04be3920b608ce1b232f7d9fe4f045b0a94ad2f7307d" exitCode=0 Oct 14 13:37:13.485676 master-2 kubenswrapper[4762]: I1014 13:37:13.485643 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-55c4fcb4cb-kwz75" event={"ID":"72401abc-aeab-47ed-98d0-15a765c5fb91","Type":"ContainerDied","Data":"9c0744ca71f8d07f85bd04be3920b608ce1b232f7d9fe4f045b0a94ad2f7307d"} Oct 14 13:37:13.485761 master-2 kubenswrapper[4762]: I1014 13:37:13.485727 4762 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 14 13:37:13.827248 master-2 kubenswrapper[4762]: I1014 13:37:13.826256 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:37:13.852243 master-2 kubenswrapper[4762]: I1014 13:37:13.852116 4762 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-46645-api-0" Oct 14 13:37:14.255758 master-2 kubenswrapper[4762]: I1014 13:37:14.255685 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-787cbbf4dc-r2lgm"] Oct 14 13:37:14.391152 master-2 kubenswrapper[4762]: I1014 13:37:14.391080 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-787cbbf4dc-r2lgm"] Oct 14 13:37:14.506091 master-2 kubenswrapper[4762]: I1014 13:37:14.505926 4762 generic.go:334] "Generic (PLEG): container finished" podID="3f99144b-486d-4a4b-8b68-294fd70a0f49" containerID="bb6eb9a360b2e405d2f01907a03cb8ddb371221578638deaae72596b50b7b11a" exitCode=1 Oct 14 13:37:14.506091 master-2 kubenswrapper[4762]: I1014 13:37:14.505997 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6b955f9d4b-jsrg8" event={"ID":"3f99144b-486d-4a4b-8b68-294fd70a0f49","Type":"ContainerDied","Data":"bb6eb9a360b2e405d2f01907a03cb8ddb371221578638deaae72596b50b7b11a"} Oct 14 13:37:14.506091 master-2 kubenswrapper[4762]: I1014 13:37:14.506091 4762 scope.go:117] "RemoveContainer" containerID="befa664b765526889333b795d5abbd717fd8839173e44a9ce776b63dc8a4e8b1" Oct 14 13:37:14.507135 master-2 kubenswrapper[4762]: I1014 13:37:14.507094 4762 scope.go:117] "RemoveContainer" containerID="bb6eb9a360b2e405d2f01907a03cb8ddb371221578638deaae72596b50b7b11a" Oct 14 13:37:14.508122 master-2 kubenswrapper[4762]: E1014 13:37:14.507488 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-6b955f9d4b-jsrg8_openstack(3f99144b-486d-4a4b-8b68-294fd70a0f49)\"" pod="openstack/heat-cfnapi-6b955f9d4b-jsrg8" podUID="3f99144b-486d-4a4b-8b68-294fd70a0f49" Oct 14 13:37:15.541321 master-2 kubenswrapper[4762]: I1014 13:37:15.539039 4762 scope.go:117] "RemoveContainer" containerID="bb6eb9a360b2e405d2f01907a03cb8ddb371221578638deaae72596b50b7b11a" Oct 14 13:37:15.541321 master-2 kubenswrapper[4762]: E1014 13:37:15.539669 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-6b955f9d4b-jsrg8_openstack(3f99144b-486d-4a4b-8b68-294fd70a0f49)\"" pod="openstack/heat-cfnapi-6b955f9d4b-jsrg8" podUID="3f99144b-486d-4a4b-8b68-294fd70a0f49" Oct 14 13:37:15.569295 master-2 kubenswrapper[4762]: I1014 13:37:15.567618 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b8400ee-6062-47fd-b45c-bc82fbdc92cd" path="/var/lib/kubelet/pods/3b8400ee-6062-47fd-b45c-bc82fbdc92cd/volumes" Oct 14 13:37:15.632639 master-2 kubenswrapper[4762]: I1014 13:37:15.632567 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-6b955f9d4b-jsrg8" Oct 14 13:37:15.632639 master-2 kubenswrapper[4762]: I1014 13:37:15.632630 4762 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-6b955f9d4b-jsrg8" Oct 14 13:37:16.553289 master-2 kubenswrapper[4762]: I1014 13:37:16.553245 4762 scope.go:117] "RemoveContainer" containerID="bb6eb9a360b2e405d2f01907a03cb8ddb371221578638deaae72596b50b7b11a" Oct 14 13:37:16.553839 master-2 kubenswrapper[4762]: E1014 13:37:16.553712 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 
10s restarting failed container=heat-cfnapi pod=heat-cfnapi-6b955f9d4b-jsrg8_openstack(3f99144b-486d-4a4b-8b68-294fd70a0f49)\"" pod="openstack/heat-cfnapi-6b955f9d4b-jsrg8" podUID="3f99144b-486d-4a4b-8b68-294fd70a0f49" Oct 14 13:37:16.554396 master-2 kubenswrapper[4762]: I1014 13:37:16.554365 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3c4cd0ac-29b3-4797-b65f-d720828842b9" containerName="ceilometer-central-agent" containerID="cri-o://0984c84818c75f3808fd0e7926c4cf1db985111b51d46ec3860b1623af674422" gracePeriod=30 Oct 14 13:37:16.554517 master-2 kubenswrapper[4762]: I1014 13:37:16.554493 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3c4cd0ac-29b3-4797-b65f-d720828842b9","Type":"ContainerStarted","Data":"28ffed1fa88123d233be358677a9faba8643c32fbb7a820dad4783cb54ee40ce"} Oct 14 13:37:16.554594 master-2 kubenswrapper[4762]: I1014 13:37:16.554567 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 14 13:37:16.554930 master-2 kubenswrapper[4762]: I1014 13:37:16.554905 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3c4cd0ac-29b3-4797-b65f-d720828842b9" containerName="proxy-httpd" containerID="cri-o://28ffed1fa88123d233be358677a9faba8643c32fbb7a820dad4783cb54ee40ce" gracePeriod=30 Oct 14 13:37:16.555036 master-2 kubenswrapper[4762]: I1014 13:37:16.554985 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3c4cd0ac-29b3-4797-b65f-d720828842b9" containerName="sg-core" containerID="cri-o://4483e2ff394db472ccdfdd94b6e6db85355cb447fd2dd7f12b72b93cf3e7f0ca" gracePeriod=30 Oct 14 13:37:16.555185 master-2 kubenswrapper[4762]: I1014 13:37:16.555070 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3c4cd0ac-29b3-4797-b65f-d720828842b9" containerName="ceilometer-notification-agent" containerID="cri-o://50661cab5dbf050256a4ae991fd1529bbef062da60be5c528b06774f4d43f623" gracePeriod=30 Oct 14 13:37:17.178177 master-2 kubenswrapper[4762]: I1014 13:37:17.178065 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.332932565 podStartE2EDuration="14.178039103s" podCreationTimestamp="2025-10-14 13:37:03 +0000 UTC" firstStartedPulling="2025-10-14 13:37:04.445731989 +0000 UTC m=+1853.689891148" lastFinishedPulling="2025-10-14 13:37:15.290838517 +0000 UTC m=+1864.534997686" observedRunningTime="2025-10-14 13:37:17.152104614 +0000 UTC m=+1866.396263783" watchObservedRunningTime="2025-10-14 13:37:17.178039103 +0000 UTC m=+1866.422198262" Oct 14 13:37:17.569090 master-2 kubenswrapper[4762]: I1014 13:37:17.568954 4762 generic.go:334] "Generic (PLEG): container finished" podID="3c4cd0ac-29b3-4797-b65f-d720828842b9" containerID="28ffed1fa88123d233be358677a9faba8643c32fbb7a820dad4783cb54ee40ce" exitCode=0 Oct 14 13:37:17.569090 master-2 kubenswrapper[4762]: I1014 13:37:17.569008 4762 generic.go:334] "Generic (PLEG): container finished" podID="3c4cd0ac-29b3-4797-b65f-d720828842b9" containerID="4483e2ff394db472ccdfdd94b6e6db85355cb447fd2dd7f12b72b93cf3e7f0ca" exitCode=2 Oct 14 13:37:17.569090 master-2 kubenswrapper[4762]: I1014 13:37:17.569014 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"3c4cd0ac-29b3-4797-b65f-d720828842b9","Type":"ContainerDied","Data":"28ffed1fa88123d233be358677a9faba8643c32fbb7a820dad4783cb54ee40ce"} Oct 14 13:37:17.569090 master-2 kubenswrapper[4762]: I1014 13:37:17.569018 4762 generic.go:334] "Generic (PLEG): container finished" podID="3c4cd0ac-29b3-4797-b65f-d720828842b9" containerID="50661cab5dbf050256a4ae991fd1529bbef062da60be5c528b06774f4d43f623" exitCode=0 Oct 14 13:37:17.569090 master-2 kubenswrapper[4762]: I1014 13:37:17.569041 4762 generic.go:334] "Generic (PLEG): container finished" podID="3c4cd0ac-29b3-4797-b65f-d720828842b9" containerID="0984c84818c75f3808fd0e7926c4cf1db985111b51d46ec3860b1623af674422" exitCode=0 Oct 14 13:37:17.569090 master-2 kubenswrapper[4762]: I1014 13:37:17.569049 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3c4cd0ac-29b3-4797-b65f-d720828842b9","Type":"ContainerDied","Data":"4483e2ff394db472ccdfdd94b6e6db85355cb447fd2dd7f12b72b93cf3e7f0ca"} Oct 14 13:37:17.569090 master-2 kubenswrapper[4762]: I1014 13:37:17.569059 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3c4cd0ac-29b3-4797-b65f-d720828842b9","Type":"ContainerDied","Data":"50661cab5dbf050256a4ae991fd1529bbef062da60be5c528b06774f4d43f623"} Oct 14 13:37:17.569090 master-2 kubenswrapper[4762]: I1014 13:37:17.569069 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3c4cd0ac-29b3-4797-b65f-d720828842b9","Type":"ContainerDied","Data":"0984c84818c75f3808fd0e7926c4cf1db985111b51d46ec3860b1623af674422"} Oct 14 13:37:18.000702 master-2 kubenswrapper[4762]: I1014 13:37:18.000634 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:37:18.187576 master-2 kubenswrapper[4762]: I1014 13:37:18.187498 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3c4cd0ac-29b3-4797-b65f-d720828842b9-log-httpd\") pod \"3c4cd0ac-29b3-4797-b65f-d720828842b9\" (UID: \"3c4cd0ac-29b3-4797-b65f-d720828842b9\") " Oct 14 13:37:18.187576 master-2 kubenswrapper[4762]: I1014 13:37:18.187573 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c4cd0ac-29b3-4797-b65f-d720828842b9-combined-ca-bundle\") pod \"3c4cd0ac-29b3-4797-b65f-d720828842b9\" (UID: \"3c4cd0ac-29b3-4797-b65f-d720828842b9\") " Oct 14 13:37:18.187885 master-2 kubenswrapper[4762]: I1014 13:37:18.187630 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c4cd0ac-29b3-4797-b65f-d720828842b9-scripts\") pod \"3c4cd0ac-29b3-4797-b65f-d720828842b9\" (UID: \"3c4cd0ac-29b3-4797-b65f-d720828842b9\") " Oct 14 13:37:18.187885 master-2 kubenswrapper[4762]: I1014 13:37:18.187771 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c4cd0ac-29b3-4797-b65f-d720828842b9-config-data\") pod \"3c4cd0ac-29b3-4797-b65f-d720828842b9\" (UID: \"3c4cd0ac-29b3-4797-b65f-d720828842b9\") " Oct 14 13:37:18.187885 master-2 kubenswrapper[4762]: I1014 13:37:18.187840 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3c4cd0ac-29b3-4797-b65f-d720828842b9-sg-core-conf-yaml\") pod \"3c4cd0ac-29b3-4797-b65f-d720828842b9\" (UID: 
\"3c4cd0ac-29b3-4797-b65f-d720828842b9\") " Oct 14 13:37:18.188013 master-2 kubenswrapper[4762]: I1014 13:37:18.187881 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3c4cd0ac-29b3-4797-b65f-d720828842b9-run-httpd\") pod \"3c4cd0ac-29b3-4797-b65f-d720828842b9\" (UID: \"3c4cd0ac-29b3-4797-b65f-d720828842b9\") " Oct 14 13:37:18.188013 master-2 kubenswrapper[4762]: I1014 13:37:18.187921 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dftsg\" (UniqueName: \"kubernetes.io/projected/3c4cd0ac-29b3-4797-b65f-d720828842b9-kube-api-access-dftsg\") pod \"3c4cd0ac-29b3-4797-b65f-d720828842b9\" (UID: \"3c4cd0ac-29b3-4797-b65f-d720828842b9\") " Oct 14 13:37:18.189013 master-2 kubenswrapper[4762]: I1014 13:37:18.188928 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c4cd0ac-29b3-4797-b65f-d720828842b9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3c4cd0ac-29b3-4797-b65f-d720828842b9" (UID: "3c4cd0ac-29b3-4797-b65f-d720828842b9"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:37:18.189013 master-2 kubenswrapper[4762]: I1014 13:37:18.188971 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c4cd0ac-29b3-4797-b65f-d720828842b9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3c4cd0ac-29b3-4797-b65f-d720828842b9" (UID: "3c4cd0ac-29b3-4797-b65f-d720828842b9"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:37:18.194092 master-2 kubenswrapper[4762]: I1014 13:37:18.194047 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c4cd0ac-29b3-4797-b65f-d720828842b9-kube-api-access-dftsg" (OuterVolumeSpecName: "kube-api-access-dftsg") pod "3c4cd0ac-29b3-4797-b65f-d720828842b9" (UID: "3c4cd0ac-29b3-4797-b65f-d720828842b9"). InnerVolumeSpecName "kube-api-access-dftsg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:37:18.195089 master-2 kubenswrapper[4762]: I1014 13:37:18.195015 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c4cd0ac-29b3-4797-b65f-d720828842b9-scripts" (OuterVolumeSpecName: "scripts") pod "3c4cd0ac-29b3-4797-b65f-d720828842b9" (UID: "3c4cd0ac-29b3-4797-b65f-d720828842b9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:37:18.226349 master-2 kubenswrapper[4762]: I1014 13:37:18.226244 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c4cd0ac-29b3-4797-b65f-d720828842b9-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3c4cd0ac-29b3-4797-b65f-d720828842b9" (UID: "3c4cd0ac-29b3-4797-b65f-d720828842b9"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:37:18.291077 master-2 kubenswrapper[4762]: I1014 13:37:18.291030 4762 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3c4cd0ac-29b3-4797-b65f-d720828842b9-sg-core-conf-yaml\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:18.291077 master-2 kubenswrapper[4762]: I1014 13:37:18.291079 4762 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3c4cd0ac-29b3-4797-b65f-d720828842b9-run-httpd\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:18.291415 master-2 kubenswrapper[4762]: I1014 13:37:18.291095 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dftsg\" (UniqueName: \"kubernetes.io/projected/3c4cd0ac-29b3-4797-b65f-d720828842b9-kube-api-access-dftsg\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:18.291415 master-2 kubenswrapper[4762]: I1014 13:37:18.291110 4762 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3c4cd0ac-29b3-4797-b65f-d720828842b9-log-httpd\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:18.291415 master-2 kubenswrapper[4762]: I1014 13:37:18.291124 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c4cd0ac-29b3-4797-b65f-d720828842b9-scripts\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:18.309477 master-2 kubenswrapper[4762]: I1014 13:37:18.309410 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c4cd0ac-29b3-4797-b65f-d720828842b9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3c4cd0ac-29b3-4797-b65f-d720828842b9" (UID: "3c4cd0ac-29b3-4797-b65f-d720828842b9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:37:18.317819 master-2 kubenswrapper[4762]: I1014 13:37:18.317721 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c4cd0ac-29b3-4797-b65f-d720828842b9-config-data" (OuterVolumeSpecName: "config-data") pod "3c4cd0ac-29b3-4797-b65f-d720828842b9" (UID: "3c4cd0ac-29b3-4797-b65f-d720828842b9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:37:18.393119 master-2 kubenswrapper[4762]: I1014 13:37:18.393014 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c4cd0ac-29b3-4797-b65f-d720828842b9-config-data\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:18.393119 master-2 kubenswrapper[4762]: I1014 13:37:18.393086 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c4cd0ac-29b3-4797-b65f-d720828842b9-combined-ca-bundle\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:18.596105 master-2 kubenswrapper[4762]: I1014 13:37:18.596013 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3c4cd0ac-29b3-4797-b65f-d720828842b9","Type":"ContainerDied","Data":"5633215fa77b6696d27ec7a683e6d1b35b1b621d2da3d5afe06485bfd5ea4f2d"} Oct 14 13:37:18.596105 master-2 kubenswrapper[4762]: I1014 13:37:18.596106 4762 scope.go:117] "RemoveContainer" containerID="28ffed1fa88123d233be358677a9faba8643c32fbb7a820dad4783cb54ee40ce" Oct 14 13:37:18.596907 master-2 kubenswrapper[4762]: I1014 13:37:18.596295 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:37:18.617922 master-2 kubenswrapper[4762]: I1014 13:37:18.617846 4762 scope.go:117] "RemoveContainer" containerID="4483e2ff394db472ccdfdd94b6e6db85355cb447fd2dd7f12b72b93cf3e7f0ca" Oct 14 13:37:18.708335 master-2 kubenswrapper[4762]: I1014 13:37:18.708283 4762 scope.go:117] "RemoveContainer" containerID="50661cab5dbf050256a4ae991fd1529bbef062da60be5c528b06774f4d43f623" Oct 14 13:37:18.727963 master-2 kubenswrapper[4762]: I1014 13:37:18.727914 4762 scope.go:117] "RemoveContainer" containerID="0984c84818c75f3808fd0e7926c4cf1db985111b51d46ec3860b1623af674422" Oct 14 13:37:19.559974 master-2 kubenswrapper[4762]: I1014 13:37:19.559920 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-6vt26"] Oct 14 13:37:19.560238 master-2 kubenswrapper[4762]: E1014 13:37:19.560217 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b8400ee-6062-47fd-b45c-bc82fbdc92cd" containerName="dnsmasq-dns" Oct 14 13:37:19.560238 master-2 kubenswrapper[4762]: I1014 13:37:19.560234 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b8400ee-6062-47fd-b45c-bc82fbdc92cd" containerName="dnsmasq-dns" Oct 14 13:37:19.560320 master-2 kubenswrapper[4762]: E1014 13:37:19.560251 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c4cd0ac-29b3-4797-b65f-d720828842b9" containerName="ceilometer-central-agent" Oct 14 13:37:19.560320 master-2 kubenswrapper[4762]: I1014 13:37:19.560257 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c4cd0ac-29b3-4797-b65f-d720828842b9" containerName="ceilometer-central-agent" Oct 14 13:37:19.560320 master-2 kubenswrapper[4762]: E1014 13:37:19.560272 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c4cd0ac-29b3-4797-b65f-d720828842b9" containerName="sg-core" Oct 14 13:37:19.560320 master-2 kubenswrapper[4762]: I1014 13:37:19.560278 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c4cd0ac-29b3-4797-b65f-d720828842b9" containerName="sg-core" Oct 14 13:37:19.560320 master-2 kubenswrapper[4762]: E1014 13:37:19.560290 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c4cd0ac-29b3-4797-b65f-d720828842b9" containerName="proxy-httpd" Oct 14 13:37:19.560320 master-2 kubenswrapper[4762]: I1014 13:37:19.560296 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c4cd0ac-29b3-4797-b65f-d720828842b9" containerName="proxy-httpd" Oct 14 13:37:19.560320 master-2 kubenswrapper[4762]: E1014 13:37:19.560303 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c4cd0ac-29b3-4797-b65f-d720828842b9" containerName="ceilometer-notification-agent" Oct 14 13:37:19.560320 master-2 kubenswrapper[4762]: I1014 13:37:19.560309 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c4cd0ac-29b3-4797-b65f-d720828842b9" containerName="ceilometer-notification-agent" Oct 14 13:37:19.560320 master-2 kubenswrapper[4762]: E1014 13:37:19.560317 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b8400ee-6062-47fd-b45c-bc82fbdc92cd" containerName="init" Oct 14 13:37:19.560320 master-2 kubenswrapper[4762]: I1014 13:37:19.560323 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b8400ee-6062-47fd-b45c-bc82fbdc92cd" containerName="init" Oct 14 13:37:19.560626 master-2 kubenswrapper[4762]: I1014 13:37:19.560434 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c4cd0ac-29b3-4797-b65f-d720828842b9" containerName="ceilometer-central-agent" Oct 14 
13:37:19.560626 master-2 kubenswrapper[4762]: I1014 13:37:19.560453 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b8400ee-6062-47fd-b45c-bc82fbdc92cd" containerName="dnsmasq-dns" Oct 14 13:37:19.560626 master-2 kubenswrapper[4762]: I1014 13:37:19.560463 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c4cd0ac-29b3-4797-b65f-d720828842b9" containerName="ceilometer-notification-agent" Oct 14 13:37:19.560626 master-2 kubenswrapper[4762]: I1014 13:37:19.560472 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c4cd0ac-29b3-4797-b65f-d720828842b9" containerName="proxy-httpd" Oct 14 13:37:19.560626 master-2 kubenswrapper[4762]: I1014 13:37:19.560478 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c4cd0ac-29b3-4797-b65f-d720828842b9" containerName="sg-core" Oct 14 13:37:19.561086 master-2 kubenswrapper[4762]: I1014 13:37:19.561063 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-6vt26" Oct 14 13:37:19.606517 master-2 kubenswrapper[4762]: I1014 13:37:19.606455 4762 generic.go:334] "Generic (PLEG): container finished" podID="72401abc-aeab-47ed-98d0-15a765c5fb91" containerID="8374fba6f74a3d2edc7fb9870774183af6268ff6a8cc668f20b43a7d5a867325" exitCode=0 Oct 14 13:37:19.606517 master-2 kubenswrapper[4762]: I1014 13:37:19.606508 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-55c4fcb4cb-kwz75" event={"ID":"72401abc-aeab-47ed-98d0-15a765c5fb91","Type":"ContainerDied","Data":"8374fba6f74a3d2edc7fb9870774183af6268ff6a8cc668f20b43a7d5a867325"} Oct 14 13:37:19.617985 master-2 kubenswrapper[4762]: I1014 13:37:19.617919 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zlhc\" (UniqueName: \"kubernetes.io/projected/5987f041-7e28-466c-aa62-8901381b8413-kube-api-access-8zlhc\") pod \"nova-cell1-db-create-6vt26\" (UID: \"5987f041-7e28-466c-aa62-8901381b8413\") " pod="openstack/nova-cell1-db-create-6vt26" Oct 14 13:37:19.720558 master-2 kubenswrapper[4762]: I1014 13:37:19.720477 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zlhc\" (UniqueName: \"kubernetes.io/projected/5987f041-7e28-466c-aa62-8901381b8413-kube-api-access-8zlhc\") pod \"nova-cell1-db-create-6vt26\" (UID: \"5987f041-7e28-466c-aa62-8901381b8413\") " pod="openstack/nova-cell1-db-create-6vt26" Oct 14 13:37:19.732072 master-2 kubenswrapper[4762]: I1014 13:37:19.731936 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-6vt26"] Oct 14 13:37:19.834225 master-2 kubenswrapper[4762]: I1014 13:37:19.834056 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:37:19.854288 master-2 kubenswrapper[4762]: I1014 13:37:19.854204 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:37:19.864626 master-2 kubenswrapper[4762]: I1014 13:37:19.864550 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-588df4cd6f-vxqld" Oct 14 13:37:19.978308 master-2 kubenswrapper[4762]: I1014 13:37:19.976392 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:37:19.980606 master-2 kubenswrapper[4762]: I1014 13:37:19.980563 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:37:19.983430 master-2 kubenswrapper[4762]: I1014 13:37:19.983322 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 14 13:37:19.983898 master-2 kubenswrapper[4762]: I1014 13:37:19.983878 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 14 13:37:20.002739 master-2 kubenswrapper[4762]: I1014 13:37:20.002678 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:37:20.040027 master-2 kubenswrapper[4762]: I1014 13:37:20.039922 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be5e0faf-632d-4ec4-b2b9-725e14196910-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"be5e0faf-632d-4ec4-b2b9-725e14196910\") " pod="openstack/ceilometer-0" Oct 14 13:37:20.040027 master-2 kubenswrapper[4762]: I1014 13:37:20.040034 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq9pc\" (UniqueName: \"kubernetes.io/projected/be5e0faf-632d-4ec4-b2b9-725e14196910-kube-api-access-vq9pc\") pod \"ceilometer-0\" (UID: \"be5e0faf-632d-4ec4-b2b9-725e14196910\") " pod="openstack/ceilometer-0" Oct 14 13:37:20.040566 master-2 kubenswrapper[4762]: I1014 13:37:20.040069 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be5e0faf-632d-4ec4-b2b9-725e14196910-config-data\") pod \"ceilometer-0\" (UID: \"be5e0faf-632d-4ec4-b2b9-725e14196910\") " pod="openstack/ceilometer-0" Oct 14 13:37:20.040566 master-2 kubenswrapper[4762]: I1014 13:37:20.040107 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be5e0faf-632d-4ec4-b2b9-725e14196910-log-httpd\") pod \"ceilometer-0\" (UID: \"be5e0faf-632d-4ec4-b2b9-725e14196910\") " pod="openstack/ceilometer-0" Oct 14 13:37:20.040566 master-2 kubenswrapper[4762]: I1014 13:37:20.040135 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be5e0faf-632d-4ec4-b2b9-725e14196910-run-httpd\") pod \"ceilometer-0\" (UID: \"be5e0faf-632d-4ec4-b2b9-725e14196910\") " pod="openstack/ceilometer-0" Oct 14 13:37:20.040566 master-2 kubenswrapper[4762]: I1014 13:37:20.040216 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/be5e0faf-632d-4ec4-b2b9-725e14196910-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"be5e0faf-632d-4ec4-b2b9-725e14196910\") " pod="openstack/ceilometer-0" Oct 14 13:37:20.040566 master-2 kubenswrapper[4762]: I1014 13:37:20.040241 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be5e0faf-632d-4ec4-b2b9-725e14196910-scripts\") pod \"ceilometer-0\" (UID: \"be5e0faf-632d-4ec4-b2b9-725e14196910\") " pod="openstack/ceilometer-0" Oct 14 13:37:20.145928 master-2 kubenswrapper[4762]: I1014 13:37:20.145785 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be5e0faf-632d-4ec4-b2b9-725e14196910-log-httpd\") pod \"ceilometer-0\" (UID: 
\"be5e0faf-632d-4ec4-b2b9-725e14196910\") " pod="openstack/ceilometer-0" Oct 14 13:37:20.145928 master-2 kubenswrapper[4762]: I1014 13:37:20.145871 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be5e0faf-632d-4ec4-b2b9-725e14196910-run-httpd\") pod \"ceilometer-0\" (UID: \"be5e0faf-632d-4ec4-b2b9-725e14196910\") " pod="openstack/ceilometer-0" Oct 14 13:37:20.146243 master-2 kubenswrapper[4762]: I1014 13:37:20.145967 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/be5e0faf-632d-4ec4-b2b9-725e14196910-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"be5e0faf-632d-4ec4-b2b9-725e14196910\") " pod="openstack/ceilometer-0" Oct 14 13:37:20.146243 master-2 kubenswrapper[4762]: I1014 13:37:20.145998 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be5e0faf-632d-4ec4-b2b9-725e14196910-scripts\") pod \"ceilometer-0\" (UID: \"be5e0faf-632d-4ec4-b2b9-725e14196910\") " pod="openstack/ceilometer-0" Oct 14 13:37:20.146243 master-2 kubenswrapper[4762]: I1014 13:37:20.146075 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be5e0faf-632d-4ec4-b2b9-725e14196910-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"be5e0faf-632d-4ec4-b2b9-725e14196910\") " pod="openstack/ceilometer-0" Oct 14 13:37:20.146377 master-2 kubenswrapper[4762]: I1014 13:37:20.146317 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be5e0faf-632d-4ec4-b2b9-725e14196910-log-httpd\") pod \"ceilometer-0\" (UID: \"be5e0faf-632d-4ec4-b2b9-725e14196910\") " pod="openstack/ceilometer-0" Oct 14 13:37:20.146604 master-2 kubenswrapper[4762]: I1014 13:37:20.146542 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be5e0faf-632d-4ec4-b2b9-725e14196910-run-httpd\") pod \"ceilometer-0\" (UID: \"be5e0faf-632d-4ec4-b2b9-725e14196910\") " pod="openstack/ceilometer-0" Oct 14 13:37:20.148647 master-2 kubenswrapper[4762]: I1014 13:37:20.148613 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vq9pc\" (UniqueName: \"kubernetes.io/projected/be5e0faf-632d-4ec4-b2b9-725e14196910-kube-api-access-vq9pc\") pod \"ceilometer-0\" (UID: \"be5e0faf-632d-4ec4-b2b9-725e14196910\") " pod="openstack/ceilometer-0" Oct 14 13:37:20.151852 master-2 kubenswrapper[4762]: I1014 13:37:20.151813 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be5e0faf-632d-4ec4-b2b9-725e14196910-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"be5e0faf-632d-4ec4-b2b9-725e14196910\") " pod="openstack/ceilometer-0" Oct 14 13:37:20.152107 master-2 kubenswrapper[4762]: I1014 13:37:20.152073 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be5e0faf-632d-4ec4-b2b9-725e14196910-scripts\") pod \"ceilometer-0\" (UID: \"be5e0faf-632d-4ec4-b2b9-725e14196910\") " pod="openstack/ceilometer-0" Oct 14 13:37:20.153082 master-2 kubenswrapper[4762]: I1014 13:37:20.153044 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/be5e0faf-632d-4ec4-b2b9-725e14196910-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"be5e0faf-632d-4ec4-b2b9-725e14196910\") " pod="openstack/ceilometer-0" Oct 14 13:37:20.153836 master-2 kubenswrapper[4762]: I1014 13:37:20.153774 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be5e0faf-632d-4ec4-b2b9-725e14196910-config-data\") pod \"ceilometer-0\" (UID: \"be5e0faf-632d-4ec4-b2b9-725e14196910\") " pod="openstack/ceilometer-0" Oct 14 13:37:20.158794 master-2 kubenswrapper[4762]: I1014 13:37:20.158740 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be5e0faf-632d-4ec4-b2b9-725e14196910-config-data\") pod \"ceilometer-0\" (UID: \"be5e0faf-632d-4ec4-b2b9-725e14196910\") " pod="openstack/ceilometer-0" Oct 14 13:37:20.164646 master-2 kubenswrapper[4762]: I1014 13:37:20.164590 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zlhc\" (UniqueName: \"kubernetes.io/projected/5987f041-7e28-466c-aa62-8901381b8413-kube-api-access-8zlhc\") pod \"nova-cell1-db-create-6vt26\" (UID: \"5987f041-7e28-466c-aa62-8901381b8413\") " pod="openstack/nova-cell1-db-create-6vt26" Oct 14 13:37:20.175867 master-2 kubenswrapper[4762]: I1014 13:37:20.175785 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-6vt26" Oct 14 13:37:20.177777 master-2 kubenswrapper[4762]: I1014 13:37:20.177733 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq9pc\" (UniqueName: \"kubernetes.io/projected/be5e0faf-632d-4ec4-b2b9-725e14196910-kube-api-access-vq9pc\") pod \"ceilometer-0\" (UID: \"be5e0faf-632d-4ec4-b2b9-725e14196910\") " pod="openstack/ceilometer-0" Oct 14 13:37:20.309238 master-2 kubenswrapper[4762]: I1014 13:37:20.308753 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:37:20.548719 master-2 kubenswrapper[4762]: I1014 13:37:20.548658 4762 scope.go:117] "RemoveContainer" containerID="8993a79cfedb844ceff8ab779453b078ad7c7447950e766745aeda08206b595f" Oct 14 13:37:20.548947 master-2 kubenswrapper[4762]: E1014 13:37:20.548889 4762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-neutron-agent\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ironic-neutron-agent pod=ironic-neutron-agent-c7795fc9c-45w5w_openstack(3664024d-9ed9-48d5-9943-260774564949)\"" pod="openstack/ironic-neutron-agent-c7795fc9c-45w5w" podUID="3664024d-9ed9-48d5-9943-260774564949" Oct 14 13:37:21.556851 master-2 kubenswrapper[4762]: I1014 13:37:21.556793 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c4cd0ac-29b3-4797-b65f-d720828842b9" path="/var/lib/kubelet/pods/3c4cd0ac-29b3-4797-b65f-d720828842b9/volumes" Oct 14 13:37:22.852744 master-2 kubenswrapper[4762]: I1014 13:37:22.852682 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-55c4fcb4cb-kwz75" Oct 14 13:37:22.906695 master-2 kubenswrapper[4762]: I1014 13:37:22.906651 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/72401abc-aeab-47ed-98d0-15a765c5fb91-ovndb-tls-certs\") pod \"72401abc-aeab-47ed-98d0-15a765c5fb91\" (UID: \"72401abc-aeab-47ed-98d0-15a765c5fb91\") " Oct 14 13:37:22.906830 master-2 kubenswrapper[4762]: I1014 13:37:22.906795 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/72401abc-aeab-47ed-98d0-15a765c5fb91-config\") pod \"72401abc-aeab-47ed-98d0-15a765c5fb91\" (UID: \"72401abc-aeab-47ed-98d0-15a765c5fb91\") " Oct 14 13:37:22.906830 master-2 kubenswrapper[4762]: I1014 13:37:22.906816 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/72401abc-aeab-47ed-98d0-15a765c5fb91-httpd-config\") pod \"72401abc-aeab-47ed-98d0-15a765c5fb91\" (UID: \"72401abc-aeab-47ed-98d0-15a765c5fb91\") " Oct 14 13:37:22.906909 master-2 kubenswrapper[4762]: I1014 13:37:22.906839 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wrmk\" (UniqueName: \"kubernetes.io/projected/72401abc-aeab-47ed-98d0-15a765c5fb91-kube-api-access-2wrmk\") pod \"72401abc-aeab-47ed-98d0-15a765c5fb91\" (UID: \"72401abc-aeab-47ed-98d0-15a765c5fb91\") " Oct 14 13:37:22.906948 master-2 kubenswrapper[4762]: I1014 13:37:22.906909 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72401abc-aeab-47ed-98d0-15a765c5fb91-combined-ca-bundle\") pod \"72401abc-aeab-47ed-98d0-15a765c5fb91\" (UID: \"72401abc-aeab-47ed-98d0-15a765c5fb91\") " Oct 14 13:37:22.910166 master-2 kubenswrapper[4762]: I1014 13:37:22.910120 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72401abc-aeab-47ed-98d0-15a765c5fb91-kube-api-access-2wrmk" (OuterVolumeSpecName: "kube-api-access-2wrmk") pod "72401abc-aeab-47ed-98d0-15a765c5fb91" (UID: "72401abc-aeab-47ed-98d0-15a765c5fb91"). InnerVolumeSpecName "kube-api-access-2wrmk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:37:22.911590 master-2 kubenswrapper[4762]: I1014 13:37:22.911518 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72401abc-aeab-47ed-98d0-15a765c5fb91-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "72401abc-aeab-47ed-98d0-15a765c5fb91" (UID: "72401abc-aeab-47ed-98d0-15a765c5fb91"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:37:22.991065 master-2 kubenswrapper[4762]: I1014 13:37:22.988236 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72401abc-aeab-47ed-98d0-15a765c5fb91-config" (OuterVolumeSpecName: "config") pod "72401abc-aeab-47ed-98d0-15a765c5fb91" (UID: "72401abc-aeab-47ed-98d0-15a765c5fb91"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:37:22.997711 master-2 kubenswrapper[4762]: I1014 13:37:22.997657 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72401abc-aeab-47ed-98d0-15a765c5fb91-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "72401abc-aeab-47ed-98d0-15a765c5fb91" (UID: "72401abc-aeab-47ed-98d0-15a765c5fb91"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:37:23.009019 master-2 kubenswrapper[4762]: I1014 13:37:23.008978 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72401abc-aeab-47ed-98d0-15a765c5fb91-combined-ca-bundle\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:23.009019 master-2 kubenswrapper[4762]: I1014 13:37:23.009015 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/72401abc-aeab-47ed-98d0-15a765c5fb91-config\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:23.009119 master-2 kubenswrapper[4762]: I1014 13:37:23.009026 4762 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/72401abc-aeab-47ed-98d0-15a765c5fb91-httpd-config\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:23.009119 master-2 kubenswrapper[4762]: I1014 13:37:23.009036 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wrmk\" (UniqueName: \"kubernetes.io/projected/72401abc-aeab-47ed-98d0-15a765c5fb91-kube-api-access-2wrmk\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:23.028501 master-2 kubenswrapper[4762]: I1014 13:37:23.026137 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72401abc-aeab-47ed-98d0-15a765c5fb91-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "72401abc-aeab-47ed-98d0-15a765c5fb91" (UID: "72401abc-aeab-47ed-98d0-15a765c5fb91"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:37:23.118417 master-2 kubenswrapper[4762]: I1014 13:37:23.117810 4762 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/72401abc-aeab-47ed-98d0-15a765c5fb91-ovndb-tls-certs\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:23.354855 master-2 kubenswrapper[4762]: I1014 13:37:23.354801 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:37:23.363184 master-2 kubenswrapper[4762]: W1014 13:37:23.363102 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe5e0faf_632d_4ec4_b2b9_725e14196910.slice/crio-77d5d17cdf70d8c5624b4403fd567db3cad57e283e6714120250a671856891db WatchSource:0}: Error finding container 77d5d17cdf70d8c5624b4403fd567db3cad57e283e6714120250a671856891db: Status 404 returned error can't find the container with id 77d5d17cdf70d8c5624b4403fd567db3cad57e283e6714120250a671856891db Oct 14 13:37:23.414328 master-2 kubenswrapper[4762]: I1014 13:37:23.414289 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-6vt26"] Oct 14 13:37:23.419943 master-2 kubenswrapper[4762]: W1014 13:37:23.419902 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5987f041_7e28_466c_aa62_8901381b8413.slice/crio-f34c9b0d1ddaea667754544c2f4d62a117e9628fa0df034c0ea66b79d9111fe6 WatchSource:0}: Error finding container f34c9b0d1ddaea667754544c2f4d62a117e9628fa0df034c0ea66b79d9111fe6: Status 404 returned error can't find the container with id f34c9b0d1ddaea667754544c2f4d62a117e9628fa0df034c0ea66b79d9111fe6 Oct 14 13:37:23.466597 master-2 kubenswrapper[4762]: I1014 13:37:23.465753 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-6c687bcb45-wwnml" Oct 14 13:37:23.653064 master-2 kubenswrapper[4762]: I1014 13:37:23.653022 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-55c4fcb4cb-kwz75" event={"ID":"72401abc-aeab-47ed-98d0-15a765c5fb91","Type":"ContainerDied","Data":"bbedd81147364d4b5f695d3a3a7f359f7004558927cd86cd501d6152757fc822"} Oct 14 13:37:23.653194 master-2 kubenswrapper[4762]: I1014 13:37:23.653082 4762 scope.go:117] "RemoveContainer" containerID="9c0744ca71f8d07f85bd04be3920b608ce1b232f7d9fe4f045b0a94ad2f7307d" Oct 14 13:37:23.653352 master-2 kubenswrapper[4762]: I1014 13:37:23.653235 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-55c4fcb4cb-kwz75" Oct 14 13:37:23.655044 master-2 kubenswrapper[4762]: I1014 13:37:23.654984 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-6vt26" event={"ID":"5987f041-7e28-466c-aa62-8901381b8413","Type":"ContainerStarted","Data":"1fb4cc4c07747df4f00ccf2044b34f83fe35dc07577f4c8e9e6deebe643d2cd1"} Oct 14 13:37:23.655138 master-2 kubenswrapper[4762]: I1014 13:37:23.655056 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-6vt26" event={"ID":"5987f041-7e28-466c-aa62-8901381b8413","Type":"ContainerStarted","Data":"f34c9b0d1ddaea667754544c2f4d62a117e9628fa0df034c0ea66b79d9111fe6"} Oct 14 13:37:23.657510 master-2 kubenswrapper[4762]: I1014 13:37:23.657455 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"d0d4da85-9147-43ff-b96c-81e9d3fffd69","Type":"ContainerStarted","Data":"7c27c6d39f83f2f52d8377ded47b8c3676fd7fdb5632373346de8aa0a95a8345"} Oct 14 13:37:23.658349 master-2 kubenswrapper[4762]: I1014 13:37:23.658319 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"be5e0faf-632d-4ec4-b2b9-725e14196910","Type":"ContainerStarted","Data":"77d5d17cdf70d8c5624b4403fd567db3cad57e283e6714120250a671856891db"} Oct 14 13:37:23.727555 master-2 kubenswrapper[4762]: I1014 13:37:23.727496 4762 scope.go:117] "RemoveContainer" containerID="8374fba6f74a3d2edc7fb9870774183af6268ff6a8cc668f20b43a7d5a867325" Oct 14 13:37:24.552001 master-2 kubenswrapper[4762]: I1014 13:37:24.551835 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-6vt26" podStartSLOduration=6.551804429 podStartE2EDuration="6.551804429s" podCreationTimestamp="2025-10-14 13:37:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:37:24.545577954 +0000 UTC m=+1873.789737113" watchObservedRunningTime="2025-10-14 13:37:24.551804429 +0000 UTC m=+1873.795963588" Oct 14 13:37:24.693830 master-2 kubenswrapper[4762]: I1014 13:37:24.693759 4762 generic.go:334] "Generic (PLEG): container finished" podID="5987f041-7e28-466c-aa62-8901381b8413" containerID="1fb4cc4c07747df4f00ccf2044b34f83fe35dc07577f4c8e9e6deebe643d2cd1" exitCode=0 Oct 14 13:37:24.694497 master-2 kubenswrapper[4762]: I1014 13:37:24.694130 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-6vt26" event={"ID":"5987f041-7e28-466c-aa62-8901381b8413","Type":"ContainerDied","Data":"1fb4cc4c07747df4f00ccf2044b34f83fe35dc07577f4c8e9e6deebe643d2cd1"} Oct 14 13:37:24.699189 master-2 kubenswrapper[4762]: I1014 13:37:24.699106 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"be5e0faf-632d-4ec4-b2b9-725e14196910","Type":"ContainerStarted","Data":"2b5b90dd8628826ed0b650eb953ef6222d90ffad0b46ace3798dcbe98218eaa9"} Oct 14 13:37:24.747877 master-2 kubenswrapper[4762]: I1014 13:37:24.747804 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-6b955f9d4b-jsrg8"] Oct 14 13:37:25.142080 master-2 kubenswrapper[4762]: I1014 13:37:25.142032 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-6b955f9d4b-jsrg8" Oct 14 13:37:25.163612 master-2 kubenswrapper[4762]: I1014 13:37:25.163524 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3f99144b-486d-4a4b-8b68-294fd70a0f49-config-data-custom\") pod \"3f99144b-486d-4a4b-8b68-294fd70a0f49\" (UID: \"3f99144b-486d-4a4b-8b68-294fd70a0f49\") " Oct 14 13:37:25.163877 master-2 kubenswrapper[4762]: I1014 13:37:25.163692 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f99144b-486d-4a4b-8b68-294fd70a0f49-config-data\") pod \"3f99144b-486d-4a4b-8b68-294fd70a0f49\" (UID: \"3f99144b-486d-4a4b-8b68-294fd70a0f49\") " Oct 14 13:37:25.163877 master-2 kubenswrapper[4762]: I1014 13:37:25.163786 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f99144b-486d-4a4b-8b68-294fd70a0f49-combined-ca-bundle\") pod \"3f99144b-486d-4a4b-8b68-294fd70a0f49\" (UID: \"3f99144b-486d-4a4b-8b68-294fd70a0f49\") " Oct 14 13:37:25.164138 master-2 kubenswrapper[4762]: I1014 13:37:25.164091 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phbr9\" (UniqueName: \"kubernetes.io/projected/3f99144b-486d-4a4b-8b68-294fd70a0f49-kube-api-access-phbr9\") pod \"3f99144b-486d-4a4b-8b68-294fd70a0f49\" (UID: \"3f99144b-486d-4a4b-8b68-294fd70a0f49\") " Oct 14 13:37:25.167966 master-2 kubenswrapper[4762]: I1014 13:37:25.167829 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f99144b-486d-4a4b-8b68-294fd70a0f49-kube-api-access-phbr9" (OuterVolumeSpecName: "kube-api-access-phbr9") pod "3f99144b-486d-4a4b-8b68-294fd70a0f49" (UID: "3f99144b-486d-4a4b-8b68-294fd70a0f49"). InnerVolumeSpecName "kube-api-access-phbr9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:37:25.169460 master-2 kubenswrapper[4762]: I1014 13:37:25.169408 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f99144b-486d-4a4b-8b68-294fd70a0f49-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3f99144b-486d-4a4b-8b68-294fd70a0f49" (UID: "3f99144b-486d-4a4b-8b68-294fd70a0f49"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:37:25.204478 master-2 kubenswrapper[4762]: I1014 13:37:25.204405 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f99144b-486d-4a4b-8b68-294fd70a0f49-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f99144b-486d-4a4b-8b68-294fd70a0f49" (UID: "3f99144b-486d-4a4b-8b68-294fd70a0f49"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:37:25.234928 master-2 kubenswrapper[4762]: I1014 13:37:25.234870 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f99144b-486d-4a4b-8b68-294fd70a0f49-config-data" (OuterVolumeSpecName: "config-data") pod "3f99144b-486d-4a4b-8b68-294fd70a0f49" (UID: "3f99144b-486d-4a4b-8b68-294fd70a0f49"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:37:25.266390 master-2 kubenswrapper[4762]: I1014 13:37:25.266321 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f99144b-486d-4a4b-8b68-294fd70a0f49-combined-ca-bundle\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:25.266390 master-2 kubenswrapper[4762]: I1014 13:37:25.266388 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phbr9\" (UniqueName: \"kubernetes.io/projected/3f99144b-486d-4a4b-8b68-294fd70a0f49-kube-api-access-phbr9\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:25.267488 master-2 kubenswrapper[4762]: I1014 13:37:25.266412 4762 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3f99144b-486d-4a4b-8b68-294fd70a0f49-config-data-custom\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:25.267488 master-2 kubenswrapper[4762]: I1014 13:37:25.266427 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f99144b-486d-4a4b-8b68-294fd70a0f49-config-data\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:25.438211 master-2 kubenswrapper[4762]: I1014 13:37:25.438028 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-55c4fcb4cb-kwz75"] Oct 14 13:37:25.714767 master-2 kubenswrapper[4762]: I1014 13:37:25.714686 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"be5e0faf-632d-4ec4-b2b9-725e14196910","Type":"ContainerStarted","Data":"4c000b0f5df01d424f3e285ead31550cad573d57422786c9a5d9e86ae39043fb"} Oct 14 13:37:25.717067 master-2 kubenswrapper[4762]: I1014 13:37:25.717002 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6b955f9d4b-jsrg8" event={"ID":"3f99144b-486d-4a4b-8b68-294fd70a0f49","Type":"ContainerDied","Data":"bcc28ada1134263390475214aa27377647298bc742bb11b35c3be3cb6ca56023"} Oct 14 13:37:25.717067 master-2 kubenswrapper[4762]: I1014 13:37:25.717046 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6b955f9d4b-jsrg8" Oct 14 13:37:25.717338 master-2 kubenswrapper[4762]: I1014 13:37:25.717091 4762 scope.go:117] "RemoveContainer" containerID="bb6eb9a360b2e405d2f01907a03cb8ddb371221578638deaae72596b50b7b11a" Oct 14 13:37:25.728338 master-2 kubenswrapper[4762]: I1014 13:37:25.725848 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-55c4fcb4cb-kwz75"] Oct 14 13:37:26.156031 master-2 kubenswrapper[4762]: I1014 13:37:26.155979 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-6vt26" Oct 14 13:37:26.192333 master-2 kubenswrapper[4762]: I1014 13:37:26.191911 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zlhc\" (UniqueName: \"kubernetes.io/projected/5987f041-7e28-466c-aa62-8901381b8413-kube-api-access-8zlhc\") pod \"5987f041-7e28-466c-aa62-8901381b8413\" (UID: \"5987f041-7e28-466c-aa62-8901381b8413\") " Oct 14 13:37:26.198034 master-2 kubenswrapper[4762]: I1014 13:37:26.197922 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5987f041-7e28-466c-aa62-8901381b8413-kube-api-access-8zlhc" (OuterVolumeSpecName: "kube-api-access-8zlhc") pod "5987f041-7e28-466c-aa62-8901381b8413" (UID: "5987f041-7e28-466c-aa62-8901381b8413"). 
InnerVolumeSpecName "kube-api-access-8zlhc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:37:26.295083 master-2 kubenswrapper[4762]: I1014 13:37:26.294955 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8zlhc\" (UniqueName: \"kubernetes.io/projected/5987f041-7e28-466c-aa62-8901381b8413-kube-api-access-8zlhc\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:26.730135 master-2 kubenswrapper[4762]: I1014 13:37:26.730052 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-6vt26" event={"ID":"5987f041-7e28-466c-aa62-8901381b8413","Type":"ContainerDied","Data":"f34c9b0d1ddaea667754544c2f4d62a117e9628fa0df034c0ea66b79d9111fe6"} Oct 14 13:37:26.730135 master-2 kubenswrapper[4762]: I1014 13:37:26.730098 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f34c9b0d1ddaea667754544c2f4d62a117e9628fa0df034c0ea66b79d9111fe6" Oct 14 13:37:26.730135 master-2 kubenswrapper[4762]: I1014 13:37:26.730077 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-6vt26" Oct 14 13:37:26.736068 master-2 kubenswrapper[4762]: I1014 13:37:26.735996 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"be5e0faf-632d-4ec4-b2b9-725e14196910","Type":"ContainerStarted","Data":"188a0feea6d959d28ff9d913ee643096ca7faf6709757960faf9d6d118354b8d"} Oct 14 13:37:27.567402 master-2 kubenswrapper[4762]: I1014 13:37:27.567322 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72401abc-aeab-47ed-98d0-15a765c5fb91" path="/var/lib/kubelet/pods/72401abc-aeab-47ed-98d0-15a765c5fb91/volumes" Oct 14 13:37:27.700261 master-2 kubenswrapper[4762]: I1014 13:37:27.698341 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-46645-api-0"] Oct 14 13:37:27.700261 master-2 kubenswrapper[4762]: I1014 13:37:27.698586 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-46645-api-0" podUID="79e251e7-c9ad-4c0f-b502-5b39d7168ee2" containerName="cinder-46645-api-log" containerID="cri-o://a8e56f2e7f8536d8c11eae78a35b2d13020574a622d3418aabf1fc48199d280c" gracePeriod=30 Oct 14 13:37:27.700261 master-2 kubenswrapper[4762]: I1014 13:37:27.698714 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-46645-api-0" podUID="79e251e7-c9ad-4c0f-b502-5b39d7168ee2" containerName="cinder-api" containerID="cri-o://84254745acf9fd64f6f181845e657957564db2cbf5ca52933f4618e0d0dd9383" gracePeriod=30 Oct 14 13:37:27.750672 master-2 kubenswrapper[4762]: I1014 13:37:27.750098 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"be5e0faf-632d-4ec4-b2b9-725e14196910","Type":"ContainerStarted","Data":"0b398321c2ceaec775d328776f10cc35f07c3b23fd338ef46444615dd7cd3a81"} Oct 14 13:37:27.751698 master-2 kubenswrapper[4762]: I1014 13:37:27.751232 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 14 13:37:27.758724 master-2 kubenswrapper[4762]: I1014 13:37:27.757938 4762 generic.go:334] "Generic (PLEG): container finished" podID="d0d4da85-9147-43ff-b96c-81e9d3fffd69" containerID="7c27c6d39f83f2f52d8377ded47b8c3676fd7fdb5632373346de8aa0a95a8345" exitCode=0 Oct 14 13:37:27.758724 master-2 kubenswrapper[4762]: I1014 13:37:27.758011 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" 
event={"ID":"d0d4da85-9147-43ff-b96c-81e9d3fffd69","Type":"ContainerDied","Data":"7c27c6d39f83f2f52d8377ded47b8c3676fd7fdb5632373346de8aa0a95a8345"} Oct 14 13:37:28.772814 master-2 kubenswrapper[4762]: I1014 13:37:28.772718 4762 generic.go:334] "Generic (PLEG): container finished" podID="79e251e7-c9ad-4c0f-b502-5b39d7168ee2" containerID="a8e56f2e7f8536d8c11eae78a35b2d13020574a622d3418aabf1fc48199d280c" exitCode=143 Oct 14 13:37:28.773980 master-2 kubenswrapper[4762]: I1014 13:37:28.773913 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-46645-api-0" event={"ID":"79e251e7-c9ad-4c0f-b502-5b39d7168ee2","Type":"ContainerDied","Data":"a8e56f2e7f8536d8c11eae78a35b2d13020574a622d3418aabf1fc48199d280c"} Oct 14 13:37:30.763672 master-2 kubenswrapper[4762]: I1014 13:37:30.763088 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=7.696520831 podStartE2EDuration="11.763054501s" podCreationTimestamp="2025-10-14 13:37:19 +0000 UTC" firstStartedPulling="2025-10-14 13:37:23.36698432 +0000 UTC m=+1872.611143479" lastFinishedPulling="2025-10-14 13:37:27.43351796 +0000 UTC m=+1876.677677149" observedRunningTime="2025-10-14 13:37:28.745338737 +0000 UTC m=+1877.989497916" watchObservedRunningTime="2025-10-14 13:37:30.763054501 +0000 UTC m=+1880.007213660" Oct 14 13:37:30.765006 master-2 kubenswrapper[4762]: I1014 13:37:30.764926 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-6b955f9d4b-jsrg8"] Oct 14 13:37:30.772930 master-2 kubenswrapper[4762]: I1014 13:37:30.772827 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-6b955f9d4b-jsrg8"] Oct 14 13:37:31.413451 master-2 kubenswrapper[4762]: I1014 13:37:31.413332 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-46645-api-0" podUID="79e251e7-c9ad-4c0f-b502-5b39d7168ee2" containerName="cinder-api" probeResult="failure" output="Get \"http://10.129.0.134:8776/healthcheck\": dial tcp 10.129.0.134:8776: connect: connection refused" Oct 14 13:37:31.561474 master-2 kubenswrapper[4762]: I1014 13:37:31.561408 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f99144b-486d-4a4b-8b68-294fd70a0f49" path="/var/lib/kubelet/pods/3f99144b-486d-4a4b-8b68-294fd70a0f49/volumes" Oct 14 13:37:31.803268 master-2 kubenswrapper[4762]: I1014 13:37:31.803127 4762 generic.go:334] "Generic (PLEG): container finished" podID="79e251e7-c9ad-4c0f-b502-5b39d7168ee2" containerID="84254745acf9fd64f6f181845e657957564db2cbf5ca52933f4618e0d0dd9383" exitCode=0 Oct 14 13:37:31.803863 master-2 kubenswrapper[4762]: I1014 13:37:31.803472 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-46645-api-0" event={"ID":"79e251e7-c9ad-4c0f-b502-5b39d7168ee2","Type":"ContainerDied","Data":"84254745acf9fd64f6f181845e657957564db2cbf5ca52933f4618e0d0dd9383"} Oct 14 13:37:32.769857 master-2 kubenswrapper[4762]: I1014 13:37:32.769736 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:37:32.770615 master-2 kubenswrapper[4762]: I1014 13:37:32.770066 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="be5e0faf-632d-4ec4-b2b9-725e14196910" containerName="ceilometer-central-agent" containerID="cri-o://2b5b90dd8628826ed0b650eb953ef6222d90ffad0b46ace3798dcbe98218eaa9" gracePeriod=30 Oct 14 13:37:32.770615 master-2 kubenswrapper[4762]: I1014 13:37:32.770136 4762 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="be5e0faf-632d-4ec4-b2b9-725e14196910" containerName="sg-core" containerID="cri-o://188a0feea6d959d28ff9d913ee643096ca7faf6709757960faf9d6d118354b8d" gracePeriod=30 Oct 14 13:37:32.770615 master-2 kubenswrapper[4762]: I1014 13:37:32.770226 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="be5e0faf-632d-4ec4-b2b9-725e14196910" containerName="ceilometer-notification-agent" containerID="cri-o://4c000b0f5df01d424f3e285ead31550cad573d57422786c9a5d9e86ae39043fb" gracePeriod=30 Oct 14 13:37:32.770615 master-2 kubenswrapper[4762]: I1014 13:37:32.770226 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="be5e0faf-632d-4ec4-b2b9-725e14196910" containerName="proxy-httpd" containerID="cri-o://0b398321c2ceaec775d328776f10cc35f07c3b23fd338ef46444615dd7cd3a81" gracePeriod=30 Oct 14 13:37:32.945531 master-2 kubenswrapper[4762]: I1014 13:37:32.945491 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-46645-api-0" Oct 14 13:37:33.051266 master-2 kubenswrapper[4762]: I1014 13:37:33.050696 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79e251e7-c9ad-4c0f-b502-5b39d7168ee2-scripts\") pod \"79e251e7-c9ad-4c0f-b502-5b39d7168ee2\" (UID: \"79e251e7-c9ad-4c0f-b502-5b39d7168ee2\") " Oct 14 13:37:33.051415 master-2 kubenswrapper[4762]: I1014 13:37:33.051292 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/79e251e7-c9ad-4c0f-b502-5b39d7168ee2-etc-machine-id\") pod \"79e251e7-c9ad-4c0f-b502-5b39d7168ee2\" (UID: \"79e251e7-c9ad-4c0f-b502-5b39d7168ee2\") " Oct 14 13:37:33.051415 master-2 kubenswrapper[4762]: I1014 13:37:33.051404 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79e251e7-c9ad-4c0f-b502-5b39d7168ee2-logs\") pod \"79e251e7-c9ad-4c0f-b502-5b39d7168ee2\" (UID: \"79e251e7-c9ad-4c0f-b502-5b39d7168ee2\") " Oct 14 13:37:33.051541 master-2 kubenswrapper[4762]: I1014 13:37:33.051454 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8pxx\" (UniqueName: \"kubernetes.io/projected/79e251e7-c9ad-4c0f-b502-5b39d7168ee2-kube-api-access-q8pxx\") pod \"79e251e7-c9ad-4c0f-b502-5b39d7168ee2\" (UID: \"79e251e7-c9ad-4c0f-b502-5b39d7168ee2\") " Oct 14 13:37:33.052202 master-2 kubenswrapper[4762]: I1014 13:37:33.051671 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79e251e7-c9ad-4c0f-b502-5b39d7168ee2-config-data-custom\") pod \"79e251e7-c9ad-4c0f-b502-5b39d7168ee2\" (UID: \"79e251e7-c9ad-4c0f-b502-5b39d7168ee2\") " Oct 14 13:37:33.052202 master-2 kubenswrapper[4762]: I1014 13:37:33.051802 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79e251e7-c9ad-4c0f-b502-5b39d7168ee2-config-data\") pod \"79e251e7-c9ad-4c0f-b502-5b39d7168ee2\" (UID: \"79e251e7-c9ad-4c0f-b502-5b39d7168ee2\") " Oct 14 13:37:33.052202 master-2 kubenswrapper[4762]: I1014 13:37:33.051827 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/79e251e7-c9ad-4c0f-b502-5b39d7168ee2-combined-ca-bundle\") pod \"79e251e7-c9ad-4c0f-b502-5b39d7168ee2\" (UID: \"79e251e7-c9ad-4c0f-b502-5b39d7168ee2\") " Oct 14 13:37:33.052202 master-2 kubenswrapper[4762]: I1014 13:37:33.052109 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79e251e7-c9ad-4c0f-b502-5b39d7168ee2-logs" (OuterVolumeSpecName: "logs") pod "79e251e7-c9ad-4c0f-b502-5b39d7168ee2" (UID: "79e251e7-c9ad-4c0f-b502-5b39d7168ee2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:37:33.052599 master-2 kubenswrapper[4762]: I1014 13:37:33.052512 4762 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79e251e7-c9ad-4c0f-b502-5b39d7168ee2-logs\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:33.052753 master-2 kubenswrapper[4762]: I1014 13:37:33.052714 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79e251e7-c9ad-4c0f-b502-5b39d7168ee2-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "79e251e7-c9ad-4c0f-b502-5b39d7168ee2" (UID: "79e251e7-c9ad-4c0f-b502-5b39d7168ee2"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 13:37:33.065865 master-2 kubenswrapper[4762]: I1014 13:37:33.065789 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79e251e7-c9ad-4c0f-b502-5b39d7168ee2-kube-api-access-q8pxx" (OuterVolumeSpecName: "kube-api-access-q8pxx") pod "79e251e7-c9ad-4c0f-b502-5b39d7168ee2" (UID: "79e251e7-c9ad-4c0f-b502-5b39d7168ee2"). InnerVolumeSpecName "kube-api-access-q8pxx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:37:33.075826 master-2 kubenswrapper[4762]: I1014 13:37:33.075638 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79e251e7-c9ad-4c0f-b502-5b39d7168ee2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "79e251e7-c9ad-4c0f-b502-5b39d7168ee2" (UID: "79e251e7-c9ad-4c0f-b502-5b39d7168ee2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:37:33.087791 master-2 kubenswrapper[4762]: I1014 13:37:33.087724 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79e251e7-c9ad-4c0f-b502-5b39d7168ee2-scripts" (OuterVolumeSpecName: "scripts") pod "79e251e7-c9ad-4c0f-b502-5b39d7168ee2" (UID: "79e251e7-c9ad-4c0f-b502-5b39d7168ee2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:37:33.088322 master-2 kubenswrapper[4762]: I1014 13:37:33.088248 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79e251e7-c9ad-4c0f-b502-5b39d7168ee2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "79e251e7-c9ad-4c0f-b502-5b39d7168ee2" (UID: "79e251e7-c9ad-4c0f-b502-5b39d7168ee2"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:37:33.105331 master-2 kubenswrapper[4762]: I1014 13:37:33.105273 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79e251e7-c9ad-4c0f-b502-5b39d7168ee2-config-data" (OuterVolumeSpecName: "config-data") pod "79e251e7-c9ad-4c0f-b502-5b39d7168ee2" (UID: "79e251e7-c9ad-4c0f-b502-5b39d7168ee2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:37:33.153707 master-2 kubenswrapper[4762]: I1014 13:37:33.153653 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79e251e7-c9ad-4c0f-b502-5b39d7168ee2-scripts\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:33.153707 master-2 kubenswrapper[4762]: I1014 13:37:33.153692 4762 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/79e251e7-c9ad-4c0f-b502-5b39d7168ee2-etc-machine-id\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:33.153707 master-2 kubenswrapper[4762]: I1014 13:37:33.153704 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8pxx\" (UniqueName: \"kubernetes.io/projected/79e251e7-c9ad-4c0f-b502-5b39d7168ee2-kube-api-access-q8pxx\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:33.153707 master-2 kubenswrapper[4762]: I1014 13:37:33.153714 4762 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79e251e7-c9ad-4c0f-b502-5b39d7168ee2-config-data-custom\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:33.153707 master-2 kubenswrapper[4762]: I1014 13:37:33.153723 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79e251e7-c9ad-4c0f-b502-5b39d7168ee2-config-data\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:33.154036 master-2 kubenswrapper[4762]: I1014 13:37:33.153731 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79e251e7-c9ad-4c0f-b502-5b39d7168ee2-combined-ca-bundle\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:33.567973 master-2 kubenswrapper[4762]: I1014 13:37:33.567930 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:37:33.662800 master-2 kubenswrapper[4762]: I1014 13:37:33.662698 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be5e0faf-632d-4ec4-b2b9-725e14196910-combined-ca-bundle\") pod \"be5e0faf-632d-4ec4-b2b9-725e14196910\" (UID: \"be5e0faf-632d-4ec4-b2b9-725e14196910\") " Oct 14 13:37:33.663037 master-2 kubenswrapper[4762]: I1014 13:37:33.662839 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be5e0faf-632d-4ec4-b2b9-725e14196910-run-httpd\") pod \"be5e0faf-632d-4ec4-b2b9-725e14196910\" (UID: \"be5e0faf-632d-4ec4-b2b9-725e14196910\") " Oct 14 13:37:33.663037 master-2 kubenswrapper[4762]: I1014 13:37:33.662941 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be5e0faf-632d-4ec4-b2b9-725e14196910-config-data\") pod \"be5e0faf-632d-4ec4-b2b9-725e14196910\" (UID: \"be5e0faf-632d-4ec4-b2b9-725e14196910\") " Oct 14 13:37:33.663037 master-2 kubenswrapper[4762]: I1014 13:37:33.662990 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be5e0faf-632d-4ec4-b2b9-725e14196910-log-httpd\") pod \"be5e0faf-632d-4ec4-b2b9-725e14196910\" (UID: \"be5e0faf-632d-4ec4-b2b9-725e14196910\") " Oct 14 13:37:33.663037 master-2 kubenswrapper[4762]: I1014 13:37:33.663024 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/be5e0faf-632d-4ec4-b2b9-725e14196910-sg-core-conf-yaml\") pod \"be5e0faf-632d-4ec4-b2b9-725e14196910\" (UID: \"be5e0faf-632d-4ec4-b2b9-725e14196910\") " Oct 14 13:37:33.664384 master-2 kubenswrapper[4762]: I1014 13:37:33.663073 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vq9pc\" (UniqueName: \"kubernetes.io/projected/be5e0faf-632d-4ec4-b2b9-725e14196910-kube-api-access-vq9pc\") pod \"be5e0faf-632d-4ec4-b2b9-725e14196910\" (UID: \"be5e0faf-632d-4ec4-b2b9-725e14196910\") " Oct 14 13:37:33.664384 master-2 kubenswrapper[4762]: I1014 13:37:33.663127 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be5e0faf-632d-4ec4-b2b9-725e14196910-scripts\") pod \"be5e0faf-632d-4ec4-b2b9-725e14196910\" (UID: \"be5e0faf-632d-4ec4-b2b9-725e14196910\") " Oct 14 13:37:33.664384 master-2 kubenswrapper[4762]: I1014 13:37:33.663427 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be5e0faf-632d-4ec4-b2b9-725e14196910-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "be5e0faf-632d-4ec4-b2b9-725e14196910" (UID: "be5e0faf-632d-4ec4-b2b9-725e14196910"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:37:33.664384 master-2 kubenswrapper[4762]: I1014 13:37:33.663548 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be5e0faf-632d-4ec4-b2b9-725e14196910-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "be5e0faf-632d-4ec4-b2b9-725e14196910" (UID: "be5e0faf-632d-4ec4-b2b9-725e14196910"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:37:33.664704 master-2 kubenswrapper[4762]: I1014 13:37:33.664657 4762 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be5e0faf-632d-4ec4-b2b9-725e14196910-run-httpd\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:33.664704 master-2 kubenswrapper[4762]: I1014 13:37:33.664695 4762 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/be5e0faf-632d-4ec4-b2b9-725e14196910-log-httpd\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:33.668439 master-2 kubenswrapper[4762]: I1014 13:37:33.668352 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be5e0faf-632d-4ec4-b2b9-725e14196910-kube-api-access-vq9pc" (OuterVolumeSpecName: "kube-api-access-vq9pc") pod "be5e0faf-632d-4ec4-b2b9-725e14196910" (UID: "be5e0faf-632d-4ec4-b2b9-725e14196910"). InnerVolumeSpecName "kube-api-access-vq9pc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:37:33.669054 master-2 kubenswrapper[4762]: I1014 13:37:33.669008 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be5e0faf-632d-4ec4-b2b9-725e14196910-scripts" (OuterVolumeSpecName: "scripts") pod "be5e0faf-632d-4ec4-b2b9-725e14196910" (UID: "be5e0faf-632d-4ec4-b2b9-725e14196910"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:37:33.686960 master-2 kubenswrapper[4762]: I1014 13:37:33.686887 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be5e0faf-632d-4ec4-b2b9-725e14196910-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "be5e0faf-632d-4ec4-b2b9-725e14196910" (UID: "be5e0faf-632d-4ec4-b2b9-725e14196910"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:37:33.736831 master-2 kubenswrapper[4762]: I1014 13:37:33.736771 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be5e0faf-632d-4ec4-b2b9-725e14196910-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "be5e0faf-632d-4ec4-b2b9-725e14196910" (UID: "be5e0faf-632d-4ec4-b2b9-725e14196910"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:37:33.755696 master-2 kubenswrapper[4762]: I1014 13:37:33.755624 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be5e0faf-632d-4ec4-b2b9-725e14196910-config-data" (OuterVolumeSpecName: "config-data") pod "be5e0faf-632d-4ec4-b2b9-725e14196910" (UID: "be5e0faf-632d-4ec4-b2b9-725e14196910"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:37:33.767410 master-2 kubenswrapper[4762]: I1014 13:37:33.767322 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be5e0faf-632d-4ec4-b2b9-725e14196910-combined-ca-bundle\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:33.767410 master-2 kubenswrapper[4762]: I1014 13:37:33.767358 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be5e0faf-632d-4ec4-b2b9-725e14196910-config-data\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:33.767410 master-2 kubenswrapper[4762]: I1014 13:37:33.767373 4762 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/be5e0faf-632d-4ec4-b2b9-725e14196910-sg-core-conf-yaml\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:33.767410 master-2 kubenswrapper[4762]: I1014 13:37:33.767387 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vq9pc\" (UniqueName: \"kubernetes.io/projected/be5e0faf-632d-4ec4-b2b9-725e14196910-kube-api-access-vq9pc\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:33.767410 master-2 kubenswrapper[4762]: I1014 13:37:33.767402 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be5e0faf-632d-4ec4-b2b9-725e14196910-scripts\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:33.824230 master-2 kubenswrapper[4762]: I1014 13:37:33.822788 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"d0d4da85-9147-43ff-b96c-81e9d3fffd69","Type":"ContainerStarted","Data":"2120c53f4048f99f1477c2478abbbf6df26150910844b8bebb3bb82702165818"} Oct 14 13:37:33.826949 master-2 kubenswrapper[4762]: I1014 13:37:33.826921 4762 generic.go:334] "Generic (PLEG): container finished" podID="be5e0faf-632d-4ec4-b2b9-725e14196910" containerID="0b398321c2ceaec775d328776f10cc35f07c3b23fd338ef46444615dd7cd3a81" exitCode=0 Oct 14 13:37:33.827037 master-2 kubenswrapper[4762]: I1014 13:37:33.826948 4762 generic.go:334] "Generic (PLEG): container finished" podID="be5e0faf-632d-4ec4-b2b9-725e14196910" containerID="188a0feea6d959d28ff9d913ee643096ca7faf6709757960faf9d6d118354b8d" exitCode=2 Oct 14 13:37:33.827037 master-2 kubenswrapper[4762]: I1014 13:37:33.826958 4762 generic.go:334] "Generic (PLEG): container finished" podID="be5e0faf-632d-4ec4-b2b9-725e14196910" containerID="4c000b0f5df01d424f3e285ead31550cad573d57422786c9a5d9e86ae39043fb" exitCode=0 Oct 14 13:37:33.827037 master-2 kubenswrapper[4762]: I1014 13:37:33.826968 4762 generic.go:334] "Generic (PLEG): container finished" podID="be5e0faf-632d-4ec4-b2b9-725e14196910" containerID="2b5b90dd8628826ed0b650eb953ef6222d90ffad0b46ace3798dcbe98218eaa9" exitCode=0 Oct 14 13:37:33.827037 master-2 kubenswrapper[4762]: I1014 13:37:33.827003 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"be5e0faf-632d-4ec4-b2b9-725e14196910","Type":"ContainerDied","Data":"0b398321c2ceaec775d328776f10cc35f07c3b23fd338ef46444615dd7cd3a81"} Oct 14 13:37:33.827037 master-2 kubenswrapper[4762]: I1014 13:37:33.827026 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"be5e0faf-632d-4ec4-b2b9-725e14196910","Type":"ContainerDied","Data":"188a0feea6d959d28ff9d913ee643096ca7faf6709757960faf9d6d118354b8d"} Oct 14 13:37:33.827037 master-2 kubenswrapper[4762]: I1014 13:37:33.827038 
4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"be5e0faf-632d-4ec4-b2b9-725e14196910","Type":"ContainerDied","Data":"4c000b0f5df01d424f3e285ead31550cad573d57422786c9a5d9e86ae39043fb"} Oct 14 13:37:33.827325 master-2 kubenswrapper[4762]: I1014 13:37:33.827052 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"be5e0faf-632d-4ec4-b2b9-725e14196910","Type":"ContainerDied","Data":"2b5b90dd8628826ed0b650eb953ef6222d90ffad0b46ace3798dcbe98218eaa9"} Oct 14 13:37:33.827325 master-2 kubenswrapper[4762]: I1014 13:37:33.827062 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"be5e0faf-632d-4ec4-b2b9-725e14196910","Type":"ContainerDied","Data":"77d5d17cdf70d8c5624b4403fd567db3cad57e283e6714120250a671856891db"} Oct 14 13:37:33.827325 master-2 kubenswrapper[4762]: I1014 13:37:33.827081 4762 scope.go:117] "RemoveContainer" containerID="0b398321c2ceaec775d328776f10cc35f07c3b23fd338ef46444615dd7cd3a81" Oct 14 13:37:33.827325 master-2 kubenswrapper[4762]: I1014 13:37:33.827262 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:37:33.843317 master-2 kubenswrapper[4762]: I1014 13:37:33.840760 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-46645-api-0" event={"ID":"79e251e7-c9ad-4c0f-b502-5b39d7168ee2","Type":"ContainerDied","Data":"1f2a1cc825c0bc9d817d14278d1d450d401ff127d85874f61dfb2b91a65a0688"} Oct 14 13:37:33.843317 master-2 kubenswrapper[4762]: I1014 13:37:33.841042 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-46645-api-0" Oct 14 13:37:33.876056 master-2 kubenswrapper[4762]: I1014 13:37:33.875973 4762 scope.go:117] "RemoveContainer" containerID="188a0feea6d959d28ff9d913ee643096ca7faf6709757960faf9d6d118354b8d" Oct 14 13:37:33.902088 master-2 kubenswrapper[4762]: I1014 13:37:33.902026 4762 scope.go:117] "RemoveContainer" containerID="4c000b0f5df01d424f3e285ead31550cad573d57422786c9a5d9e86ae39043fb" Oct 14 13:37:33.933910 master-2 kubenswrapper[4762]: I1014 13:37:33.933861 4762 scope.go:117] "RemoveContainer" containerID="2b5b90dd8628826ed0b650eb953ef6222d90ffad0b46ace3798dcbe98218eaa9" Oct 14 13:37:33.957317 master-2 kubenswrapper[4762]: I1014 13:37:33.957253 4762 scope.go:117] "RemoveContainer" containerID="0b398321c2ceaec775d328776f10cc35f07c3b23fd338ef46444615dd7cd3a81" Oct 14 13:37:33.958542 master-2 kubenswrapper[4762]: E1014 13:37:33.958245 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b398321c2ceaec775d328776f10cc35f07c3b23fd338ef46444615dd7cd3a81\": container with ID starting with 0b398321c2ceaec775d328776f10cc35f07c3b23fd338ef46444615dd7cd3a81 not found: ID does not exist" containerID="0b398321c2ceaec775d328776f10cc35f07c3b23fd338ef46444615dd7cd3a81" Oct 14 13:37:33.958542 master-2 kubenswrapper[4762]: I1014 13:37:33.958324 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b398321c2ceaec775d328776f10cc35f07c3b23fd338ef46444615dd7cd3a81"} err="failed to get container status \"0b398321c2ceaec775d328776f10cc35f07c3b23fd338ef46444615dd7cd3a81\": rpc error: code = NotFound desc = could not find container \"0b398321c2ceaec775d328776f10cc35f07c3b23fd338ef46444615dd7cd3a81\": container with ID starting with 0b398321c2ceaec775d328776f10cc35f07c3b23fd338ef46444615dd7cd3a81 not 
found: ID does not exist" Oct 14 13:37:33.958542 master-2 kubenswrapper[4762]: I1014 13:37:33.958380 4762 scope.go:117] "RemoveContainer" containerID="188a0feea6d959d28ff9d913ee643096ca7faf6709757960faf9d6d118354b8d" Oct 14 13:37:33.958972 master-2 kubenswrapper[4762]: E1014 13:37:33.958915 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"188a0feea6d959d28ff9d913ee643096ca7faf6709757960faf9d6d118354b8d\": container with ID starting with 188a0feea6d959d28ff9d913ee643096ca7faf6709757960faf9d6d118354b8d not found: ID does not exist" containerID="188a0feea6d959d28ff9d913ee643096ca7faf6709757960faf9d6d118354b8d" Oct 14 13:37:33.959097 master-2 kubenswrapper[4762]: I1014 13:37:33.958966 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"188a0feea6d959d28ff9d913ee643096ca7faf6709757960faf9d6d118354b8d"} err="failed to get container status \"188a0feea6d959d28ff9d913ee643096ca7faf6709757960faf9d6d118354b8d\": rpc error: code = NotFound desc = could not find container \"188a0feea6d959d28ff9d913ee643096ca7faf6709757960faf9d6d118354b8d\": container with ID starting with 188a0feea6d959d28ff9d913ee643096ca7faf6709757960faf9d6d118354b8d not found: ID does not exist" Oct 14 13:37:33.959097 master-2 kubenswrapper[4762]: I1014 13:37:33.958999 4762 scope.go:117] "RemoveContainer" containerID="4c000b0f5df01d424f3e285ead31550cad573d57422786c9a5d9e86ae39043fb" Oct 14 13:37:33.959601 master-2 kubenswrapper[4762]: E1014 13:37:33.959537 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c000b0f5df01d424f3e285ead31550cad573d57422786c9a5d9e86ae39043fb\": container with ID starting with 4c000b0f5df01d424f3e285ead31550cad573d57422786c9a5d9e86ae39043fb not found: ID does not exist" containerID="4c000b0f5df01d424f3e285ead31550cad573d57422786c9a5d9e86ae39043fb" Oct 14 13:37:33.959777 master-2 kubenswrapper[4762]: I1014 13:37:33.959609 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c000b0f5df01d424f3e285ead31550cad573d57422786c9a5d9e86ae39043fb"} err="failed to get container status \"4c000b0f5df01d424f3e285ead31550cad573d57422786c9a5d9e86ae39043fb\": rpc error: code = NotFound desc = could not find container \"4c000b0f5df01d424f3e285ead31550cad573d57422786c9a5d9e86ae39043fb\": container with ID starting with 4c000b0f5df01d424f3e285ead31550cad573d57422786c9a5d9e86ae39043fb not found: ID does not exist" Oct 14 13:37:33.959777 master-2 kubenswrapper[4762]: I1014 13:37:33.959654 4762 scope.go:117] "RemoveContainer" containerID="2b5b90dd8628826ed0b650eb953ef6222d90ffad0b46ace3798dcbe98218eaa9" Oct 14 13:37:33.960129 master-2 kubenswrapper[4762]: E1014 13:37:33.959999 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b5b90dd8628826ed0b650eb953ef6222d90ffad0b46ace3798dcbe98218eaa9\": container with ID starting with 2b5b90dd8628826ed0b650eb953ef6222d90ffad0b46ace3798dcbe98218eaa9 not found: ID does not exist" containerID="2b5b90dd8628826ed0b650eb953ef6222d90ffad0b46ace3798dcbe98218eaa9" Oct 14 13:37:33.960129 master-2 kubenswrapper[4762]: I1014 13:37:33.960024 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b5b90dd8628826ed0b650eb953ef6222d90ffad0b46ace3798dcbe98218eaa9"} err="failed to get container status 
\"2b5b90dd8628826ed0b650eb953ef6222d90ffad0b46ace3798dcbe98218eaa9\": rpc error: code = NotFound desc = could not find container \"2b5b90dd8628826ed0b650eb953ef6222d90ffad0b46ace3798dcbe98218eaa9\": container with ID starting with 2b5b90dd8628826ed0b650eb953ef6222d90ffad0b46ace3798dcbe98218eaa9 not found: ID does not exist" Oct 14 13:37:33.960129 master-2 kubenswrapper[4762]: I1014 13:37:33.960039 4762 scope.go:117] "RemoveContainer" containerID="0b398321c2ceaec775d328776f10cc35f07c3b23fd338ef46444615dd7cd3a81" Oct 14 13:37:33.960478 master-2 kubenswrapper[4762]: I1014 13:37:33.960454 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b398321c2ceaec775d328776f10cc35f07c3b23fd338ef46444615dd7cd3a81"} err="failed to get container status \"0b398321c2ceaec775d328776f10cc35f07c3b23fd338ef46444615dd7cd3a81\": rpc error: code = NotFound desc = could not find container \"0b398321c2ceaec775d328776f10cc35f07c3b23fd338ef46444615dd7cd3a81\": container with ID starting with 0b398321c2ceaec775d328776f10cc35f07c3b23fd338ef46444615dd7cd3a81 not found: ID does not exist" Oct 14 13:37:33.960478 master-2 kubenswrapper[4762]: I1014 13:37:33.960473 4762 scope.go:117] "RemoveContainer" containerID="188a0feea6d959d28ff9d913ee643096ca7faf6709757960faf9d6d118354b8d" Oct 14 13:37:33.960748 master-2 kubenswrapper[4762]: I1014 13:37:33.960704 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"188a0feea6d959d28ff9d913ee643096ca7faf6709757960faf9d6d118354b8d"} err="failed to get container status \"188a0feea6d959d28ff9d913ee643096ca7faf6709757960faf9d6d118354b8d\": rpc error: code = NotFound desc = could not find container \"188a0feea6d959d28ff9d913ee643096ca7faf6709757960faf9d6d118354b8d\": container with ID starting with 188a0feea6d959d28ff9d913ee643096ca7faf6709757960faf9d6d118354b8d not found: ID does not exist" Oct 14 13:37:33.960748 master-2 kubenswrapper[4762]: I1014 13:37:33.960736 4762 scope.go:117] "RemoveContainer" containerID="4c000b0f5df01d424f3e285ead31550cad573d57422786c9a5d9e86ae39043fb" Oct 14 13:37:33.960992 master-2 kubenswrapper[4762]: I1014 13:37:33.960958 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c000b0f5df01d424f3e285ead31550cad573d57422786c9a5d9e86ae39043fb"} err="failed to get container status \"4c000b0f5df01d424f3e285ead31550cad573d57422786c9a5d9e86ae39043fb\": rpc error: code = NotFound desc = could not find container \"4c000b0f5df01d424f3e285ead31550cad573d57422786c9a5d9e86ae39043fb\": container with ID starting with 4c000b0f5df01d424f3e285ead31550cad573d57422786c9a5d9e86ae39043fb not found: ID does not exist" Oct 14 13:37:33.960992 master-2 kubenswrapper[4762]: I1014 13:37:33.960979 4762 scope.go:117] "RemoveContainer" containerID="2b5b90dd8628826ed0b650eb953ef6222d90ffad0b46ace3798dcbe98218eaa9" Oct 14 13:37:33.961312 master-2 kubenswrapper[4762]: I1014 13:37:33.961274 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b5b90dd8628826ed0b650eb953ef6222d90ffad0b46ace3798dcbe98218eaa9"} err="failed to get container status \"2b5b90dd8628826ed0b650eb953ef6222d90ffad0b46ace3798dcbe98218eaa9\": rpc error: code = NotFound desc = could not find container \"2b5b90dd8628826ed0b650eb953ef6222d90ffad0b46ace3798dcbe98218eaa9\": container with ID starting with 2b5b90dd8628826ed0b650eb953ef6222d90ffad0b46ace3798dcbe98218eaa9 not found: ID does not exist" Oct 14 13:37:33.961419 master-2 
kubenswrapper[4762]: I1014 13:37:33.961314 4762 scope.go:117] "RemoveContainer" containerID="0b398321c2ceaec775d328776f10cc35f07c3b23fd338ef46444615dd7cd3a81" Oct 14 13:37:33.961733 master-2 kubenswrapper[4762]: I1014 13:37:33.961611 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b398321c2ceaec775d328776f10cc35f07c3b23fd338ef46444615dd7cd3a81"} err="failed to get container status \"0b398321c2ceaec775d328776f10cc35f07c3b23fd338ef46444615dd7cd3a81\": rpc error: code = NotFound desc = could not find container \"0b398321c2ceaec775d328776f10cc35f07c3b23fd338ef46444615dd7cd3a81\": container with ID starting with 0b398321c2ceaec775d328776f10cc35f07c3b23fd338ef46444615dd7cd3a81 not found: ID does not exist" Oct 14 13:37:33.961733 master-2 kubenswrapper[4762]: I1014 13:37:33.961646 4762 scope.go:117] "RemoveContainer" containerID="188a0feea6d959d28ff9d913ee643096ca7faf6709757960faf9d6d118354b8d" Oct 14 13:37:33.962044 master-2 kubenswrapper[4762]: I1014 13:37:33.962014 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"188a0feea6d959d28ff9d913ee643096ca7faf6709757960faf9d6d118354b8d"} err="failed to get container status \"188a0feea6d959d28ff9d913ee643096ca7faf6709757960faf9d6d118354b8d\": rpc error: code = NotFound desc = could not find container \"188a0feea6d959d28ff9d913ee643096ca7faf6709757960faf9d6d118354b8d\": container with ID starting with 188a0feea6d959d28ff9d913ee643096ca7faf6709757960faf9d6d118354b8d not found: ID does not exist" Oct 14 13:37:33.962044 master-2 kubenswrapper[4762]: I1014 13:37:33.962039 4762 scope.go:117] "RemoveContainer" containerID="4c000b0f5df01d424f3e285ead31550cad573d57422786c9a5d9e86ae39043fb" Oct 14 13:37:33.962554 master-2 kubenswrapper[4762]: I1014 13:37:33.962517 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c000b0f5df01d424f3e285ead31550cad573d57422786c9a5d9e86ae39043fb"} err="failed to get container status \"4c000b0f5df01d424f3e285ead31550cad573d57422786c9a5d9e86ae39043fb\": rpc error: code = NotFound desc = could not find container \"4c000b0f5df01d424f3e285ead31550cad573d57422786c9a5d9e86ae39043fb\": container with ID starting with 4c000b0f5df01d424f3e285ead31550cad573d57422786c9a5d9e86ae39043fb not found: ID does not exist" Oct 14 13:37:33.962670 master-2 kubenswrapper[4762]: I1014 13:37:33.962553 4762 scope.go:117] "RemoveContainer" containerID="2b5b90dd8628826ed0b650eb953ef6222d90ffad0b46ace3798dcbe98218eaa9" Oct 14 13:37:33.962981 master-2 kubenswrapper[4762]: I1014 13:37:33.962863 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b5b90dd8628826ed0b650eb953ef6222d90ffad0b46ace3798dcbe98218eaa9"} err="failed to get container status \"2b5b90dd8628826ed0b650eb953ef6222d90ffad0b46ace3798dcbe98218eaa9\": rpc error: code = NotFound desc = could not find container \"2b5b90dd8628826ed0b650eb953ef6222d90ffad0b46ace3798dcbe98218eaa9\": container with ID starting with 2b5b90dd8628826ed0b650eb953ef6222d90ffad0b46ace3798dcbe98218eaa9 not found: ID does not exist" Oct 14 13:37:33.962981 master-2 kubenswrapper[4762]: I1014 13:37:33.962898 4762 scope.go:117] "RemoveContainer" containerID="0b398321c2ceaec775d328776f10cc35f07c3b23fd338ef46444615dd7cd3a81" Oct 14 13:37:33.963245 master-2 kubenswrapper[4762]: I1014 13:37:33.963210 4762 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0b398321c2ceaec775d328776f10cc35f07c3b23fd338ef46444615dd7cd3a81"} err="failed to get container status \"0b398321c2ceaec775d328776f10cc35f07c3b23fd338ef46444615dd7cd3a81\": rpc error: code = NotFound desc = could not find container \"0b398321c2ceaec775d328776f10cc35f07c3b23fd338ef46444615dd7cd3a81\": container with ID starting with 0b398321c2ceaec775d328776f10cc35f07c3b23fd338ef46444615dd7cd3a81 not found: ID does not exist" Oct 14 13:37:33.963360 master-2 kubenswrapper[4762]: I1014 13:37:33.963243 4762 scope.go:117] "RemoveContainer" containerID="188a0feea6d959d28ff9d913ee643096ca7faf6709757960faf9d6d118354b8d" Oct 14 13:37:33.963847 master-2 kubenswrapper[4762]: I1014 13:37:33.963818 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"188a0feea6d959d28ff9d913ee643096ca7faf6709757960faf9d6d118354b8d"} err="failed to get container status \"188a0feea6d959d28ff9d913ee643096ca7faf6709757960faf9d6d118354b8d\": rpc error: code = NotFound desc = could not find container \"188a0feea6d959d28ff9d913ee643096ca7faf6709757960faf9d6d118354b8d\": container with ID starting with 188a0feea6d959d28ff9d913ee643096ca7faf6709757960faf9d6d118354b8d not found: ID does not exist" Oct 14 13:37:33.963847 master-2 kubenswrapper[4762]: I1014 13:37:33.963841 4762 scope.go:117] "RemoveContainer" containerID="4c000b0f5df01d424f3e285ead31550cad573d57422786c9a5d9e86ae39043fb" Oct 14 13:37:33.964317 master-2 kubenswrapper[4762]: I1014 13:37:33.964123 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c000b0f5df01d424f3e285ead31550cad573d57422786c9a5d9e86ae39043fb"} err="failed to get container status \"4c000b0f5df01d424f3e285ead31550cad573d57422786c9a5d9e86ae39043fb\": rpc error: code = NotFound desc = could not find container \"4c000b0f5df01d424f3e285ead31550cad573d57422786c9a5d9e86ae39043fb\": container with ID starting with 4c000b0f5df01d424f3e285ead31550cad573d57422786c9a5d9e86ae39043fb not found: ID does not exist" Oct 14 13:37:33.964317 master-2 kubenswrapper[4762]: I1014 13:37:33.964178 4762 scope.go:117] "RemoveContainer" containerID="2b5b90dd8628826ed0b650eb953ef6222d90ffad0b46ace3798dcbe98218eaa9" Oct 14 13:37:33.964746 master-2 kubenswrapper[4762]: I1014 13:37:33.964717 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b5b90dd8628826ed0b650eb953ef6222d90ffad0b46ace3798dcbe98218eaa9"} err="failed to get container status \"2b5b90dd8628826ed0b650eb953ef6222d90ffad0b46ace3798dcbe98218eaa9\": rpc error: code = NotFound desc = could not find container \"2b5b90dd8628826ed0b650eb953ef6222d90ffad0b46ace3798dcbe98218eaa9\": container with ID starting with 2b5b90dd8628826ed0b650eb953ef6222d90ffad0b46ace3798dcbe98218eaa9 not found: ID does not exist" Oct 14 13:37:33.964746 master-2 kubenswrapper[4762]: I1014 13:37:33.964740 4762 scope.go:117] "RemoveContainer" containerID="84254745acf9fd64f6f181845e657957564db2cbf5ca52933f4618e0d0dd9383" Oct 14 13:37:33.986127 master-2 kubenswrapper[4762]: I1014 13:37:33.986087 4762 scope.go:117] "RemoveContainer" containerID="a8e56f2e7f8536d8c11eae78a35b2d13020574a622d3418aabf1fc48199d280c" Oct 14 13:37:34.977279 master-2 kubenswrapper[4762]: I1014 13:37:34.973105 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-46645-default-external-api-1"] Oct 14 13:37:34.977279 master-2 kubenswrapper[4762]: I1014 13:37:34.973408 4762 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openstack/glance-46645-default-external-api-1" podUID="bd1a658e-0a8c-416a-9624-f80c6fbacde7" containerName="glance-log" containerID="cri-o://4ba58aeb6edcffa7e6218d4ba3fcc7dbaa369b3d5caea858ea267cd28a0f198e" gracePeriod=30 Oct 14 13:37:34.977279 master-2 kubenswrapper[4762]: I1014 13:37:34.973571 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-46645-default-external-api-1" podUID="bd1a658e-0a8c-416a-9624-f80c6fbacde7" containerName="glance-httpd" containerID="cri-o://7ed44c687a7980777262a5379a608149b92db276eaf5e68065ae60ccb661da2a" gracePeriod=30 Oct 14 13:37:35.549126 master-2 kubenswrapper[4762]: I1014 13:37:35.549089 4762 scope.go:117] "RemoveContainer" containerID="8993a79cfedb844ceff8ab779453b078ad7c7447950e766745aeda08206b595f" Oct 14 13:37:35.893126 master-2 kubenswrapper[4762]: I1014 13:37:35.892978 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-c7795fc9c-45w5w" event={"ID":"3664024d-9ed9-48d5-9943-260774564949","Type":"ContainerStarted","Data":"7e529ac43607c7e230d80e3071b23a2259faae8e6173164aa8f0a1862c60cb1f"} Oct 14 13:37:35.893399 master-2 kubenswrapper[4762]: I1014 13:37:35.893347 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-neutron-agent-c7795fc9c-45w5w" Oct 14 13:37:35.896214 master-2 kubenswrapper[4762]: I1014 13:37:35.896178 4762 generic.go:334] "Generic (PLEG): container finished" podID="bd1a658e-0a8c-416a-9624-f80c6fbacde7" containerID="4ba58aeb6edcffa7e6218d4ba3fcc7dbaa369b3d5caea858ea267cd28a0f198e" exitCode=143 Oct 14 13:37:35.896392 master-2 kubenswrapper[4762]: I1014 13:37:35.896275 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-46645-default-external-api-1" event={"ID":"bd1a658e-0a8c-416a-9624-f80c6fbacde7","Type":"ContainerDied","Data":"4ba58aeb6edcffa7e6218d4ba3fcc7dbaa369b3d5caea858ea267cd28a0f198e"} Oct 14 13:37:36.680572 master-2 kubenswrapper[4762]: I1014 13:37:36.680427 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-46645-api-0"] Oct 14 13:37:36.840028 master-2 kubenswrapper[4762]: I1014 13:37:36.839936 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-46645-api-0"] Oct 14 13:37:37.558698 master-2 kubenswrapper[4762]: I1014 13:37:37.558588 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79e251e7-c9ad-4c0f-b502-5b39d7168ee2" path="/var/lib/kubelet/pods/79e251e7-c9ad-4c0f-b502-5b39d7168ee2/volumes" Oct 14 13:37:37.796326 master-2 kubenswrapper[4762]: I1014 13:37:37.796208 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:37:37.984360 master-2 kubenswrapper[4762]: I1014 13:37:37.984188 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-46645-api-0"] Oct 14 13:37:37.984674 master-2 kubenswrapper[4762]: E1014 13:37:37.984481 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5987f041-7e28-466c-aa62-8901381b8413" containerName="mariadb-database-create" Oct 14 13:37:37.984674 master-2 kubenswrapper[4762]: I1014 13:37:37.984494 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="5987f041-7e28-466c-aa62-8901381b8413" containerName="mariadb-database-create" Oct 14 13:37:37.984674 master-2 kubenswrapper[4762]: E1014 13:37:37.984507 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72401abc-aeab-47ed-98d0-15a765c5fb91" containerName="neutron-httpd" Oct 14 13:37:37.984674 master-2 
kubenswrapper[4762]: I1014 13:37:37.984513 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="72401abc-aeab-47ed-98d0-15a765c5fb91" containerName="neutron-httpd" Oct 14 13:37:37.984674 master-2 kubenswrapper[4762]: E1014 13:37:37.984525 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be5e0faf-632d-4ec4-b2b9-725e14196910" containerName="sg-core" Oct 14 13:37:37.984674 master-2 kubenswrapper[4762]: I1014 13:37:37.984531 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="be5e0faf-632d-4ec4-b2b9-725e14196910" containerName="sg-core" Oct 14 13:37:37.984674 master-2 kubenswrapper[4762]: E1014 13:37:37.984543 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be5e0faf-632d-4ec4-b2b9-725e14196910" containerName="ceilometer-central-agent" Oct 14 13:37:37.984674 master-2 kubenswrapper[4762]: I1014 13:37:37.984550 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="be5e0faf-632d-4ec4-b2b9-725e14196910" containerName="ceilometer-central-agent" Oct 14 13:37:37.984674 master-2 kubenswrapper[4762]: E1014 13:37:37.984561 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79e251e7-c9ad-4c0f-b502-5b39d7168ee2" containerName="cinder-46645-api-log" Oct 14 13:37:37.984674 master-2 kubenswrapper[4762]: I1014 13:37:37.984566 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="79e251e7-c9ad-4c0f-b502-5b39d7168ee2" containerName="cinder-46645-api-log" Oct 14 13:37:37.984674 master-2 kubenswrapper[4762]: E1014 13:37:37.984580 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be5e0faf-632d-4ec4-b2b9-725e14196910" containerName="proxy-httpd" Oct 14 13:37:37.984674 master-2 kubenswrapper[4762]: I1014 13:37:37.984585 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="be5e0faf-632d-4ec4-b2b9-725e14196910" containerName="proxy-httpd" Oct 14 13:37:37.984674 master-2 kubenswrapper[4762]: E1014 13:37:37.984595 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79e251e7-c9ad-4c0f-b502-5b39d7168ee2" containerName="cinder-api" Oct 14 13:37:37.984674 master-2 kubenswrapper[4762]: I1014 13:37:37.984600 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="79e251e7-c9ad-4c0f-b502-5b39d7168ee2" containerName="cinder-api" Oct 14 13:37:37.984674 master-2 kubenswrapper[4762]: E1014 13:37:37.984610 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72401abc-aeab-47ed-98d0-15a765c5fb91" containerName="neutron-api" Oct 14 13:37:37.984674 master-2 kubenswrapper[4762]: I1014 13:37:37.984615 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="72401abc-aeab-47ed-98d0-15a765c5fb91" containerName="neutron-api" Oct 14 13:37:37.984674 master-2 kubenswrapper[4762]: E1014 13:37:37.984626 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f99144b-486d-4a4b-8b68-294fd70a0f49" containerName="heat-cfnapi" Oct 14 13:37:37.984674 master-2 kubenswrapper[4762]: I1014 13:37:37.984632 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f99144b-486d-4a4b-8b68-294fd70a0f49" containerName="heat-cfnapi" Oct 14 13:37:37.984674 master-2 kubenswrapper[4762]: E1014 13:37:37.984641 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be5e0faf-632d-4ec4-b2b9-725e14196910" containerName="ceilometer-notification-agent" Oct 14 13:37:37.984674 master-2 kubenswrapper[4762]: I1014 13:37:37.984647 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="be5e0faf-632d-4ec4-b2b9-725e14196910" containerName="ceilometer-notification-agent" Oct 14 
13:37:37.984674 master-2 kubenswrapper[4762]: E1014 13:37:37.984662 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f99144b-486d-4a4b-8b68-294fd70a0f49" containerName="heat-cfnapi" Oct 14 13:37:37.984674 master-2 kubenswrapper[4762]: I1014 13:37:37.984668 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f99144b-486d-4a4b-8b68-294fd70a0f49" containerName="heat-cfnapi" Oct 14 13:37:37.995874 master-2 kubenswrapper[4762]: I1014 13:37:37.984776 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="72401abc-aeab-47ed-98d0-15a765c5fb91" containerName="neutron-api" Oct 14 13:37:37.995874 master-2 kubenswrapper[4762]: I1014 13:37:37.984791 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="be5e0faf-632d-4ec4-b2b9-725e14196910" containerName="ceilometer-notification-agent" Oct 14 13:37:37.995874 master-2 kubenswrapper[4762]: I1014 13:37:37.984803 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="be5e0faf-632d-4ec4-b2b9-725e14196910" containerName="ceilometer-central-agent" Oct 14 13:37:37.995874 master-2 kubenswrapper[4762]: I1014 13:37:37.984810 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="72401abc-aeab-47ed-98d0-15a765c5fb91" containerName="neutron-httpd" Oct 14 13:37:37.995874 master-2 kubenswrapper[4762]: I1014 13:37:37.984818 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="79e251e7-c9ad-4c0f-b502-5b39d7168ee2" containerName="cinder-api" Oct 14 13:37:37.995874 master-2 kubenswrapper[4762]: I1014 13:37:37.984828 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="79e251e7-c9ad-4c0f-b502-5b39d7168ee2" containerName="cinder-46645-api-log" Oct 14 13:37:37.995874 master-2 kubenswrapper[4762]: I1014 13:37:37.984837 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f99144b-486d-4a4b-8b68-294fd70a0f49" containerName="heat-cfnapi" Oct 14 13:37:37.995874 master-2 kubenswrapper[4762]: I1014 13:37:37.984845 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="be5e0faf-632d-4ec4-b2b9-725e14196910" containerName="sg-core" Oct 14 13:37:37.995874 master-2 kubenswrapper[4762]: I1014 13:37:37.984852 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="5987f041-7e28-466c-aa62-8901381b8413" containerName="mariadb-database-create" Oct 14 13:37:37.995874 master-2 kubenswrapper[4762]: I1014 13:37:37.984861 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f99144b-486d-4a4b-8b68-294fd70a0f49" containerName="heat-cfnapi" Oct 14 13:37:37.995874 master-2 kubenswrapper[4762]: I1014 13:37:37.984870 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="be5e0faf-632d-4ec4-b2b9-725e14196910" containerName="proxy-httpd" Oct 14 13:37:37.995874 master-2 kubenswrapper[4762]: I1014 13:37:37.985835 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-46645-api-0" Oct 14 13:37:37.995874 master-2 kubenswrapper[4762]: I1014 13:37:37.988694 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-46645-api-config-data" Oct 14 13:37:37.995874 master-2 kubenswrapper[4762]: I1014 13:37:37.988762 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-46645-config-data" Oct 14 13:37:37.995874 master-2 kubenswrapper[4762]: I1014 13:37:37.988934 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 14 13:37:37.995874 master-2 kubenswrapper[4762]: I1014 13:37:37.989068 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 14 13:37:37.995874 master-2 kubenswrapper[4762]: I1014 13:37:37.989484 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-46645-scripts" Oct 14 13:37:38.054333 master-2 kubenswrapper[4762]: I1014 13:37:38.050235 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a991008-7930-4790-87a3-300d77e9186a-logs\") pod \"cinder-46645-api-0\" (UID: \"8a991008-7930-4790-87a3-300d77e9186a\") " pod="openstack/cinder-46645-api-0" Oct 14 13:37:38.054333 master-2 kubenswrapper[4762]: I1014 13:37:38.050433 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a991008-7930-4790-87a3-300d77e9186a-scripts\") pod \"cinder-46645-api-0\" (UID: \"8a991008-7930-4790-87a3-300d77e9186a\") " pod="openstack/cinder-46645-api-0" Oct 14 13:37:38.054333 master-2 kubenswrapper[4762]: I1014 13:37:38.050514 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a991008-7930-4790-87a3-300d77e9186a-public-tls-certs\") pod \"cinder-46645-api-0\" (UID: \"8a991008-7930-4790-87a3-300d77e9186a\") " pod="openstack/cinder-46645-api-0" Oct 14 13:37:38.054333 master-2 kubenswrapper[4762]: I1014 13:37:38.050739 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a991008-7930-4790-87a3-300d77e9186a-combined-ca-bundle\") pod \"cinder-46645-api-0\" (UID: \"8a991008-7930-4790-87a3-300d77e9186a\") " pod="openstack/cinder-46645-api-0" Oct 14 13:37:38.054333 master-2 kubenswrapper[4762]: I1014 13:37:38.050857 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8a991008-7930-4790-87a3-300d77e9186a-etc-machine-id\") pod \"cinder-46645-api-0\" (UID: \"8a991008-7930-4790-87a3-300d77e9186a\") " pod="openstack/cinder-46645-api-0" Oct 14 13:37:38.054333 master-2 kubenswrapper[4762]: I1014 13:37:38.050920 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a991008-7930-4790-87a3-300d77e9186a-config-data\") pod \"cinder-46645-api-0\" (UID: \"8a991008-7930-4790-87a3-300d77e9186a\") " pod="openstack/cinder-46645-api-0" Oct 14 13:37:38.054333 master-2 kubenswrapper[4762]: I1014 13:37:38.050993 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffd2x\" (UniqueName: 
\"kubernetes.io/projected/8a991008-7930-4790-87a3-300d77e9186a-kube-api-access-ffd2x\") pod \"cinder-46645-api-0\" (UID: \"8a991008-7930-4790-87a3-300d77e9186a\") " pod="openstack/cinder-46645-api-0" Oct 14 13:37:38.054333 master-2 kubenswrapper[4762]: I1014 13:37:38.051096 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a991008-7930-4790-87a3-300d77e9186a-config-data-custom\") pod \"cinder-46645-api-0\" (UID: \"8a991008-7930-4790-87a3-300d77e9186a\") " pod="openstack/cinder-46645-api-0" Oct 14 13:37:38.054333 master-2 kubenswrapper[4762]: I1014 13:37:38.051145 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a991008-7930-4790-87a3-300d77e9186a-internal-tls-certs\") pod \"cinder-46645-api-0\" (UID: \"8a991008-7930-4790-87a3-300d77e9186a\") " pod="openstack/cinder-46645-api-0" Oct 14 13:37:38.153697 master-2 kubenswrapper[4762]: I1014 13:37:38.153632 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a991008-7930-4790-87a3-300d77e9186a-scripts\") pod \"cinder-46645-api-0\" (UID: \"8a991008-7930-4790-87a3-300d77e9186a\") " pod="openstack/cinder-46645-api-0" Oct 14 13:37:38.153925 master-2 kubenswrapper[4762]: I1014 13:37:38.153708 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a991008-7930-4790-87a3-300d77e9186a-public-tls-certs\") pod \"cinder-46645-api-0\" (UID: \"8a991008-7930-4790-87a3-300d77e9186a\") " pod="openstack/cinder-46645-api-0" Oct 14 13:37:38.153925 master-2 kubenswrapper[4762]: I1014 13:37:38.153777 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a991008-7930-4790-87a3-300d77e9186a-combined-ca-bundle\") pod \"cinder-46645-api-0\" (UID: \"8a991008-7930-4790-87a3-300d77e9186a\") " pod="openstack/cinder-46645-api-0" Oct 14 13:37:38.153925 master-2 kubenswrapper[4762]: I1014 13:37:38.153816 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8a991008-7930-4790-87a3-300d77e9186a-etc-machine-id\") pod \"cinder-46645-api-0\" (UID: \"8a991008-7930-4790-87a3-300d77e9186a\") " pod="openstack/cinder-46645-api-0" Oct 14 13:37:38.153925 master-2 kubenswrapper[4762]: I1014 13:37:38.153849 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a991008-7930-4790-87a3-300d77e9186a-config-data\") pod \"cinder-46645-api-0\" (UID: \"8a991008-7930-4790-87a3-300d77e9186a\") " pod="openstack/cinder-46645-api-0" Oct 14 13:37:38.153925 master-2 kubenswrapper[4762]: I1014 13:37:38.153877 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffd2x\" (UniqueName: \"kubernetes.io/projected/8a991008-7930-4790-87a3-300d77e9186a-kube-api-access-ffd2x\") pod \"cinder-46645-api-0\" (UID: \"8a991008-7930-4790-87a3-300d77e9186a\") " pod="openstack/cinder-46645-api-0" Oct 14 13:37:38.154085 master-2 kubenswrapper[4762]: I1014 13:37:38.153924 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/8a991008-7930-4790-87a3-300d77e9186a-config-data-custom\") pod \"cinder-46645-api-0\" (UID: \"8a991008-7930-4790-87a3-300d77e9186a\") " pod="openstack/cinder-46645-api-0" Oct 14 13:37:38.154085 master-2 kubenswrapper[4762]: I1014 13:37:38.153950 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a991008-7930-4790-87a3-300d77e9186a-internal-tls-certs\") pod \"cinder-46645-api-0\" (UID: \"8a991008-7930-4790-87a3-300d77e9186a\") " pod="openstack/cinder-46645-api-0" Oct 14 13:37:38.154085 master-2 kubenswrapper[4762]: I1014 13:37:38.153981 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8a991008-7930-4790-87a3-300d77e9186a-etc-machine-id\") pod \"cinder-46645-api-0\" (UID: \"8a991008-7930-4790-87a3-300d77e9186a\") " pod="openstack/cinder-46645-api-0" Oct 14 13:37:38.154085 master-2 kubenswrapper[4762]: I1014 13:37:38.153997 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a991008-7930-4790-87a3-300d77e9186a-logs\") pod \"cinder-46645-api-0\" (UID: \"8a991008-7930-4790-87a3-300d77e9186a\") " pod="openstack/cinder-46645-api-0" Oct 14 13:37:38.154653 master-2 kubenswrapper[4762]: I1014 13:37:38.154597 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a991008-7930-4790-87a3-300d77e9186a-logs\") pod \"cinder-46645-api-0\" (UID: \"8a991008-7930-4790-87a3-300d77e9186a\") " pod="openstack/cinder-46645-api-0" Oct 14 13:37:38.157620 master-2 kubenswrapper[4762]: I1014 13:37:38.157584 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a991008-7930-4790-87a3-300d77e9186a-config-data\") pod \"cinder-46645-api-0\" (UID: \"8a991008-7930-4790-87a3-300d77e9186a\") " pod="openstack/cinder-46645-api-0" Oct 14 13:37:38.157772 master-2 kubenswrapper[4762]: I1014 13:37:38.157719 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a991008-7930-4790-87a3-300d77e9186a-public-tls-certs\") pod \"cinder-46645-api-0\" (UID: \"8a991008-7930-4790-87a3-300d77e9186a\") " pod="openstack/cinder-46645-api-0" Oct 14 13:37:38.158299 master-2 kubenswrapper[4762]: I1014 13:37:38.158278 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a991008-7930-4790-87a3-300d77e9186a-combined-ca-bundle\") pod \"cinder-46645-api-0\" (UID: \"8a991008-7930-4790-87a3-300d77e9186a\") " pod="openstack/cinder-46645-api-0" Oct 14 13:37:38.158794 master-2 kubenswrapper[4762]: I1014 13:37:38.158713 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a991008-7930-4790-87a3-300d77e9186a-config-data-custom\") pod \"cinder-46645-api-0\" (UID: \"8a991008-7930-4790-87a3-300d77e9186a\") " pod="openstack/cinder-46645-api-0" Oct 14 13:37:38.158973 master-2 kubenswrapper[4762]: I1014 13:37:38.158945 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a991008-7930-4790-87a3-300d77e9186a-internal-tls-certs\") pod \"cinder-46645-api-0\" (UID: \"8a991008-7930-4790-87a3-300d77e9186a\") " pod="openstack/cinder-46645-api-0" Oct 14 
13:37:38.160759 master-2 kubenswrapper[4762]: I1014 13:37:38.160707 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a991008-7930-4790-87a3-300d77e9186a-scripts\") pod \"cinder-46645-api-0\" (UID: \"8a991008-7930-4790-87a3-300d77e9186a\") " pod="openstack/cinder-46645-api-0" Oct 14 13:37:38.803746 master-2 kubenswrapper[4762]: I1014 13:37:38.803673 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-551b-account-create-2jk4f"] Oct 14 13:37:38.805536 master-2 kubenswrapper[4762]: I1014 13:37:38.805504 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-551b-account-create-2jk4f" Oct 14 13:37:38.811108 master-2 kubenswrapper[4762]: I1014 13:37:38.811061 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 14 13:37:38.812565 master-2 kubenswrapper[4762]: I1014 13:37:38.812515 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-46645-api-0"] Oct 14 13:37:38.856309 master-2 kubenswrapper[4762]: I1014 13:37:38.856117 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:37:38.865179 master-2 kubenswrapper[4762]: I1014 13:37:38.864367 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-551b-account-create-2jk4f"] Oct 14 13:37:38.867004 master-2 kubenswrapper[4762]: I1014 13:37:38.866947 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t86kz\" (UniqueName: \"kubernetes.io/projected/6ada32c3-34c6-47a3-858e-7e92754e719b-kube-api-access-t86kz\") pod \"nova-api-551b-account-create-2jk4f\" (UID: \"6ada32c3-34c6-47a3-858e-7e92754e719b\") " pod="openstack/nova-api-551b-account-create-2jk4f" Oct 14 13:37:38.934416 master-2 kubenswrapper[4762]: I1014 13:37:38.934366 4762 generic.go:334] "Generic (PLEG): container finished" podID="bd1a658e-0a8c-416a-9624-f80c6fbacde7" containerID="7ed44c687a7980777262a5379a608149b92db276eaf5e68065ae60ccb661da2a" exitCode=0 Oct 14 13:37:38.934633 master-2 kubenswrapper[4762]: I1014 13:37:38.934414 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-46645-default-external-api-1" event={"ID":"bd1a658e-0a8c-416a-9624-f80c6fbacde7","Type":"ContainerDied","Data":"7ed44c687a7980777262a5379a608149b92db276eaf5e68065ae60ccb661da2a"} Oct 14 13:37:38.968918 master-2 kubenswrapper[4762]: I1014 13:37:38.968826 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t86kz\" (UniqueName: \"kubernetes.io/projected/6ada32c3-34c6-47a3-858e-7e92754e719b-kube-api-access-t86kz\") pod \"nova-api-551b-account-create-2jk4f\" (UID: \"6ada32c3-34c6-47a3-858e-7e92754e719b\") " pod="openstack/nova-api-551b-account-create-2jk4f" Oct 14 13:37:39.001148 master-2 kubenswrapper[4762]: I1014 13:37:39.001069 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t86kz\" (UniqueName: \"kubernetes.io/projected/6ada32c3-34c6-47a3-858e-7e92754e719b-kube-api-access-t86kz\") pod \"nova-api-551b-account-create-2jk4f\" (UID: \"6ada32c3-34c6-47a3-858e-7e92754e719b\") " pod="openstack/nova-api-551b-account-create-2jk4f" Oct 14 13:37:39.017855 master-2 kubenswrapper[4762]: I1014 13:37:39.017799 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-46645-default-external-api-1" podUID="bd1a658e-0a8c-416a-9624-f80c6fbacde7" 
containerName="glance-log" probeResult="failure" output="Get \"http://10.129.0.136:9292/healthcheck\": dial tcp 10.129.0.136:9292: connect: connection refused" Oct 14 13:37:39.018075 master-2 kubenswrapper[4762]: I1014 13:37:39.017800 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-46645-default-external-api-1" podUID="bd1a658e-0a8c-416a-9624-f80c6fbacde7" containerName="glance-httpd" probeResult="failure" output="Get \"http://10.129.0.136:9292/healthcheck\": dial tcp 10.129.0.136:9292: connect: connection refused" Oct 14 13:37:39.132531 master-2 kubenswrapper[4762]: I1014 13:37:39.132408 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-551b-account-create-2jk4f" Oct 14 13:37:39.320504 master-2 kubenswrapper[4762]: I1014 13:37:39.319959 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:37:39.327118 master-2 kubenswrapper[4762]: I1014 13:37:39.325106 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:37:39.331627 master-2 kubenswrapper[4762]: I1014 13:37:39.331013 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 14 13:37:39.332591 master-2 kubenswrapper[4762]: I1014 13:37:39.332431 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 14 13:37:39.344560 master-2 kubenswrapper[4762]: I1014 13:37:39.343450 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:37:39.378817 master-2 kubenswrapper[4762]: I1014 13:37:39.378717 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3590b6a1-d0c1-443b-9dcf-6c5130450a96-scripts\") pod \"ceilometer-0\" (UID: \"3590b6a1-d0c1-443b-9dcf-6c5130450a96\") " pod="openstack/ceilometer-0" Oct 14 13:37:39.379206 master-2 kubenswrapper[4762]: I1014 13:37:39.378936 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3590b6a1-d0c1-443b-9dcf-6c5130450a96-config-data\") pod \"ceilometer-0\" (UID: \"3590b6a1-d0c1-443b-9dcf-6c5130450a96\") " pod="openstack/ceilometer-0" Oct 14 13:37:39.379206 master-2 kubenswrapper[4762]: I1014 13:37:39.379081 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3590b6a1-d0c1-443b-9dcf-6c5130450a96-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3590b6a1-d0c1-443b-9dcf-6c5130450a96\") " pod="openstack/ceilometer-0" Oct 14 13:37:39.379318 master-2 kubenswrapper[4762]: I1014 13:37:39.379243 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3590b6a1-d0c1-443b-9dcf-6c5130450a96-run-httpd\") pod \"ceilometer-0\" (UID: \"3590b6a1-d0c1-443b-9dcf-6c5130450a96\") " pod="openstack/ceilometer-0" Oct 14 13:37:39.379427 master-2 kubenswrapper[4762]: I1014 13:37:39.379389 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3590b6a1-d0c1-443b-9dcf-6c5130450a96-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3590b6a1-d0c1-443b-9dcf-6c5130450a96\") " pod="openstack/ceilometer-0" Oct 14 13:37:39.379661 master-2 
kubenswrapper[4762]: I1014 13:37:39.379598 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3590b6a1-d0c1-443b-9dcf-6c5130450a96-log-httpd\") pod \"ceilometer-0\" (UID: \"3590b6a1-d0c1-443b-9dcf-6c5130450a96\") " pod="openstack/ceilometer-0" Oct 14 13:37:39.379736 master-2 kubenswrapper[4762]: I1014 13:37:39.379676 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx2jf\" (UniqueName: \"kubernetes.io/projected/3590b6a1-d0c1-443b-9dcf-6c5130450a96-kube-api-access-qx2jf\") pod \"ceilometer-0\" (UID: \"3590b6a1-d0c1-443b-9dcf-6c5130450a96\") " pod="openstack/ceilometer-0" Oct 14 13:37:39.482586 master-2 kubenswrapper[4762]: I1014 13:37:39.482492 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3590b6a1-d0c1-443b-9dcf-6c5130450a96-log-httpd\") pod \"ceilometer-0\" (UID: \"3590b6a1-d0c1-443b-9dcf-6c5130450a96\") " pod="openstack/ceilometer-0" Oct 14 13:37:39.482586 master-2 kubenswrapper[4762]: I1014 13:37:39.482596 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qx2jf\" (UniqueName: \"kubernetes.io/projected/3590b6a1-d0c1-443b-9dcf-6c5130450a96-kube-api-access-qx2jf\") pod \"ceilometer-0\" (UID: \"3590b6a1-d0c1-443b-9dcf-6c5130450a96\") " pod="openstack/ceilometer-0" Oct 14 13:37:39.482982 master-2 kubenswrapper[4762]: I1014 13:37:39.482680 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3590b6a1-d0c1-443b-9dcf-6c5130450a96-scripts\") pod \"ceilometer-0\" (UID: \"3590b6a1-d0c1-443b-9dcf-6c5130450a96\") " pod="openstack/ceilometer-0" Oct 14 13:37:39.482982 master-2 kubenswrapper[4762]: I1014 13:37:39.482706 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3590b6a1-d0c1-443b-9dcf-6c5130450a96-config-data\") pod \"ceilometer-0\" (UID: \"3590b6a1-d0c1-443b-9dcf-6c5130450a96\") " pod="openstack/ceilometer-0" Oct 14 13:37:39.482982 master-2 kubenswrapper[4762]: I1014 13:37:39.482733 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3590b6a1-d0c1-443b-9dcf-6c5130450a96-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3590b6a1-d0c1-443b-9dcf-6c5130450a96\") " pod="openstack/ceilometer-0" Oct 14 13:37:39.482982 master-2 kubenswrapper[4762]: I1014 13:37:39.482767 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3590b6a1-d0c1-443b-9dcf-6c5130450a96-run-httpd\") pod \"ceilometer-0\" (UID: \"3590b6a1-d0c1-443b-9dcf-6c5130450a96\") " pod="openstack/ceilometer-0" Oct 14 13:37:39.482982 master-2 kubenswrapper[4762]: I1014 13:37:39.482809 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3590b6a1-d0c1-443b-9dcf-6c5130450a96-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3590b6a1-d0c1-443b-9dcf-6c5130450a96\") " pod="openstack/ceilometer-0" Oct 14 13:37:39.483849 master-2 kubenswrapper[4762]: I1014 13:37:39.483816 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/3590b6a1-d0c1-443b-9dcf-6c5130450a96-run-httpd\") pod \"ceilometer-0\" (UID: \"3590b6a1-d0c1-443b-9dcf-6c5130450a96\") " pod="openstack/ceilometer-0" Oct 14 13:37:39.484359 master-2 kubenswrapper[4762]: I1014 13:37:39.484291 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3590b6a1-d0c1-443b-9dcf-6c5130450a96-log-httpd\") pod \"ceilometer-0\" (UID: \"3590b6a1-d0c1-443b-9dcf-6c5130450a96\") " pod="openstack/ceilometer-0" Oct 14 13:37:39.488241 master-2 kubenswrapper[4762]: I1014 13:37:39.488019 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3590b6a1-d0c1-443b-9dcf-6c5130450a96-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3590b6a1-d0c1-443b-9dcf-6c5130450a96\") " pod="openstack/ceilometer-0" Oct 14 13:37:39.488241 master-2 kubenswrapper[4762]: I1014 13:37:39.488053 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3590b6a1-d0c1-443b-9dcf-6c5130450a96-scripts\") pod \"ceilometer-0\" (UID: \"3590b6a1-d0c1-443b-9dcf-6c5130450a96\") " pod="openstack/ceilometer-0" Oct 14 13:37:39.491624 master-2 kubenswrapper[4762]: I1014 13:37:39.491539 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3590b6a1-d0c1-443b-9dcf-6c5130450a96-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3590b6a1-d0c1-443b-9dcf-6c5130450a96\") " pod="openstack/ceilometer-0" Oct 14 13:37:39.492178 master-2 kubenswrapper[4762]: I1014 13:37:39.492046 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3590b6a1-d0c1-443b-9dcf-6c5130450a96-config-data\") pod \"ceilometer-0\" (UID: \"3590b6a1-d0c1-443b-9dcf-6c5130450a96\") " pod="openstack/ceilometer-0" Oct 14 13:37:39.508052 master-2 kubenswrapper[4762]: I1014 13:37:39.508003 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx2jf\" (UniqueName: \"kubernetes.io/projected/3590b6a1-d0c1-443b-9dcf-6c5130450a96-kube-api-access-qx2jf\") pod \"ceilometer-0\" (UID: \"3590b6a1-d0c1-443b-9dcf-6c5130450a96\") " pod="openstack/ceilometer-0" Oct 14 13:37:39.568834 master-2 kubenswrapper[4762]: I1014 13:37:39.568751 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be5e0faf-632d-4ec4-b2b9-725e14196910" path="/var/lib/kubelet/pods/be5e0faf-632d-4ec4-b2b9-725e14196910/volumes" Oct 14 13:37:39.648084 master-2 kubenswrapper[4762]: I1014 13:37:39.648001 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:37:39.800431 master-2 kubenswrapper[4762]: I1014 13:37:39.800369 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffd2x\" (UniqueName: \"kubernetes.io/projected/8a991008-7930-4790-87a3-300d77e9186a-kube-api-access-ffd2x\") pod \"cinder-46645-api-0\" (UID: \"8a991008-7930-4790-87a3-300d77e9186a\") " pod="openstack/cinder-46645-api-0" Oct 14 13:37:39.815240 master-2 kubenswrapper[4762]: I1014 13:37:39.815102 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-46645-api-0" Oct 14 13:37:40.255084 master-2 kubenswrapper[4762]: I1014 13:37:40.255022 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-551b-account-create-2jk4f"] Oct 14 13:37:40.489706 master-2 kubenswrapper[4762]: I1014 13:37:40.489619 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-46645-api-0"] Oct 14 13:37:40.527294 master-2 kubenswrapper[4762]: W1014 13:37:40.527243 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a991008_7930_4790_87a3_300d77e9186a.slice/crio-a52ed27a25932ad6265f33fb30c26b186b089b6f11466ad3cf91bc7cdf4d161c WatchSource:0}: Error finding container a52ed27a25932ad6265f33fb30c26b186b089b6f11466ad3cf91bc7cdf4d161c: Status 404 returned error can't find the container with id a52ed27a25932ad6265f33fb30c26b186b089b6f11466ad3cf91bc7cdf4d161c Oct 14 13:37:40.687516 master-2 kubenswrapper[4762]: I1014 13:37:40.687428 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:37:40.822284 master-2 kubenswrapper[4762]: I1014 13:37:40.822239 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-46645-default-external-api-1" Oct 14 13:37:40.909632 master-2 kubenswrapper[4762]: I1014 13:37:40.909563 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/topolvm.io^ae0639de-ce3d-4a5d-9e93-e76a0a1363d1\") pod \"bd1a658e-0a8c-416a-9624-f80c6fbacde7\" (UID: \"bd1a658e-0a8c-416a-9624-f80c6fbacde7\") " Oct 14 13:37:40.909632 master-2 kubenswrapper[4762]: I1014 13:37:40.909614 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6lmt\" (UniqueName: \"kubernetes.io/projected/bd1a658e-0a8c-416a-9624-f80c6fbacde7-kube-api-access-b6lmt\") pod \"bd1a658e-0a8c-416a-9624-f80c6fbacde7\" (UID: \"bd1a658e-0a8c-416a-9624-f80c6fbacde7\") " Oct 14 13:37:40.909917 master-2 kubenswrapper[4762]: I1014 13:37:40.909694 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd1a658e-0a8c-416a-9624-f80c6fbacde7-combined-ca-bundle\") pod \"bd1a658e-0a8c-416a-9624-f80c6fbacde7\" (UID: \"bd1a658e-0a8c-416a-9624-f80c6fbacde7\") " Oct 14 13:37:40.909917 master-2 kubenswrapper[4762]: I1014 13:37:40.909748 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd1a658e-0a8c-416a-9624-f80c6fbacde7-scripts\") pod \"bd1a658e-0a8c-416a-9624-f80c6fbacde7\" (UID: \"bd1a658e-0a8c-416a-9624-f80c6fbacde7\") " Oct 14 13:37:40.909917 master-2 kubenswrapper[4762]: I1014 13:37:40.909826 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd1a658e-0a8c-416a-9624-f80c6fbacde7-logs\") pod \"bd1a658e-0a8c-416a-9624-f80c6fbacde7\" (UID: \"bd1a658e-0a8c-416a-9624-f80c6fbacde7\") " Oct 14 13:37:40.909917 master-2 kubenswrapper[4762]: I1014 13:37:40.909847 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bd1a658e-0a8c-416a-9624-f80c6fbacde7-httpd-run\") pod \"bd1a658e-0a8c-416a-9624-f80c6fbacde7\" (UID: \"bd1a658e-0a8c-416a-9624-f80c6fbacde7\") " Oct 14 13:37:40.909917 master-2 kubenswrapper[4762]: I1014 13:37:40.909902 
4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd1a658e-0a8c-416a-9624-f80c6fbacde7-config-data\") pod \"bd1a658e-0a8c-416a-9624-f80c6fbacde7\" (UID: \"bd1a658e-0a8c-416a-9624-f80c6fbacde7\") " Oct 14 13:37:40.910812 master-2 kubenswrapper[4762]: I1014 13:37:40.910752 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd1a658e-0a8c-416a-9624-f80c6fbacde7-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "bd1a658e-0a8c-416a-9624-f80c6fbacde7" (UID: "bd1a658e-0a8c-416a-9624-f80c6fbacde7"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:37:40.910812 master-2 kubenswrapper[4762]: I1014 13:37:40.910770 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd1a658e-0a8c-416a-9624-f80c6fbacde7-logs" (OuterVolumeSpecName: "logs") pod "bd1a658e-0a8c-416a-9624-f80c6fbacde7" (UID: "bd1a658e-0a8c-416a-9624-f80c6fbacde7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:37:40.914345 master-2 kubenswrapper[4762]: I1014 13:37:40.914311 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd1a658e-0a8c-416a-9624-f80c6fbacde7-scripts" (OuterVolumeSpecName: "scripts") pod "bd1a658e-0a8c-416a-9624-f80c6fbacde7" (UID: "bd1a658e-0a8c-416a-9624-f80c6fbacde7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:37:40.915098 master-2 kubenswrapper[4762]: I1014 13:37:40.915044 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd1a658e-0a8c-416a-9624-f80c6fbacde7-kube-api-access-b6lmt" (OuterVolumeSpecName: "kube-api-access-b6lmt") pod "bd1a658e-0a8c-416a-9624-f80c6fbacde7" (UID: "bd1a658e-0a8c-416a-9624-f80c6fbacde7"). InnerVolumeSpecName "kube-api-access-b6lmt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:37:40.933924 master-2 kubenswrapper[4762]: I1014 13:37:40.933822 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd1a658e-0a8c-416a-9624-f80c6fbacde7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bd1a658e-0a8c-416a-9624-f80c6fbacde7" (UID: "bd1a658e-0a8c-416a-9624-f80c6fbacde7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:37:40.934872 master-2 kubenswrapper[4762]: I1014 13:37:40.934817 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/topolvm.io^ae0639de-ce3d-4a5d-9e93-e76a0a1363d1" (OuterVolumeSpecName: "glance") pod "bd1a658e-0a8c-416a-9624-f80c6fbacde7" (UID: "bd1a658e-0a8c-416a-9624-f80c6fbacde7"). InnerVolumeSpecName "pvc-a7796025-99aa-4d01-b91a-fcce6dec3c73". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 14 13:37:40.945571 master-2 kubenswrapper[4762]: I1014 13:37:40.945521 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd1a658e-0a8c-416a-9624-f80c6fbacde7-config-data" (OuterVolumeSpecName: "config-data") pod "bd1a658e-0a8c-416a-9624-f80c6fbacde7" (UID: "bd1a658e-0a8c-416a-9624-f80c6fbacde7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:37:40.951302 master-2 kubenswrapper[4762]: I1014 13:37:40.951244 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-46645-api-0" event={"ID":"8a991008-7930-4790-87a3-300d77e9186a","Type":"ContainerStarted","Data":"a52ed27a25932ad6265f33fb30c26b186b089b6f11466ad3cf91bc7cdf4d161c"} Oct 14 13:37:40.953527 master-2 kubenswrapper[4762]: I1014 13:37:40.953491 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-551b-account-create-2jk4f" event={"ID":"6ada32c3-34c6-47a3-858e-7e92754e719b","Type":"ContainerStarted","Data":"2e81ae49f5c80e146293c83b36f208f31b6ffb3da4ca7fe8b58aa1fb9d5fd979"} Oct 14 13:37:40.953568 master-2 kubenswrapper[4762]: I1014 13:37:40.953527 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-551b-account-create-2jk4f" event={"ID":"6ada32c3-34c6-47a3-858e-7e92754e719b","Type":"ContainerStarted","Data":"154c25d80192102627558c15bf7e0e32cc5d051f20ea56180aceff1191b99b09"} Oct 14 13:37:40.956118 master-2 kubenswrapper[4762]: I1014 13:37:40.956086 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-46645-default-external-api-1" event={"ID":"bd1a658e-0a8c-416a-9624-f80c6fbacde7","Type":"ContainerDied","Data":"a370eba9411b5d8bb312cfd8f3a4801fff3be55042bf68f488bf338a5164dcd0"} Oct 14 13:37:40.956176 master-2 kubenswrapper[4762]: I1014 13:37:40.956127 4762 scope.go:117] "RemoveContainer" containerID="7ed44c687a7980777262a5379a608149b92db276eaf5e68065ae60ccb661da2a" Oct 14 13:37:40.956310 master-2 kubenswrapper[4762]: I1014 13:37:40.956283 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-46645-default-external-api-1" Oct 14 13:37:40.980102 master-2 kubenswrapper[4762]: I1014 13:37:40.980053 4762 scope.go:117] "RemoveContainer" containerID="4ba58aeb6edcffa7e6218d4ba3fcc7dbaa369b3d5caea858ea267cd28a0f198e" Oct 14 13:37:41.011647 master-2 kubenswrapper[4762]: I1014 13:37:41.011583 4762 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-a7796025-99aa-4d01-b91a-fcce6dec3c73\" (UniqueName: \"kubernetes.io/csi/topolvm.io^ae0639de-ce3d-4a5d-9e93-e76a0a1363d1\") on node \"master-2\" " Oct 14 13:37:41.011647 master-2 kubenswrapper[4762]: I1014 13:37:41.011633 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6lmt\" (UniqueName: \"kubernetes.io/projected/bd1a658e-0a8c-416a-9624-f80c6fbacde7-kube-api-access-b6lmt\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:41.011647 master-2 kubenswrapper[4762]: I1014 13:37:41.011644 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd1a658e-0a8c-416a-9624-f80c6fbacde7-combined-ca-bundle\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:41.011647 master-2 kubenswrapper[4762]: I1014 13:37:41.011654 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd1a658e-0a8c-416a-9624-f80c6fbacde7-scripts\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:41.011647 master-2 kubenswrapper[4762]: I1014 13:37:41.011663 4762 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd1a658e-0a8c-416a-9624-f80c6fbacde7-logs\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:41.012011 master-2 kubenswrapper[4762]: I1014 13:37:41.011674 4762 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/bd1a658e-0a8c-416a-9624-f80c6fbacde7-httpd-run\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:41.012011 master-2 kubenswrapper[4762]: I1014 13:37:41.011690 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd1a658e-0a8c-416a-9624-f80c6fbacde7-config-data\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:41.037242 master-2 kubenswrapper[4762]: I1014 13:37:41.037209 4762 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Oct 14 13:37:41.037425 master-2 kubenswrapper[4762]: I1014 13:37:41.037407 4762 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-a7796025-99aa-4d01-b91a-fcce6dec3c73" (UniqueName: "kubernetes.io/csi/topolvm.io^ae0639de-ce3d-4a5d-9e93-e76a0a1363d1") on node "master-2" Oct 14 13:37:41.113657 master-2 kubenswrapper[4762]: I1014 13:37:41.113575 4762 reconciler_common.go:293] "Volume detached for volume \"pvc-a7796025-99aa-4d01-b91a-fcce6dec3c73\" (UniqueName: \"kubernetes.io/csi/topolvm.io^ae0639de-ce3d-4a5d-9e93-e76a0a1363d1\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:41.757817 master-2 kubenswrapper[4762]: I1014 13:37:41.757746 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-neutron-agent-c7795fc9c-45w5w" Oct 14 13:37:41.965983 master-2 kubenswrapper[4762]: I1014 13:37:41.965903 4762 generic.go:334] "Generic (PLEG): container finished" podID="6ada32c3-34c6-47a3-858e-7e92754e719b" containerID="2e81ae49f5c80e146293c83b36f208f31b6ffb3da4ca7fe8b58aa1fb9d5fd979" exitCode=0 Oct 14 13:37:41.966590 master-2 kubenswrapper[4762]: I1014 13:37:41.965994 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-551b-account-create-2jk4f" event={"ID":"6ada32c3-34c6-47a3-858e-7e92754e719b","Type":"ContainerDied","Data":"2e81ae49f5c80e146293c83b36f208f31b6ffb3da4ca7fe8b58aa1fb9d5fd979"} Oct 14 13:37:41.968021 master-2 kubenswrapper[4762]: I1014 13:37:41.967962 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3590b6a1-d0c1-443b-9dcf-6c5130450a96","Type":"ContainerStarted","Data":"32efb2047f017b262faca1e25c3807e2e32b7017ce682151a0c48cce798795a3"} Oct 14 13:37:41.973773 master-2 kubenswrapper[4762]: I1014 13:37:41.973704 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-46645-api-0" event={"ID":"8a991008-7930-4790-87a3-300d77e9186a","Type":"ContainerStarted","Data":"4c604d831a4282ded0019936fc313980dec71d7687a6b480986b9bfc7069e0a5"} Oct 14 13:37:41.973773 master-2 kubenswrapper[4762]: I1014 13:37:41.973766 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-46645-api-0" event={"ID":"8a991008-7930-4790-87a3-300d77e9186a","Type":"ContainerStarted","Data":"973ffab23d0e33025985cd22dccf3418d148029beff05394b83f73358b663b3b"} Oct 14 13:37:41.974040 master-2 kubenswrapper[4762]: I1014 13:37:41.973970 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-46645-api-0" Oct 14 13:37:42.131939 master-2 kubenswrapper[4762]: I1014 13:37:42.131816 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-3163-account-create-lprsc"] Oct 14 13:37:42.132244 master-2 kubenswrapper[4762]: E1014 13:37:42.132213 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd1a658e-0a8c-416a-9624-f80c6fbacde7" containerName="glance-log" Oct 14 13:37:42.132244 master-2 
kubenswrapper[4762]: I1014 13:37:42.132242 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd1a658e-0a8c-416a-9624-f80c6fbacde7" containerName="glance-log" Oct 14 13:37:42.132381 master-2 kubenswrapper[4762]: E1014 13:37:42.132277 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd1a658e-0a8c-416a-9624-f80c6fbacde7" containerName="glance-httpd" Oct 14 13:37:42.132381 master-2 kubenswrapper[4762]: I1014 13:37:42.132287 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd1a658e-0a8c-416a-9624-f80c6fbacde7" containerName="glance-httpd" Oct 14 13:37:42.133453 master-2 kubenswrapper[4762]: I1014 13:37:42.132469 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd1a658e-0a8c-416a-9624-f80c6fbacde7" containerName="glance-log" Oct 14 13:37:42.133453 master-2 kubenswrapper[4762]: I1014 13:37:42.132492 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd1a658e-0a8c-416a-9624-f80c6fbacde7" containerName="glance-httpd" Oct 14 13:37:42.133453 master-2 kubenswrapper[4762]: I1014 13:37:42.133292 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-3163-account-create-lprsc" Oct 14 13:37:42.140336 master-2 kubenswrapper[4762]: I1014 13:37:42.140243 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 14 13:37:42.149095 master-2 kubenswrapper[4762]: I1014 13:37:42.148712 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-3163-account-create-lprsc"] Oct 14 13:37:42.234366 master-2 kubenswrapper[4762]: I1014 13:37:42.234308 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czz6v\" (UniqueName: \"kubernetes.io/projected/17c406ad-d002-46f3-9014-31258d21a113-kube-api-access-czz6v\") pod \"nova-cell0-3163-account-create-lprsc\" (UID: \"17c406ad-d002-46f3-9014-31258d21a113\") " pod="openstack/nova-cell0-3163-account-create-lprsc" Oct 14 13:37:42.336778 master-2 kubenswrapper[4762]: I1014 13:37:42.336733 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czz6v\" (UniqueName: \"kubernetes.io/projected/17c406ad-d002-46f3-9014-31258d21a113-kube-api-access-czz6v\") pod \"nova-cell0-3163-account-create-lprsc\" (UID: \"17c406ad-d002-46f3-9014-31258d21a113\") " pod="openstack/nova-cell0-3163-account-create-lprsc" Oct 14 13:37:42.380177 master-2 kubenswrapper[4762]: I1014 13:37:42.376588 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czz6v\" (UniqueName: \"kubernetes.io/projected/17c406ad-d002-46f3-9014-31258d21a113-kube-api-access-czz6v\") pod \"nova-cell0-3163-account-create-lprsc\" (UID: \"17c406ad-d002-46f3-9014-31258d21a113\") " pod="openstack/nova-cell0-3163-account-create-lprsc" Oct 14 13:37:42.453237 master-2 kubenswrapper[4762]: I1014 13:37:42.453124 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-3163-account-create-lprsc" Oct 14 13:37:42.910792 master-2 kubenswrapper[4762]: W1014 13:37:42.910726 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17c406ad_d002_46f3_9014_31258d21a113.slice/crio-a5ef5f6144b7fa59bb3024af708c7624b67c34c64c5674789c9eb81cf0952147 WatchSource:0}: Error finding container a5ef5f6144b7fa59bb3024af708c7624b67c34c64c5674789c9eb81cf0952147: Status 404 returned error can't find the container with id a5ef5f6144b7fa59bb3024af708c7624b67c34c64c5674789c9eb81cf0952147 Oct 14 13:37:42.913507 master-2 kubenswrapper[4762]: I1014 13:37:42.913444 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-3163-account-create-lprsc"] Oct 14 13:37:42.987235 master-2 kubenswrapper[4762]: I1014 13:37:42.987150 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3590b6a1-d0c1-443b-9dcf-6c5130450a96","Type":"ContainerStarted","Data":"15d34e6e2593e74b02cb16e87c18fef2ec406d98aeebcf4017b39cb8ad0093a4"} Oct 14 13:37:42.988593 master-2 kubenswrapper[4762]: I1014 13:37:42.988538 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-3163-account-create-lprsc" event={"ID":"17c406ad-d002-46f3-9014-31258d21a113","Type":"ContainerStarted","Data":"a5ef5f6144b7fa59bb3024af708c7624b67c34c64c5674789c9eb81cf0952147"} Oct 14 13:37:43.285462 master-2 kubenswrapper[4762]: I1014 13:37:43.285415 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-551b-account-create-2jk4f" Oct 14 13:37:43.358759 master-2 kubenswrapper[4762]: I1014 13:37:43.358675 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t86kz\" (UniqueName: \"kubernetes.io/projected/6ada32c3-34c6-47a3-858e-7e92754e719b-kube-api-access-t86kz\") pod \"6ada32c3-34c6-47a3-858e-7e92754e719b\" (UID: \"6ada32c3-34c6-47a3-858e-7e92754e719b\") " Oct 14 13:37:43.366426 master-2 kubenswrapper[4762]: I1014 13:37:43.366377 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ada32c3-34c6-47a3-858e-7e92754e719b-kube-api-access-t86kz" (OuterVolumeSpecName: "kube-api-access-t86kz") pod "6ada32c3-34c6-47a3-858e-7e92754e719b" (UID: "6ada32c3-34c6-47a3-858e-7e92754e719b"). InnerVolumeSpecName "kube-api-access-t86kz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:37:43.429035 master-2 kubenswrapper[4762]: I1014 13:37:43.428847 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-3359-account-create-c7gkw"] Oct 14 13:37:43.430968 master-2 kubenswrapper[4762]: E1014 13:37:43.430941 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ada32c3-34c6-47a3-858e-7e92754e719b" containerName="mariadb-account-create" Oct 14 13:37:43.430968 master-2 kubenswrapper[4762]: I1014 13:37:43.430967 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ada32c3-34c6-47a3-858e-7e92754e719b" containerName="mariadb-account-create" Oct 14 13:37:43.432707 master-2 kubenswrapper[4762]: I1014 13:37:43.431286 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ada32c3-34c6-47a3-858e-7e92754e719b" containerName="mariadb-account-create" Oct 14 13:37:43.432707 master-2 kubenswrapper[4762]: I1014 13:37:43.432398 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-3359-account-create-c7gkw" Oct 14 13:37:43.440627 master-2 kubenswrapper[4762]: I1014 13:37:43.439477 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 14 13:37:43.452379 master-2 kubenswrapper[4762]: I1014 13:37:43.452212 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-3359-account-create-c7gkw"] Oct 14 13:37:43.462436 master-2 kubenswrapper[4762]: I1014 13:37:43.462367 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t86kz\" (UniqueName: \"kubernetes.io/projected/6ada32c3-34c6-47a3-858e-7e92754e719b-kube-api-access-t86kz\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:43.564016 master-2 kubenswrapper[4762]: I1014 13:37:43.563957 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfnmx\" (UniqueName: \"kubernetes.io/projected/b20eee57-43a0-48ae-be88-57727baa5e8d-kube-api-access-jfnmx\") pod \"nova-cell1-3359-account-create-c7gkw\" (UID: \"b20eee57-43a0-48ae-be88-57727baa5e8d\") " pod="openstack/nova-cell1-3359-account-create-c7gkw" Oct 14 13:37:43.665143 master-2 kubenswrapper[4762]: I1014 13:37:43.665081 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfnmx\" (UniqueName: \"kubernetes.io/projected/b20eee57-43a0-48ae-be88-57727baa5e8d-kube-api-access-jfnmx\") pod \"nova-cell1-3359-account-create-c7gkw\" (UID: \"b20eee57-43a0-48ae-be88-57727baa5e8d\") " pod="openstack/nova-cell1-3359-account-create-c7gkw" Oct 14 13:37:43.699082 master-2 kubenswrapper[4762]: I1014 13:37:43.698877 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfnmx\" (UniqueName: \"kubernetes.io/projected/b20eee57-43a0-48ae-be88-57727baa5e8d-kube-api-access-jfnmx\") pod \"nova-cell1-3359-account-create-c7gkw\" (UID: \"b20eee57-43a0-48ae-be88-57727baa5e8d\") " pod="openstack/nova-cell1-3359-account-create-c7gkw" Oct 14 13:37:43.763017 master-2 kubenswrapper[4762]: I1014 13:37:43.762954 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-3359-account-create-c7gkw" Oct 14 13:37:43.990674 master-2 kubenswrapper[4762]: I1014 13:37:43.989921 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-551b-account-create-2jk4f" podStartSLOduration=6.989888753 podStartE2EDuration="6.989888753s" podCreationTimestamp="2025-10-14 13:37:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:37:43.91282488 +0000 UTC m=+1893.156984039" watchObservedRunningTime="2025-10-14 13:37:43.989888753 +0000 UTC m=+1893.234047912" Oct 14 13:37:44.012140 master-2 kubenswrapper[4762]: I1014 13:37:44.012045 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-551b-account-create-2jk4f" event={"ID":"6ada32c3-34c6-47a3-858e-7e92754e719b","Type":"ContainerDied","Data":"154c25d80192102627558c15bf7e0e32cc5d051f20ea56180aceff1191b99b09"} Oct 14 13:37:44.012140 master-2 kubenswrapper[4762]: I1014 13:37:44.012116 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="154c25d80192102627558c15bf7e0e32cc5d051f20ea56180aceff1191b99b09" Oct 14 13:37:44.012547 master-2 kubenswrapper[4762]: I1014 13:37:44.012208 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-551b-account-create-2jk4f" Oct 14 13:37:44.016741 master-2 kubenswrapper[4762]: I1014 13:37:44.016683 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3590b6a1-d0c1-443b-9dcf-6c5130450a96","Type":"ContainerStarted","Data":"a1c150c934280ccddcfb786a87918482550e6fd376a2add163d6f263083ba00f"} Oct 14 13:37:44.016741 master-2 kubenswrapper[4762]: I1014 13:37:44.016741 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3590b6a1-d0c1-443b-9dcf-6c5130450a96","Type":"ContainerStarted","Data":"98002b6c7dc3c9fed58f7c1e1ee7bca82cb5bc2a09970b06f2a29bde443aba73"} Oct 14 13:37:44.018363 master-2 kubenswrapper[4762]: I1014 13:37:44.018333 4762 generic.go:334] "Generic (PLEG): container finished" podID="17c406ad-d002-46f3-9014-31258d21a113" containerID="683928058a90ab666720fc7169f961fb3000034765998957ace8e1708bfb207b" exitCode=0 Oct 14 13:37:44.018363 master-2 kubenswrapper[4762]: I1014 13:37:44.018361 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-3163-account-create-lprsc" event={"ID":"17c406ad-d002-46f3-9014-31258d21a113","Type":"ContainerDied","Data":"683928058a90ab666720fc7169f961fb3000034765998957ace8e1708bfb207b"} Oct 14 13:37:44.053227 master-2 kubenswrapper[4762]: I1014 13:37:44.051801 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-46645-default-external-api-1"] Oct 14 13:37:44.078200 master-2 kubenswrapper[4762]: I1014 13:37:44.076549 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-46645-default-external-api-1"] Oct 14 13:37:44.180194 master-2 kubenswrapper[4762]: I1014 13:37:44.179451 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-46645-default-external-api-1"] Oct 14 13:37:44.184815 master-2 kubenswrapper[4762]: I1014 13:37:44.184744 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-46645-default-external-api-1" Oct 14 13:37:44.189181 master-2 kubenswrapper[4762]: I1014 13:37:44.188560 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 14 13:37:44.192709 master-2 kubenswrapper[4762]: I1014 13:37:44.192664 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-46645-default-external-config-data" Oct 14 13:37:44.200583 master-2 kubenswrapper[4762]: I1014 13:37:44.200056 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-46645-default-external-api-1"] Oct 14 13:37:44.266921 master-2 kubenswrapper[4762]: I1014 13:37:44.266818 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-46645-api-0" podStartSLOduration=8.266793246 podStartE2EDuration="8.266793246s" podCreationTimestamp="2025-10-14 13:37:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:37:44.227031476 +0000 UTC m=+1893.471190655" watchObservedRunningTime="2025-10-14 13:37:44.266793246 +0000 UTC m=+1893.510952405" Oct 14 13:37:44.280305 master-2 kubenswrapper[4762]: I1014 13:37:44.278036 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd1659e3-6fe6-4c3a-823a-cfbfff68e152-config-data\") pod \"glance-46645-default-external-api-1\" (UID: \"fd1659e3-6fe6-4c3a-823a-cfbfff68e152\") " pod="openstack/glance-46645-default-external-api-1" Oct 14 13:37:44.280305 master-2 kubenswrapper[4762]: I1014 13:37:44.278120 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a7796025-99aa-4d01-b91a-fcce6dec3c73\" (UniqueName: \"kubernetes.io/csi/topolvm.io^ae0639de-ce3d-4a5d-9e93-e76a0a1363d1\") pod \"glance-46645-default-external-api-1\" (UID: \"fd1659e3-6fe6-4c3a-823a-cfbfff68e152\") " pod="openstack/glance-46645-default-external-api-1" Oct 14 13:37:44.280305 master-2 kubenswrapper[4762]: I1014 13:37:44.278199 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd1659e3-6fe6-4c3a-823a-cfbfff68e152-logs\") pod \"glance-46645-default-external-api-1\" (UID: \"fd1659e3-6fe6-4c3a-823a-cfbfff68e152\") " pod="openstack/glance-46645-default-external-api-1" Oct 14 13:37:44.280305 master-2 kubenswrapper[4762]: I1014 13:37:44.278228 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd1659e3-6fe6-4c3a-823a-cfbfff68e152-scripts\") pod \"glance-46645-default-external-api-1\" (UID: \"fd1659e3-6fe6-4c3a-823a-cfbfff68e152\") " pod="openstack/glance-46645-default-external-api-1" Oct 14 13:37:44.280305 master-2 kubenswrapper[4762]: I1014 13:37:44.278262 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd1659e3-6fe6-4c3a-823a-cfbfff68e152-public-tls-certs\") pod \"glance-46645-default-external-api-1\" (UID: \"fd1659e3-6fe6-4c3a-823a-cfbfff68e152\") " pod="openstack/glance-46645-default-external-api-1" Oct 14 13:37:44.280305 master-2 kubenswrapper[4762]: I1014 13:37:44.278284 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" 
(UniqueName: \"kubernetes.io/empty-dir/fd1659e3-6fe6-4c3a-823a-cfbfff68e152-httpd-run\") pod \"glance-46645-default-external-api-1\" (UID: \"fd1659e3-6fe6-4c3a-823a-cfbfff68e152\") " pod="openstack/glance-46645-default-external-api-1" Oct 14 13:37:44.280305 master-2 kubenswrapper[4762]: I1014 13:37:44.278469 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd1659e3-6fe6-4c3a-823a-cfbfff68e152-combined-ca-bundle\") pod \"glance-46645-default-external-api-1\" (UID: \"fd1659e3-6fe6-4c3a-823a-cfbfff68e152\") " pod="openstack/glance-46645-default-external-api-1" Oct 14 13:37:44.280305 master-2 kubenswrapper[4762]: I1014 13:37:44.278637 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-242wk\" (UniqueName: \"kubernetes.io/projected/fd1659e3-6fe6-4c3a-823a-cfbfff68e152-kube-api-access-242wk\") pod \"glance-46645-default-external-api-1\" (UID: \"fd1659e3-6fe6-4c3a-823a-cfbfff68e152\") " pod="openstack/glance-46645-default-external-api-1" Oct 14 13:37:44.386724 master-2 kubenswrapper[4762]: I1014 13:37:44.386603 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-242wk\" (UniqueName: \"kubernetes.io/projected/fd1659e3-6fe6-4c3a-823a-cfbfff68e152-kube-api-access-242wk\") pod \"glance-46645-default-external-api-1\" (UID: \"fd1659e3-6fe6-4c3a-823a-cfbfff68e152\") " pod="openstack/glance-46645-default-external-api-1" Oct 14 13:37:44.387222 master-2 kubenswrapper[4762]: I1014 13:37:44.386762 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd1659e3-6fe6-4c3a-823a-cfbfff68e152-config-data\") pod \"glance-46645-default-external-api-1\" (UID: \"fd1659e3-6fe6-4c3a-823a-cfbfff68e152\") " pod="openstack/glance-46645-default-external-api-1" Oct 14 13:37:44.387222 master-2 kubenswrapper[4762]: I1014 13:37:44.386827 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a7796025-99aa-4d01-b91a-fcce6dec3c73\" (UniqueName: \"kubernetes.io/csi/topolvm.io^ae0639de-ce3d-4a5d-9e93-e76a0a1363d1\") pod \"glance-46645-default-external-api-1\" (UID: \"fd1659e3-6fe6-4c3a-823a-cfbfff68e152\") " pod="openstack/glance-46645-default-external-api-1" Oct 14 13:37:44.387222 master-2 kubenswrapper[4762]: I1014 13:37:44.386924 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd1659e3-6fe6-4c3a-823a-cfbfff68e152-logs\") pod \"glance-46645-default-external-api-1\" (UID: \"fd1659e3-6fe6-4c3a-823a-cfbfff68e152\") " pod="openstack/glance-46645-default-external-api-1" Oct 14 13:37:44.387222 master-2 kubenswrapper[4762]: I1014 13:37:44.386979 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd1659e3-6fe6-4c3a-823a-cfbfff68e152-scripts\") pod \"glance-46645-default-external-api-1\" (UID: \"fd1659e3-6fe6-4c3a-823a-cfbfff68e152\") " pod="openstack/glance-46645-default-external-api-1" Oct 14 13:37:44.387222 master-2 kubenswrapper[4762]: I1014 13:37:44.387013 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd1659e3-6fe6-4c3a-823a-cfbfff68e152-public-tls-certs\") pod \"glance-46645-default-external-api-1\" (UID: \"fd1659e3-6fe6-4c3a-823a-cfbfff68e152\") " 
pod="openstack/glance-46645-default-external-api-1" Oct 14 13:37:44.387222 master-2 kubenswrapper[4762]: I1014 13:37:44.387060 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fd1659e3-6fe6-4c3a-823a-cfbfff68e152-httpd-run\") pod \"glance-46645-default-external-api-1\" (UID: \"fd1659e3-6fe6-4c3a-823a-cfbfff68e152\") " pod="openstack/glance-46645-default-external-api-1" Oct 14 13:37:44.387222 master-2 kubenswrapper[4762]: I1014 13:37:44.387202 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd1659e3-6fe6-4c3a-823a-cfbfff68e152-combined-ca-bundle\") pod \"glance-46645-default-external-api-1\" (UID: \"fd1659e3-6fe6-4c3a-823a-cfbfff68e152\") " pod="openstack/glance-46645-default-external-api-1" Oct 14 13:37:44.389440 master-2 kubenswrapper[4762]: I1014 13:37:44.389390 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fd1659e3-6fe6-4c3a-823a-cfbfff68e152-httpd-run\") pod \"glance-46645-default-external-api-1\" (UID: \"fd1659e3-6fe6-4c3a-823a-cfbfff68e152\") " pod="openstack/glance-46645-default-external-api-1" Oct 14 13:37:44.390235 master-2 kubenswrapper[4762]: I1014 13:37:44.390137 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd1659e3-6fe6-4c3a-823a-cfbfff68e152-logs\") pod \"glance-46645-default-external-api-1\" (UID: \"fd1659e3-6fe6-4c3a-823a-cfbfff68e152\") " pod="openstack/glance-46645-default-external-api-1" Oct 14 13:37:44.398638 master-2 kubenswrapper[4762]: I1014 13:37:44.393925 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd1659e3-6fe6-4c3a-823a-cfbfff68e152-combined-ca-bundle\") pod \"glance-46645-default-external-api-1\" (UID: \"fd1659e3-6fe6-4c3a-823a-cfbfff68e152\") " pod="openstack/glance-46645-default-external-api-1" Oct 14 13:37:44.398638 master-2 kubenswrapper[4762]: I1014 13:37:44.394105 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd1659e3-6fe6-4c3a-823a-cfbfff68e152-scripts\") pod \"glance-46645-default-external-api-1\" (UID: \"fd1659e3-6fe6-4c3a-823a-cfbfff68e152\") " pod="openstack/glance-46645-default-external-api-1" Oct 14 13:37:44.398638 master-2 kubenswrapper[4762]: I1014 13:37:44.394418 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd1659e3-6fe6-4c3a-823a-cfbfff68e152-config-data\") pod \"glance-46645-default-external-api-1\" (UID: \"fd1659e3-6fe6-4c3a-823a-cfbfff68e152\") " pod="openstack/glance-46645-default-external-api-1" Oct 14 13:37:44.403161 master-2 kubenswrapper[4762]: I1014 13:37:44.403096 4762 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 14 13:37:44.403257 master-2 kubenswrapper[4762]: I1014 13:37:44.403178 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a7796025-99aa-4d01-b91a-fcce6dec3c73\" (UniqueName: \"kubernetes.io/csi/topolvm.io^ae0639de-ce3d-4a5d-9e93-e76a0a1363d1\") pod \"glance-46645-default-external-api-1\" (UID: \"fd1659e3-6fe6-4c3a-823a-cfbfff68e152\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/5685f65b7fddc01b30e41430caf46764761af798608234f6a998350abdb8ec95/globalmount\"" pod="openstack/glance-46645-default-external-api-1" Oct 14 13:37:44.403388 master-2 kubenswrapper[4762]: I1014 13:37:44.403289 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-3359-account-create-c7gkw"] Oct 14 13:37:44.404228 master-2 kubenswrapper[4762]: I1014 13:37:44.404187 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd1659e3-6fe6-4c3a-823a-cfbfff68e152-public-tls-certs\") pod \"glance-46645-default-external-api-1\" (UID: \"fd1659e3-6fe6-4c3a-823a-cfbfff68e152\") " pod="openstack/glance-46645-default-external-api-1" Oct 14 13:37:44.442498 master-2 kubenswrapper[4762]: I1014 13:37:44.442439 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-242wk\" (UniqueName: \"kubernetes.io/projected/fd1659e3-6fe6-4c3a-823a-cfbfff68e152-kube-api-access-242wk\") pod \"glance-46645-default-external-api-1\" (UID: \"fd1659e3-6fe6-4c3a-823a-cfbfff68e152\") " pod="openstack/glance-46645-default-external-api-1" Oct 14 13:37:45.087434 master-2 kubenswrapper[4762]: I1014 13:37:45.085616 4762 generic.go:334] "Generic (PLEG): container finished" podID="b20eee57-43a0-48ae-be88-57727baa5e8d" containerID="fe99be248c3c62595be624f8732456079e585271421a6c07660d518d9747d017" exitCode=0 Oct 14 13:37:45.087434 master-2 kubenswrapper[4762]: I1014 13:37:45.086322 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-3359-account-create-c7gkw" event={"ID":"b20eee57-43a0-48ae-be88-57727baa5e8d","Type":"ContainerDied","Data":"fe99be248c3c62595be624f8732456079e585271421a6c07660d518d9747d017"} Oct 14 13:37:45.087434 master-2 kubenswrapper[4762]: I1014 13:37:45.086450 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-3359-account-create-c7gkw" event={"ID":"b20eee57-43a0-48ae-be88-57727baa5e8d","Type":"ContainerStarted","Data":"581a32a3df76ba3e1d65cae3f5d5c20f0c65718e48e33efe52d64a827fff080d"} Oct 14 13:37:45.321327 master-2 kubenswrapper[4762]: I1014 13:37:45.321263 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a7796025-99aa-4d01-b91a-fcce6dec3c73\" (UniqueName: \"kubernetes.io/csi/topolvm.io^ae0639de-ce3d-4a5d-9e93-e76a0a1363d1\") pod \"glance-46645-default-external-api-1\" (UID: \"fd1659e3-6fe6-4c3a-823a-cfbfff68e152\") " pod="openstack/glance-46645-default-external-api-1" Oct 14 13:37:45.557953 master-2 kubenswrapper[4762]: I1014 13:37:45.557891 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd1a658e-0a8c-416a-9624-f80c6fbacde7" path="/var/lib/kubelet/pods/bd1a658e-0a8c-416a-9624-f80c6fbacde7/volumes" Oct 14 13:37:45.627740 master-2 kubenswrapper[4762]: I1014 13:37:45.627646 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-3163-account-create-lprsc" Oct 14 13:37:45.712775 master-2 kubenswrapper[4762]: I1014 13:37:45.712726 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-46645-default-external-api-1" Oct 14 13:37:45.727724 master-2 kubenswrapper[4762]: I1014 13:37:45.727652 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czz6v\" (UniqueName: \"kubernetes.io/projected/17c406ad-d002-46f3-9014-31258d21a113-kube-api-access-czz6v\") pod \"17c406ad-d002-46f3-9014-31258d21a113\" (UID: \"17c406ad-d002-46f3-9014-31258d21a113\") " Oct 14 13:37:45.739608 master-2 kubenswrapper[4762]: I1014 13:37:45.738952 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17c406ad-d002-46f3-9014-31258d21a113-kube-api-access-czz6v" (OuterVolumeSpecName: "kube-api-access-czz6v") pod "17c406ad-d002-46f3-9014-31258d21a113" (UID: "17c406ad-d002-46f3-9014-31258d21a113"). InnerVolumeSpecName "kube-api-access-czz6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:37:45.833808 master-2 kubenswrapper[4762]: I1014 13:37:45.833757 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czz6v\" (UniqueName: \"kubernetes.io/projected/17c406ad-d002-46f3-9014-31258d21a113-kube-api-access-czz6v\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:45.872219 master-2 kubenswrapper[4762]: I1014 13:37:45.872085 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:37:46.112264 master-2 kubenswrapper[4762]: I1014 13:37:46.112120 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3590b6a1-d0c1-443b-9dcf-6c5130450a96","Type":"ContainerStarted","Data":"36c47e632d61c3a48f046cc6fe8aa7ae91c747ed05069236a8bac29d4d32a155"} Oct 14 13:37:46.112773 master-2 kubenswrapper[4762]: I1014 13:37:46.112619 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3590b6a1-d0c1-443b-9dcf-6c5130450a96" containerName="ceilometer-central-agent" containerID="cri-o://15d34e6e2593e74b02cb16e87c18fef2ec406d98aeebcf4017b39cb8ad0093a4" gracePeriod=30 Oct 14 13:37:46.112808 master-2 kubenswrapper[4762]: I1014 13:37:46.112772 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3590b6a1-d0c1-443b-9dcf-6c5130450a96" containerName="proxy-httpd" containerID="cri-o://36c47e632d61c3a48f046cc6fe8aa7ae91c747ed05069236a8bac29d4d32a155" gracePeriod=30 Oct 14 13:37:46.112849 master-2 kubenswrapper[4762]: I1014 13:37:46.112839 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3590b6a1-d0c1-443b-9dcf-6c5130450a96" containerName="sg-core" containerID="cri-o://a1c150c934280ccddcfb786a87918482550e6fd376a2add163d6f263083ba00f" gracePeriod=30 Oct 14 13:37:46.113041 master-2 kubenswrapper[4762]: I1014 13:37:46.112885 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3590b6a1-d0c1-443b-9dcf-6c5130450a96" containerName="ceilometer-notification-agent" containerID="cri-o://98002b6c7dc3c9fed58f7c1e1ee7bca82cb5bc2a09970b06f2a29bde443aba73" gracePeriod=30 Oct 14 13:37:46.113041 master-2 kubenswrapper[4762]: I1014 13:37:46.112685 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 14 13:37:46.122343 master-2 kubenswrapper[4762]: I1014 13:37:46.117595 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-3163-account-create-lprsc" Oct 14 13:37:46.122343 master-2 kubenswrapper[4762]: I1014 13:37:46.118016 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-3163-account-create-lprsc" event={"ID":"17c406ad-d002-46f3-9014-31258d21a113","Type":"ContainerDied","Data":"a5ef5f6144b7fa59bb3024af708c7624b67c34c64c5674789c9eb81cf0952147"} Oct 14 13:37:46.122343 master-2 kubenswrapper[4762]: I1014 13:37:46.119101 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5ef5f6144b7fa59bb3024af708c7624b67c34c64c5674789c9eb81cf0952147" Oct 14 13:37:46.163238 master-2 kubenswrapper[4762]: I1014 13:37:46.162646 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.373624064 podStartE2EDuration="8.162621351s" podCreationTimestamp="2025-10-14 13:37:38 +0000 UTC" firstStartedPulling="2025-10-14 13:37:41.889395827 +0000 UTC m=+1891.133554986" lastFinishedPulling="2025-10-14 13:37:45.678393104 +0000 UTC m=+1894.922552273" observedRunningTime="2025-10-14 13:37:46.152102253 +0000 UTC m=+1895.396261412" watchObservedRunningTime="2025-10-14 13:37:46.162621351 +0000 UTC m=+1895.406780510" Oct 14 13:37:46.425203 master-2 kubenswrapper[4762]: I1014 13:37:46.423600 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-46645-default-external-api-1"] Oct 14 13:37:46.428178 master-2 kubenswrapper[4762]: W1014 13:37:46.427292 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd1659e3_6fe6_4c3a_823a_cfbfff68e152.slice/crio-58ef4f5b96d0fbd59d976648edfe1884a0ffcc6abaec6709bc2a5a04af55715a WatchSource:0}: Error finding container 58ef4f5b96d0fbd59d976648edfe1884a0ffcc6abaec6709bc2a5a04af55715a: Status 404 returned error can't find the container with id 58ef4f5b96d0fbd59d976648edfe1884a0ffcc6abaec6709bc2a5a04af55715a Oct 14 13:37:46.595757 master-2 kubenswrapper[4762]: I1014 13:37:46.595197 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-3359-account-create-c7gkw" Oct 14 13:37:46.777692 master-2 kubenswrapper[4762]: I1014 13:37:46.777633 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfnmx\" (UniqueName: \"kubernetes.io/projected/b20eee57-43a0-48ae-be88-57727baa5e8d-kube-api-access-jfnmx\") pod \"b20eee57-43a0-48ae-be88-57727baa5e8d\" (UID: \"b20eee57-43a0-48ae-be88-57727baa5e8d\") " Oct 14 13:37:46.780846 master-2 kubenswrapper[4762]: I1014 13:37:46.780797 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b20eee57-43a0-48ae-be88-57727baa5e8d-kube-api-access-jfnmx" (OuterVolumeSpecName: "kube-api-access-jfnmx") pod "b20eee57-43a0-48ae-be88-57727baa5e8d" (UID: "b20eee57-43a0-48ae-be88-57727baa5e8d"). InnerVolumeSpecName "kube-api-access-jfnmx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:37:46.879650 master-2 kubenswrapper[4762]: I1014 13:37:46.879580 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfnmx\" (UniqueName: \"kubernetes.io/projected/b20eee57-43a0-48ae-be88-57727baa5e8d-kube-api-access-jfnmx\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:47.130782 master-2 kubenswrapper[4762]: I1014 13:37:47.130679 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-3359-account-create-c7gkw" event={"ID":"b20eee57-43a0-48ae-be88-57727baa5e8d","Type":"ContainerDied","Data":"581a32a3df76ba3e1d65cae3f5d5c20f0c65718e48e33efe52d64a827fff080d"} Oct 14 13:37:47.131774 master-2 kubenswrapper[4762]: I1014 13:37:47.131107 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="581a32a3df76ba3e1d65cae3f5d5c20f0c65718e48e33efe52d64a827fff080d" Oct 14 13:37:47.131774 master-2 kubenswrapper[4762]: I1014 13:37:47.130768 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-3359-account-create-c7gkw" Oct 14 13:37:47.136379 master-2 kubenswrapper[4762]: I1014 13:37:47.136298 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-46645-default-external-api-1" event={"ID":"fd1659e3-6fe6-4c3a-823a-cfbfff68e152","Type":"ContainerStarted","Data":"96cd04181d67199cb68f4aa2f08e878f9e04eb862ab4ff624eee4401c8b3549c"} Oct 14 13:37:47.136379 master-2 kubenswrapper[4762]: I1014 13:37:47.136367 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-46645-default-external-api-1" event={"ID":"fd1659e3-6fe6-4c3a-823a-cfbfff68e152","Type":"ContainerStarted","Data":"58ef4f5b96d0fbd59d976648edfe1884a0ffcc6abaec6709bc2a5a04af55715a"} Oct 14 13:37:47.142451 master-2 kubenswrapper[4762]: I1014 13:37:47.142389 4762 generic.go:334] "Generic (PLEG): container finished" podID="3590b6a1-d0c1-443b-9dcf-6c5130450a96" containerID="36c47e632d61c3a48f046cc6fe8aa7ae91c747ed05069236a8bac29d4d32a155" exitCode=0 Oct 14 13:37:47.142451 master-2 kubenswrapper[4762]: I1014 13:37:47.142427 4762 generic.go:334] "Generic (PLEG): container finished" podID="3590b6a1-d0c1-443b-9dcf-6c5130450a96" containerID="a1c150c934280ccddcfb786a87918482550e6fd376a2add163d6f263083ba00f" exitCode=2 Oct 14 13:37:47.142451 master-2 kubenswrapper[4762]: I1014 13:37:47.142459 4762 generic.go:334] "Generic (PLEG): container finished" podID="3590b6a1-d0c1-443b-9dcf-6c5130450a96" containerID="98002b6c7dc3c9fed58f7c1e1ee7bca82cb5bc2a09970b06f2a29bde443aba73" exitCode=0 Oct 14 13:37:47.142785 master-2 kubenswrapper[4762]: I1014 13:37:47.142457 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3590b6a1-d0c1-443b-9dcf-6c5130450a96","Type":"ContainerDied","Data":"36c47e632d61c3a48f046cc6fe8aa7ae91c747ed05069236a8bac29d4d32a155"} Oct 14 13:37:47.142785 master-2 kubenswrapper[4762]: I1014 13:37:47.142502 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3590b6a1-d0c1-443b-9dcf-6c5130450a96","Type":"ContainerDied","Data":"a1c150c934280ccddcfb786a87918482550e6fd376a2add163d6f263083ba00f"} Oct 14 13:37:47.142785 master-2 kubenswrapper[4762]: I1014 13:37:47.142514 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3590b6a1-d0c1-443b-9dcf-6c5130450a96","Type":"ContainerDied","Data":"98002b6c7dc3c9fed58f7c1e1ee7bca82cb5bc2a09970b06f2a29bde443aba73"} Oct 14 13:37:48.160184 master-2 
kubenswrapper[4762]: I1014 13:37:48.160109 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-46645-default-external-api-1" event={"ID":"fd1659e3-6fe6-4c3a-823a-cfbfff68e152","Type":"ContainerStarted","Data":"a215704959c440a1e14184270709918e5aa5ef90feb0c633355872d23d4b6b13"} Oct 14 13:37:48.199011 master-2 kubenswrapper[4762]: I1014 13:37:48.198917 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-46645-default-external-api-1" podStartSLOduration=4.198897593 podStartE2EDuration="4.198897593s" podCreationTimestamp="2025-10-14 13:37:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:37:48.198476391 +0000 UTC m=+1897.442635550" watchObservedRunningTime="2025-10-14 13:37:48.198897593 +0000 UTC m=+1897.443056762" Oct 14 13:37:48.334399 master-2 kubenswrapper[4762]: I1014 13:37:48.334345 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-shgdv"] Oct 14 13:37:48.334729 master-2 kubenswrapper[4762]: E1014 13:37:48.334703 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b20eee57-43a0-48ae-be88-57727baa5e8d" containerName="mariadb-account-create" Oct 14 13:37:48.334729 master-2 kubenswrapper[4762]: I1014 13:37:48.334721 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="b20eee57-43a0-48ae-be88-57727baa5e8d" containerName="mariadb-account-create" Oct 14 13:37:48.334801 master-2 kubenswrapper[4762]: E1014 13:37:48.334734 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17c406ad-d002-46f3-9014-31258d21a113" containerName="mariadb-account-create" Oct 14 13:37:48.334801 master-2 kubenswrapper[4762]: I1014 13:37:48.334741 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="17c406ad-d002-46f3-9014-31258d21a113" containerName="mariadb-account-create" Oct 14 13:37:48.334952 master-2 kubenswrapper[4762]: I1014 13:37:48.334891 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="17c406ad-d002-46f3-9014-31258d21a113" containerName="mariadb-account-create" Oct 14 13:37:48.334952 master-2 kubenswrapper[4762]: I1014 13:37:48.334910 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="b20eee57-43a0-48ae-be88-57727baa5e8d" containerName="mariadb-account-create" Oct 14 13:37:48.335612 master-2 kubenswrapper[4762]: I1014 13:37:48.335585 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-shgdv" Oct 14 13:37:48.338553 master-2 kubenswrapper[4762]: I1014 13:37:48.338521 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 14 13:37:48.338880 master-2 kubenswrapper[4762]: I1014 13:37:48.338810 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 14 13:37:48.361559 master-2 kubenswrapper[4762]: I1014 13:37:48.361352 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-shgdv"] Oct 14 13:37:48.516357 master-2 kubenswrapper[4762]: I1014 13:37:48.516094 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41ecfc00-c91d-4ca3-93af-8897072115cc-scripts\") pod \"nova-cell0-conductor-db-sync-shgdv\" (UID: \"41ecfc00-c91d-4ca3-93af-8897072115cc\") " pod="openstack/nova-cell0-conductor-db-sync-shgdv" Oct 14 13:37:48.516770 master-2 kubenswrapper[4762]: I1014 13:37:48.516489 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5x5l\" (UniqueName: \"kubernetes.io/projected/41ecfc00-c91d-4ca3-93af-8897072115cc-kube-api-access-l5x5l\") pod \"nova-cell0-conductor-db-sync-shgdv\" (UID: \"41ecfc00-c91d-4ca3-93af-8897072115cc\") " pod="openstack/nova-cell0-conductor-db-sync-shgdv" Oct 14 13:37:48.516770 master-2 kubenswrapper[4762]: I1014 13:37:48.516549 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41ecfc00-c91d-4ca3-93af-8897072115cc-config-data\") pod \"nova-cell0-conductor-db-sync-shgdv\" (UID: \"41ecfc00-c91d-4ca3-93af-8897072115cc\") " pod="openstack/nova-cell0-conductor-db-sync-shgdv" Oct 14 13:37:48.516770 master-2 kubenswrapper[4762]: I1014 13:37:48.516674 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41ecfc00-c91d-4ca3-93af-8897072115cc-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-shgdv\" (UID: \"41ecfc00-c91d-4ca3-93af-8897072115cc\") " pod="openstack/nova-cell0-conductor-db-sync-shgdv" Oct 14 13:37:48.617760 master-2 kubenswrapper[4762]: I1014 13:37:48.617710 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41ecfc00-c91d-4ca3-93af-8897072115cc-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-shgdv\" (UID: \"41ecfc00-c91d-4ca3-93af-8897072115cc\") " pod="openstack/nova-cell0-conductor-db-sync-shgdv" Oct 14 13:37:48.618013 master-2 kubenswrapper[4762]: I1014 13:37:48.617766 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41ecfc00-c91d-4ca3-93af-8897072115cc-scripts\") pod \"nova-cell0-conductor-db-sync-shgdv\" (UID: \"41ecfc00-c91d-4ca3-93af-8897072115cc\") " pod="openstack/nova-cell0-conductor-db-sync-shgdv" Oct 14 13:37:48.618013 master-2 kubenswrapper[4762]: I1014 13:37:48.617853 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5x5l\" (UniqueName: \"kubernetes.io/projected/41ecfc00-c91d-4ca3-93af-8897072115cc-kube-api-access-l5x5l\") pod \"nova-cell0-conductor-db-sync-shgdv\" (UID: \"41ecfc00-c91d-4ca3-93af-8897072115cc\") " 
pod="openstack/nova-cell0-conductor-db-sync-shgdv" Oct 14 13:37:48.618013 master-2 kubenswrapper[4762]: I1014 13:37:48.617875 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41ecfc00-c91d-4ca3-93af-8897072115cc-config-data\") pod \"nova-cell0-conductor-db-sync-shgdv\" (UID: \"41ecfc00-c91d-4ca3-93af-8897072115cc\") " pod="openstack/nova-cell0-conductor-db-sync-shgdv" Oct 14 13:37:48.621071 master-2 kubenswrapper[4762]: I1014 13:37:48.620969 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41ecfc00-c91d-4ca3-93af-8897072115cc-config-data\") pod \"nova-cell0-conductor-db-sync-shgdv\" (UID: \"41ecfc00-c91d-4ca3-93af-8897072115cc\") " pod="openstack/nova-cell0-conductor-db-sync-shgdv" Oct 14 13:37:48.621743 master-2 kubenswrapper[4762]: I1014 13:37:48.621374 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41ecfc00-c91d-4ca3-93af-8897072115cc-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-shgdv\" (UID: \"41ecfc00-c91d-4ca3-93af-8897072115cc\") " pod="openstack/nova-cell0-conductor-db-sync-shgdv" Oct 14 13:37:48.633816 master-2 kubenswrapper[4762]: I1014 13:37:48.633778 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41ecfc00-c91d-4ca3-93af-8897072115cc-scripts\") pod \"nova-cell0-conductor-db-sync-shgdv\" (UID: \"41ecfc00-c91d-4ca3-93af-8897072115cc\") " pod="openstack/nova-cell0-conductor-db-sync-shgdv" Oct 14 13:37:48.640751 master-2 kubenswrapper[4762]: I1014 13:37:48.640718 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5x5l\" (UniqueName: \"kubernetes.io/projected/41ecfc00-c91d-4ca3-93af-8897072115cc-kube-api-access-l5x5l\") pod \"nova-cell0-conductor-db-sync-shgdv\" (UID: \"41ecfc00-c91d-4ca3-93af-8897072115cc\") " pod="openstack/nova-cell0-conductor-db-sync-shgdv" Oct 14 13:37:48.658524 master-2 kubenswrapper[4762]: I1014 13:37:48.658436 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-shgdv" Oct 14 13:37:49.135799 master-2 kubenswrapper[4762]: I1014 13:37:49.135685 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-shgdv"] Oct 14 13:37:49.141902 master-2 kubenswrapper[4762]: W1014 13:37:49.141364 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41ecfc00_c91d_4ca3_93af_8897072115cc.slice/crio-177774d9196c59941c0f748e38cb82cd6b530e5ed0138049e373d8b684071062 WatchSource:0}: Error finding container 177774d9196c59941c0f748e38cb82cd6b530e5ed0138049e373d8b684071062: Status 404 returned error can't find the container with id 177774d9196c59941c0f748e38cb82cd6b530e5ed0138049e373d8b684071062 Oct 14 13:37:49.171021 master-2 kubenswrapper[4762]: I1014 13:37:49.170953 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-shgdv" event={"ID":"41ecfc00-c91d-4ca3-93af-8897072115cc","Type":"ContainerStarted","Data":"177774d9196c59941c0f748e38cb82cd6b530e5ed0138049e373d8b684071062"} Oct 14 13:37:51.790702 master-2 kubenswrapper[4762]: I1014 13:37:51.790632 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-46645-api-0" Oct 14 13:37:52.201023 master-2 kubenswrapper[4762]: I1014 13:37:52.200956 4762 generic.go:334] "Generic (PLEG): container finished" podID="3590b6a1-d0c1-443b-9dcf-6c5130450a96" containerID="15d34e6e2593e74b02cb16e87c18fef2ec406d98aeebcf4017b39cb8ad0093a4" exitCode=0 Oct 14 13:37:52.201023 master-2 kubenswrapper[4762]: I1014 13:37:52.201005 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3590b6a1-d0c1-443b-9dcf-6c5130450a96","Type":"ContainerDied","Data":"15d34e6e2593e74b02cb16e87c18fef2ec406d98aeebcf4017b39cb8ad0093a4"} Oct 14 13:37:55.713501 master-2 kubenswrapper[4762]: I1014 13:37:55.713445 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-46645-default-external-api-1" Oct 14 13:37:55.714144 master-2 kubenswrapper[4762]: I1014 13:37:55.714130 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-46645-default-external-api-1" Oct 14 13:37:55.745026 master-2 kubenswrapper[4762]: I1014 13:37:55.742474 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-46645-default-external-api-1" Oct 14 13:37:55.762044 master-2 kubenswrapper[4762]: I1014 13:37:55.761975 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-46645-default-external-api-1" Oct 14 13:37:56.231200 master-2 kubenswrapper[4762]: I1014 13:37:56.231071 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-46645-default-external-api-1" Oct 14 13:37:56.231200 master-2 kubenswrapper[4762]: I1014 13:37:56.231133 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-46645-default-external-api-1" Oct 14 13:37:56.346128 master-2 kubenswrapper[4762]: I1014 13:37:56.346027 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:37:56.414037 master-2 kubenswrapper[4762]: I1014 13:37:56.413894 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3590b6a1-d0c1-443b-9dcf-6c5130450a96-combined-ca-bundle\") pod \"3590b6a1-d0c1-443b-9dcf-6c5130450a96\" (UID: \"3590b6a1-d0c1-443b-9dcf-6c5130450a96\") " Oct 14 13:37:56.414037 master-2 kubenswrapper[4762]: I1014 13:37:56.413983 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3590b6a1-d0c1-443b-9dcf-6c5130450a96-config-data\") pod \"3590b6a1-d0c1-443b-9dcf-6c5130450a96\" (UID: \"3590b6a1-d0c1-443b-9dcf-6c5130450a96\") " Oct 14 13:37:56.414479 master-2 kubenswrapper[4762]: I1014 13:37:56.414075 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3590b6a1-d0c1-443b-9dcf-6c5130450a96-log-httpd\") pod \"3590b6a1-d0c1-443b-9dcf-6c5130450a96\" (UID: \"3590b6a1-d0c1-443b-9dcf-6c5130450a96\") " Oct 14 13:37:56.414479 master-2 kubenswrapper[4762]: I1014 13:37:56.414148 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3590b6a1-d0c1-443b-9dcf-6c5130450a96-run-httpd\") pod \"3590b6a1-d0c1-443b-9dcf-6c5130450a96\" (UID: \"3590b6a1-d0c1-443b-9dcf-6c5130450a96\") " Oct 14 13:37:56.414479 master-2 kubenswrapper[4762]: I1014 13:37:56.414208 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3590b6a1-d0c1-443b-9dcf-6c5130450a96-sg-core-conf-yaml\") pod \"3590b6a1-d0c1-443b-9dcf-6c5130450a96\" (UID: \"3590b6a1-d0c1-443b-9dcf-6c5130450a96\") " Oct 14 13:37:56.414479 master-2 kubenswrapper[4762]: I1014 13:37:56.414238 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qx2jf\" (UniqueName: \"kubernetes.io/projected/3590b6a1-d0c1-443b-9dcf-6c5130450a96-kube-api-access-qx2jf\") pod \"3590b6a1-d0c1-443b-9dcf-6c5130450a96\" (UID: \"3590b6a1-d0c1-443b-9dcf-6c5130450a96\") " Oct 14 13:37:56.414973 master-2 kubenswrapper[4762]: I1014 13:37:56.414859 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3590b6a1-d0c1-443b-9dcf-6c5130450a96-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3590b6a1-d0c1-443b-9dcf-6c5130450a96" (UID: "3590b6a1-d0c1-443b-9dcf-6c5130450a96"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:37:56.415816 master-2 kubenswrapper[4762]: I1014 13:37:56.415738 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3590b6a1-d0c1-443b-9dcf-6c5130450a96-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3590b6a1-d0c1-443b-9dcf-6c5130450a96" (UID: "3590b6a1-d0c1-443b-9dcf-6c5130450a96"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:37:56.415922 master-2 kubenswrapper[4762]: I1014 13:37:56.415798 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3590b6a1-d0c1-443b-9dcf-6c5130450a96-scripts\") pod \"3590b6a1-d0c1-443b-9dcf-6c5130450a96\" (UID: \"3590b6a1-d0c1-443b-9dcf-6c5130450a96\") " Oct 14 13:37:56.417377 master-2 kubenswrapper[4762]: I1014 13:37:56.417300 4762 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3590b6a1-d0c1-443b-9dcf-6c5130450a96-log-httpd\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:56.417499 master-2 kubenswrapper[4762]: I1014 13:37:56.417375 4762 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3590b6a1-d0c1-443b-9dcf-6c5130450a96-run-httpd\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:56.418831 master-2 kubenswrapper[4762]: I1014 13:37:56.418737 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3590b6a1-d0c1-443b-9dcf-6c5130450a96-scripts" (OuterVolumeSpecName: "scripts") pod "3590b6a1-d0c1-443b-9dcf-6c5130450a96" (UID: "3590b6a1-d0c1-443b-9dcf-6c5130450a96"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:37:56.419180 master-2 kubenswrapper[4762]: I1014 13:37:56.419066 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3590b6a1-d0c1-443b-9dcf-6c5130450a96-kube-api-access-qx2jf" (OuterVolumeSpecName: "kube-api-access-qx2jf") pod "3590b6a1-d0c1-443b-9dcf-6c5130450a96" (UID: "3590b6a1-d0c1-443b-9dcf-6c5130450a96"). InnerVolumeSpecName "kube-api-access-qx2jf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:37:56.441523 master-2 kubenswrapper[4762]: I1014 13:37:56.441307 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3590b6a1-d0c1-443b-9dcf-6c5130450a96-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3590b6a1-d0c1-443b-9dcf-6c5130450a96" (UID: "3590b6a1-d0c1-443b-9dcf-6c5130450a96"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:37:56.487284 master-2 kubenswrapper[4762]: I1014 13:37:56.486920 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3590b6a1-d0c1-443b-9dcf-6c5130450a96-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3590b6a1-d0c1-443b-9dcf-6c5130450a96" (UID: "3590b6a1-d0c1-443b-9dcf-6c5130450a96"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:37:56.519002 master-2 kubenswrapper[4762]: I1014 13:37:56.518938 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3590b6a1-d0c1-443b-9dcf-6c5130450a96-combined-ca-bundle\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:56.519002 master-2 kubenswrapper[4762]: I1014 13:37:56.518976 4762 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3590b6a1-d0c1-443b-9dcf-6c5130450a96-sg-core-conf-yaml\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:56.519002 master-2 kubenswrapper[4762]: I1014 13:37:56.518987 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qx2jf\" (UniqueName: \"kubernetes.io/projected/3590b6a1-d0c1-443b-9dcf-6c5130450a96-kube-api-access-qx2jf\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:56.519002 master-2 kubenswrapper[4762]: I1014 13:37:56.518995 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3590b6a1-d0c1-443b-9dcf-6c5130450a96-scripts\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:56.521688 master-2 kubenswrapper[4762]: I1014 13:37:56.521570 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3590b6a1-d0c1-443b-9dcf-6c5130450a96-config-data" (OuterVolumeSpecName: "config-data") pod "3590b6a1-d0c1-443b-9dcf-6c5130450a96" (UID: "3590b6a1-d0c1-443b-9dcf-6c5130450a96"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:37:56.621123 master-2 kubenswrapper[4762]: I1014 13:37:56.621053 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3590b6a1-d0c1-443b-9dcf-6c5130450a96-config-data\") on node \"master-2\" DevicePath \"\"" Oct 14 13:37:57.242113 master-2 kubenswrapper[4762]: I1014 13:37:57.242041 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3590b6a1-d0c1-443b-9dcf-6c5130450a96","Type":"ContainerDied","Data":"32efb2047f017b262faca1e25c3807e2e32b7017ce682151a0c48cce798795a3"} Oct 14 13:37:57.242732 master-2 kubenswrapper[4762]: I1014 13:37:57.242124 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:37:57.242732 master-2 kubenswrapper[4762]: I1014 13:37:57.242134 4762 scope.go:117] "RemoveContainer" containerID="36c47e632d61c3a48f046cc6fe8aa7ae91c747ed05069236a8bac29d4d32a155" Oct 14 13:37:57.244897 master-2 kubenswrapper[4762]: I1014 13:37:57.244819 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-shgdv" event={"ID":"41ecfc00-c91d-4ca3-93af-8897072115cc","Type":"ContainerStarted","Data":"9ec45cda7dc1b8ec193a280be686462387e731fd2aca13027de3b37f908dbcb8"} Oct 14 13:37:57.276769 master-2 kubenswrapper[4762]: I1014 13:37:57.276138 4762 scope.go:117] "RemoveContainer" containerID="a1c150c934280ccddcfb786a87918482550e6fd376a2add163d6f263083ba00f" Oct 14 13:37:57.302275 master-2 kubenswrapper[4762]: I1014 13:37:57.301990 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-shgdv" podStartSLOduration=2.067406806 podStartE2EDuration="9.301945511s" podCreationTimestamp="2025-10-14 13:37:48 +0000 UTC" firstStartedPulling="2025-10-14 13:37:49.144339319 +0000 UTC m=+1898.388498478" lastFinishedPulling="2025-10-14 13:37:56.378878024 +0000 UTC m=+1905.623037183" observedRunningTime="2025-10-14 13:37:57.290400021 +0000 UTC m=+1906.534559180" watchObservedRunningTime="2025-10-14 13:37:57.301945511 +0000 UTC m=+1906.546104670" Oct 14 13:37:57.319922 master-2 kubenswrapper[4762]: I1014 13:37:57.319879 4762 scope.go:117] "RemoveContainer" containerID="98002b6c7dc3c9fed58f7c1e1ee7bca82cb5bc2a09970b06f2a29bde443aba73" Oct 14 13:37:57.346537 master-2 kubenswrapper[4762]: I1014 13:37:57.346442 4762 scope.go:117] "RemoveContainer" containerID="15d34e6e2593e74b02cb16e87c18fef2ec406d98aeebcf4017b39cb8ad0093a4" Oct 14 13:37:57.740201 master-2 kubenswrapper[4762]: I1014 13:37:57.740110 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:37:57.748489 master-2 kubenswrapper[4762]: I1014 13:37:57.748412 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:37:57.816620 master-2 kubenswrapper[4762]: I1014 13:37:57.816564 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:37:57.816877 master-2 kubenswrapper[4762]: E1014 13:37:57.816851 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3590b6a1-d0c1-443b-9dcf-6c5130450a96" containerName="sg-core" Oct 14 13:37:57.816877 master-2 kubenswrapper[4762]: I1014 13:37:57.816870 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="3590b6a1-d0c1-443b-9dcf-6c5130450a96" containerName="sg-core" Oct 14 13:37:57.817009 master-2 kubenswrapper[4762]: E1014 13:37:57.816899 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3590b6a1-d0c1-443b-9dcf-6c5130450a96" containerName="ceilometer-notification-agent" Oct 14 13:37:57.817009 master-2 kubenswrapper[4762]: I1014 13:37:57.816907 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="3590b6a1-d0c1-443b-9dcf-6c5130450a96" containerName="ceilometer-notification-agent" Oct 14 13:37:57.817009 master-2 kubenswrapper[4762]: E1014 13:37:57.816916 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3590b6a1-d0c1-443b-9dcf-6c5130450a96" containerName="ceilometer-central-agent" Oct 14 13:37:57.817009 master-2 kubenswrapper[4762]: I1014 13:37:57.816923 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="3590b6a1-d0c1-443b-9dcf-6c5130450a96" 
containerName="ceilometer-central-agent" Oct 14 13:37:57.817009 master-2 kubenswrapper[4762]: E1014 13:37:57.816946 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3590b6a1-d0c1-443b-9dcf-6c5130450a96" containerName="proxy-httpd" Oct 14 13:37:57.817009 master-2 kubenswrapper[4762]: I1014 13:37:57.816952 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="3590b6a1-d0c1-443b-9dcf-6c5130450a96" containerName="proxy-httpd" Oct 14 13:37:57.817317 master-2 kubenswrapper[4762]: I1014 13:37:57.817070 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="3590b6a1-d0c1-443b-9dcf-6c5130450a96" containerName="ceilometer-notification-agent" Oct 14 13:37:57.817317 master-2 kubenswrapper[4762]: I1014 13:37:57.817087 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="3590b6a1-d0c1-443b-9dcf-6c5130450a96" containerName="proxy-httpd" Oct 14 13:37:57.817317 master-2 kubenswrapper[4762]: I1014 13:37:57.817098 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="3590b6a1-d0c1-443b-9dcf-6c5130450a96" containerName="sg-core" Oct 14 13:37:57.817317 master-2 kubenswrapper[4762]: I1014 13:37:57.817108 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="3590b6a1-d0c1-443b-9dcf-6c5130450a96" containerName="ceilometer-central-agent" Oct 14 13:37:57.820864 master-2 kubenswrapper[4762]: I1014 13:37:57.820836 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:37:57.823795 master-2 kubenswrapper[4762]: I1014 13:37:57.823769 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 14 13:37:57.824008 master-2 kubenswrapper[4762]: I1014 13:37:57.823977 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 14 13:37:57.845344 master-2 kubenswrapper[4762]: I1014 13:37:57.845296 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:37:57.949526 master-2 kubenswrapper[4762]: I1014 13:37:57.949428 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cj2md\" (UniqueName: \"kubernetes.io/projected/364fb498-e482-4488-b302-8b668bb4fb78-kube-api-access-cj2md\") pod \"ceilometer-0\" (UID: \"364fb498-e482-4488-b302-8b668bb4fb78\") " pod="openstack/ceilometer-0" Oct 14 13:37:57.949526 master-2 kubenswrapper[4762]: I1014 13:37:57.949489 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/364fb498-e482-4488-b302-8b668bb4fb78-config-data\") pod \"ceilometer-0\" (UID: \"364fb498-e482-4488-b302-8b668bb4fb78\") " pod="openstack/ceilometer-0" Oct 14 13:37:57.949526 master-2 kubenswrapper[4762]: I1014 13:37:57.949532 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/364fb498-e482-4488-b302-8b668bb4fb78-run-httpd\") pod \"ceilometer-0\" (UID: \"364fb498-e482-4488-b302-8b668bb4fb78\") " pod="openstack/ceilometer-0" Oct 14 13:37:57.949526 master-2 kubenswrapper[4762]: I1014 13:37:57.949562 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/364fb498-e482-4488-b302-8b668bb4fb78-log-httpd\") pod \"ceilometer-0\" (UID: \"364fb498-e482-4488-b302-8b668bb4fb78\") " 
pod="openstack/ceilometer-0" Oct 14 13:37:57.950242 master-2 kubenswrapper[4762]: I1014 13:37:57.949586 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/364fb498-e482-4488-b302-8b668bb4fb78-scripts\") pod \"ceilometer-0\" (UID: \"364fb498-e482-4488-b302-8b668bb4fb78\") " pod="openstack/ceilometer-0" Oct 14 13:37:57.950242 master-2 kubenswrapper[4762]: I1014 13:37:57.949638 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/364fb498-e482-4488-b302-8b668bb4fb78-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"364fb498-e482-4488-b302-8b668bb4fb78\") " pod="openstack/ceilometer-0" Oct 14 13:37:57.950242 master-2 kubenswrapper[4762]: I1014 13:37:57.949677 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/364fb498-e482-4488-b302-8b668bb4fb78-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"364fb498-e482-4488-b302-8b668bb4fb78\") " pod="openstack/ceilometer-0" Oct 14 13:37:58.052448 master-2 kubenswrapper[4762]: I1014 13:37:58.052186 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/364fb498-e482-4488-b302-8b668bb4fb78-run-httpd\") pod \"ceilometer-0\" (UID: \"364fb498-e482-4488-b302-8b668bb4fb78\") " pod="openstack/ceilometer-0" Oct 14 13:37:58.052448 master-2 kubenswrapper[4762]: I1014 13:37:58.052278 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/364fb498-e482-4488-b302-8b668bb4fb78-log-httpd\") pod \"ceilometer-0\" (UID: \"364fb498-e482-4488-b302-8b668bb4fb78\") " pod="openstack/ceilometer-0" Oct 14 13:37:58.052448 master-2 kubenswrapper[4762]: I1014 13:37:58.052306 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/364fb498-e482-4488-b302-8b668bb4fb78-scripts\") pod \"ceilometer-0\" (UID: \"364fb498-e482-4488-b302-8b668bb4fb78\") " pod="openstack/ceilometer-0" Oct 14 13:37:58.053789 master-2 kubenswrapper[4762]: I1014 13:37:58.053031 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/364fb498-e482-4488-b302-8b668bb4fb78-log-httpd\") pod \"ceilometer-0\" (UID: \"364fb498-e482-4488-b302-8b668bb4fb78\") " pod="openstack/ceilometer-0" Oct 14 13:37:58.053789 master-2 kubenswrapper[4762]: I1014 13:37:58.053114 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/364fb498-e482-4488-b302-8b668bb4fb78-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"364fb498-e482-4488-b302-8b668bb4fb78\") " pod="openstack/ceilometer-0" Oct 14 13:37:58.053789 master-2 kubenswrapper[4762]: I1014 13:37:58.053245 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/364fb498-e482-4488-b302-8b668bb4fb78-run-httpd\") pod \"ceilometer-0\" (UID: \"364fb498-e482-4488-b302-8b668bb4fb78\") " pod="openstack/ceilometer-0" Oct 14 13:37:58.053789 master-2 kubenswrapper[4762]: I1014 13:37:58.053690 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/364fb498-e482-4488-b302-8b668bb4fb78-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"364fb498-e482-4488-b302-8b668bb4fb78\") " pod="openstack/ceilometer-0" Oct 14 13:37:58.054004 master-2 kubenswrapper[4762]: I1014 13:37:58.053831 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cj2md\" (UniqueName: \"kubernetes.io/projected/364fb498-e482-4488-b302-8b668bb4fb78-kube-api-access-cj2md\") pod \"ceilometer-0\" (UID: \"364fb498-e482-4488-b302-8b668bb4fb78\") " pod="openstack/ceilometer-0" Oct 14 13:37:58.054004 master-2 kubenswrapper[4762]: I1014 13:37:58.053863 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/364fb498-e482-4488-b302-8b668bb4fb78-config-data\") pod \"ceilometer-0\" (UID: \"364fb498-e482-4488-b302-8b668bb4fb78\") " pod="openstack/ceilometer-0" Oct 14 13:37:58.058803 master-2 kubenswrapper[4762]: I1014 13:37:58.058740 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/364fb498-e482-4488-b302-8b668bb4fb78-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"364fb498-e482-4488-b302-8b668bb4fb78\") " pod="openstack/ceilometer-0" Oct 14 13:37:58.064176 master-2 kubenswrapper[4762]: I1014 13:37:58.061974 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/364fb498-e482-4488-b302-8b668bb4fb78-config-data\") pod \"ceilometer-0\" (UID: \"364fb498-e482-4488-b302-8b668bb4fb78\") " pod="openstack/ceilometer-0" Oct 14 13:37:58.064176 master-2 kubenswrapper[4762]: I1014 13:37:58.062022 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/364fb498-e482-4488-b302-8b668bb4fb78-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"364fb498-e482-4488-b302-8b668bb4fb78\") " pod="openstack/ceilometer-0" Oct 14 13:37:58.074408 master-2 kubenswrapper[4762]: I1014 13:37:58.074350 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/364fb498-e482-4488-b302-8b668bb4fb78-scripts\") pod \"ceilometer-0\" (UID: \"364fb498-e482-4488-b302-8b668bb4fb78\") " pod="openstack/ceilometer-0" Oct 14 13:37:58.078008 master-2 kubenswrapper[4762]: I1014 13:37:58.077942 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cj2md\" (UniqueName: \"kubernetes.io/projected/364fb498-e482-4488-b302-8b668bb4fb78-kube-api-access-cj2md\") pod \"ceilometer-0\" (UID: \"364fb498-e482-4488-b302-8b668bb4fb78\") " pod="openstack/ceilometer-0" Oct 14 13:37:58.156267 master-2 kubenswrapper[4762]: I1014 13:37:58.155279 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:37:58.309218 master-2 kubenswrapper[4762]: I1014 13:37:58.308508 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-46645-default-external-api-1" Oct 14 13:37:58.309218 master-2 kubenswrapper[4762]: I1014 13:37:58.308614 4762 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 14 13:37:58.361205 master-2 kubenswrapper[4762]: I1014 13:37:58.361030 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-46645-default-external-api-1" Oct 14 13:37:58.603400 master-2 kubenswrapper[4762]: I1014 13:37:58.599662 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:37:59.273792 master-2 kubenswrapper[4762]: I1014 13:37:59.273711 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"364fb498-e482-4488-b302-8b668bb4fb78","Type":"ContainerStarted","Data":"f967381e3fa964e9c65cc88275d8bf2fbc1dd80bcad5aeaeb93eb4070078215d"} Oct 14 13:37:59.559748 master-2 kubenswrapper[4762]: I1014 13:37:59.559613 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3590b6a1-d0c1-443b-9dcf-6c5130450a96" path="/var/lib/kubelet/pods/3590b6a1-d0c1-443b-9dcf-6c5130450a96/volumes" Oct 14 13:38:00.034379 master-2 kubenswrapper[4762]: I1014 13:38:00.034307 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-46645-default-internal-api-0"] Oct 14 13:38:00.036335 master-2 kubenswrapper[4762]: I1014 13:38:00.035724 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-46645-default-internal-api-0" podUID="cbc23c12-5074-4846-8975-0ce11de825bc" containerName="glance-log" containerID="cri-o://86ab3cecfb7678393acadd9dd9fad69de6aa001061b1d2c229088112bafb8fb3" gracePeriod=30 Oct 14 13:38:00.036514 master-2 kubenswrapper[4762]: I1014 13:38:00.036461 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-46645-default-internal-api-0" podUID="cbc23c12-5074-4846-8975-0ce11de825bc" containerName="glance-httpd" containerID="cri-o://7cb560a1b4d4a005a2217fdb0d36facc44578a3519e2d034432c2066e64b748a" gracePeriod=30 Oct 14 13:38:00.285985 master-2 kubenswrapper[4762]: I1014 13:38:00.285831 4762 generic.go:334] "Generic (PLEG): container finished" podID="cbc23c12-5074-4846-8975-0ce11de825bc" containerID="86ab3cecfb7678393acadd9dd9fad69de6aa001061b1d2c229088112bafb8fb3" exitCode=143 Oct 14 13:38:00.285985 master-2 kubenswrapper[4762]: I1014 13:38:00.285917 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-46645-default-internal-api-0" event={"ID":"cbc23c12-5074-4846-8975-0ce11de825bc","Type":"ContainerDied","Data":"86ab3cecfb7678393acadd9dd9fad69de6aa001061b1d2c229088112bafb8fb3"} Oct 14 13:38:00.288526 master-2 kubenswrapper[4762]: I1014 13:38:00.288482 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"364fb498-e482-4488-b302-8b668bb4fb78","Type":"ContainerStarted","Data":"f1c6957e3e062d50a2f42dfc4899afb5efded9dd14f3801d9009d4debc099f75"} Oct 14 13:38:01.313893 master-2 kubenswrapper[4762]: I1014 13:38:01.313821 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"364fb498-e482-4488-b302-8b668bb4fb78","Type":"ContainerStarted","Data":"90944bb017ded1a54628a46aefd1f24715e71d45e563d4af427ce21c50d505e5"} Oct 14 13:38:02.324921 master-2 
kubenswrapper[4762]: I1014 13:38:02.324869 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"364fb498-e482-4488-b302-8b668bb4fb78","Type":"ContainerStarted","Data":"d4017f995a637d5baacc824546ad928b6c6dfa0ef44e3b42f88ca291da34adda"} Oct 14 13:38:03.339200 master-2 kubenswrapper[4762]: I1014 13:38:03.339115 4762 generic.go:334] "Generic (PLEG): container finished" podID="cbc23c12-5074-4846-8975-0ce11de825bc" containerID="7cb560a1b4d4a005a2217fdb0d36facc44578a3519e2d034432c2066e64b748a" exitCode=0 Oct 14 13:38:03.339200 master-2 kubenswrapper[4762]: I1014 13:38:03.339189 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-46645-default-internal-api-0" event={"ID":"cbc23c12-5074-4846-8975-0ce11de825bc","Type":"ContainerDied","Data":"7cb560a1b4d4a005a2217fdb0d36facc44578a3519e2d034432c2066e64b748a"} Oct 14 13:38:03.872037 master-2 kubenswrapper[4762]: I1014 13:38:03.871049 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:38:03.989590 master-2 kubenswrapper[4762]: I1014 13:38:03.989558 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbc23c12-5074-4846-8975-0ce11de825bc-logs\") pod \"cbc23c12-5074-4846-8975-0ce11de825bc\" (UID: \"cbc23c12-5074-4846-8975-0ce11de825bc\") " Oct 14 13:38:03.989919 master-2 kubenswrapper[4762]: I1014 13:38:03.989904 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cbc23c12-5074-4846-8975-0ce11de825bc-httpd-run\") pod \"cbc23c12-5074-4846-8975-0ce11de825bc\" (UID: \"cbc23c12-5074-4846-8975-0ce11de825bc\") " Oct 14 13:38:03.990123 master-2 kubenswrapper[4762]: I1014 13:38:03.990103 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc23c12-5074-4846-8975-0ce11de825bc-combined-ca-bundle\") pod \"cbc23c12-5074-4846-8975-0ce11de825bc\" (UID: \"cbc23c12-5074-4846-8975-0ce11de825bc\") " Oct 14 13:38:03.990308 master-2 kubenswrapper[4762]: I1014 13:38:03.990210 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbc23c12-5074-4846-8975-0ce11de825bc-logs" (OuterVolumeSpecName: "logs") pod "cbc23c12-5074-4846-8975-0ce11de825bc" (UID: "cbc23c12-5074-4846-8975-0ce11de825bc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:38:03.990373 master-2 kubenswrapper[4762]: I1014 13:38:03.990235 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbc23c12-5074-4846-8975-0ce11de825bc-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "cbc23c12-5074-4846-8975-0ce11de825bc" (UID: "cbc23c12-5074-4846-8975-0ce11de825bc"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:38:03.990431 master-2 kubenswrapper[4762]: I1014 13:38:03.990416 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbc23c12-5074-4846-8975-0ce11de825bc-config-data\") pod \"cbc23c12-5074-4846-8975-0ce11de825bc\" (UID: \"cbc23c12-5074-4846-8975-0ce11de825bc\") " Oct 14 13:38:03.990625 master-2 kubenswrapper[4762]: I1014 13:38:03.990612 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/topolvm.io^a7e8f86a-245f-40eb-b95a-9113ac1b399e\") pod \"cbc23c12-5074-4846-8975-0ce11de825bc\" (UID: \"cbc23c12-5074-4846-8975-0ce11de825bc\") " Oct 14 13:38:03.990923 master-2 kubenswrapper[4762]: I1014 13:38:03.990908 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hckq9\" (UniqueName: \"kubernetes.io/projected/cbc23c12-5074-4846-8975-0ce11de825bc-kube-api-access-hckq9\") pod \"cbc23c12-5074-4846-8975-0ce11de825bc\" (UID: \"cbc23c12-5074-4846-8975-0ce11de825bc\") " Oct 14 13:38:03.992068 master-2 kubenswrapper[4762]: I1014 13:38:03.991010 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbc23c12-5074-4846-8975-0ce11de825bc-scripts\") pod \"cbc23c12-5074-4846-8975-0ce11de825bc\" (UID: \"cbc23c12-5074-4846-8975-0ce11de825bc\") " Oct 14 13:38:03.992668 master-2 kubenswrapper[4762]: I1014 13:38:03.992647 4762 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbc23c12-5074-4846-8975-0ce11de825bc-logs\") on node \"master-2\" DevicePath \"\"" Oct 14 13:38:03.992779 master-2 kubenswrapper[4762]: I1014 13:38:03.992763 4762 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cbc23c12-5074-4846-8975-0ce11de825bc-httpd-run\") on node \"master-2\" DevicePath \"\"" Oct 14 13:38:03.994469 master-2 kubenswrapper[4762]: I1014 13:38:03.994315 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbc23c12-5074-4846-8975-0ce11de825bc-scripts" (OuterVolumeSpecName: "scripts") pod "cbc23c12-5074-4846-8975-0ce11de825bc" (UID: "cbc23c12-5074-4846-8975-0ce11de825bc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:38:03.994469 master-2 kubenswrapper[4762]: I1014 13:38:03.994364 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbc23c12-5074-4846-8975-0ce11de825bc-kube-api-access-hckq9" (OuterVolumeSpecName: "kube-api-access-hckq9") pod "cbc23c12-5074-4846-8975-0ce11de825bc" (UID: "cbc23c12-5074-4846-8975-0ce11de825bc"). InnerVolumeSpecName "kube-api-access-hckq9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:38:04.009707 master-2 kubenswrapper[4762]: I1014 13:38:04.009628 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbc23c12-5074-4846-8975-0ce11de825bc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cbc23c12-5074-4846-8975-0ce11de825bc" (UID: "cbc23c12-5074-4846-8975-0ce11de825bc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:38:04.015893 master-2 kubenswrapper[4762]: I1014 13:38:04.015828 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/topolvm.io^a7e8f86a-245f-40eb-b95a-9113ac1b399e" (OuterVolumeSpecName: "glance") pod "cbc23c12-5074-4846-8975-0ce11de825bc" (UID: "cbc23c12-5074-4846-8975-0ce11de825bc"). InnerVolumeSpecName "pvc-ebefd099-2104-4e7d-ab98-cff57fdf6cf2". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 14 13:38:04.031133 master-2 kubenswrapper[4762]: I1014 13:38:04.031036 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbc23c12-5074-4846-8975-0ce11de825bc-config-data" (OuterVolumeSpecName: "config-data") pod "cbc23c12-5074-4846-8975-0ce11de825bc" (UID: "cbc23c12-5074-4846-8975-0ce11de825bc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:38:04.094654 master-2 kubenswrapper[4762]: I1014 13:38:04.094265 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbc23c12-5074-4846-8975-0ce11de825bc-config-data\") on node \"master-2\" DevicePath \"\"" Oct 14 13:38:04.094654 master-2 kubenswrapper[4762]: I1014 13:38:04.094350 4762 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-ebefd099-2104-4e7d-ab98-cff57fdf6cf2\" (UniqueName: \"kubernetes.io/csi/topolvm.io^a7e8f86a-245f-40eb-b95a-9113ac1b399e\") on node \"master-2\" " Oct 14 13:38:04.094654 master-2 kubenswrapper[4762]: I1014 13:38:04.094364 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hckq9\" (UniqueName: \"kubernetes.io/projected/cbc23c12-5074-4846-8975-0ce11de825bc-kube-api-access-hckq9\") on node \"master-2\" DevicePath \"\"" Oct 14 13:38:04.094654 master-2 kubenswrapper[4762]: I1014 13:38:04.094375 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbc23c12-5074-4846-8975-0ce11de825bc-scripts\") on node \"master-2\" DevicePath \"\"" Oct 14 13:38:04.094654 master-2 kubenswrapper[4762]: I1014 13:38:04.094384 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc23c12-5074-4846-8975-0ce11de825bc-combined-ca-bundle\") on node \"master-2\" DevicePath \"\"" Oct 14 13:38:04.125018 master-2 kubenswrapper[4762]: I1014 13:38:04.124957 4762 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Oct 14 13:38:04.125312 master-2 kubenswrapper[4762]: I1014 13:38:04.125148 4762 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-ebefd099-2104-4e7d-ab98-cff57fdf6cf2" (UniqueName: "kubernetes.io/csi/topolvm.io^a7e8f86a-245f-40eb-b95a-9113ac1b399e") on node "master-2" Oct 14 13:38:04.196178 master-2 kubenswrapper[4762]: I1014 13:38:04.196106 4762 reconciler_common.go:293] "Volume detached for volume \"pvc-ebefd099-2104-4e7d-ab98-cff57fdf6cf2\" (UniqueName: \"kubernetes.io/csi/topolvm.io^a7e8f86a-245f-40eb-b95a-9113ac1b399e\") on node \"master-2\" DevicePath \"\"" Oct 14 13:38:04.351401 master-2 kubenswrapper[4762]: I1014 13:38:04.351286 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"364fb498-e482-4488-b302-8b668bb4fb78","Type":"ContainerStarted","Data":"0bfd4c7343ab9a855f6f15c30eefa517488377ca4a7350727bb7802ba0641eb0"} Oct 14 13:38:04.351916 master-2 kubenswrapper[4762]: I1014 13:38:04.351474 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 14 13:38:04.356833 master-2 kubenswrapper[4762]: I1014 13:38:04.356746 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-46645-default-internal-api-0" event={"ID":"cbc23c12-5074-4846-8975-0ce11de825bc","Type":"ContainerDied","Data":"a119af22578a27f48f0e5e69fbb27f307abba8bfbe24e1d079868c88df48bc05"} Oct 14 13:38:04.356938 master-2 kubenswrapper[4762]: I1014 13:38:04.356849 4762 scope.go:117] "RemoveContainer" containerID="7cb560a1b4d4a005a2217fdb0d36facc44578a3519e2d034432c2066e64b748a" Oct 14 13:38:04.357120 master-2 kubenswrapper[4762]: I1014 13:38:04.357083 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:38:04.389600 master-2 kubenswrapper[4762]: I1014 13:38:04.389491 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.407584178 podStartE2EDuration="7.389459483s" podCreationTimestamp="2025-10-14 13:37:57 +0000 UTC" firstStartedPulling="2025-10-14 13:37:58.615265366 +0000 UTC m=+1907.859424525" lastFinishedPulling="2025-10-14 13:38:03.597140671 +0000 UTC m=+1912.841299830" observedRunningTime="2025-10-14 13:38:04.384480258 +0000 UTC m=+1913.628639417" watchObservedRunningTime="2025-10-14 13:38:04.389459483 +0000 UTC m=+1913.633618642" Oct 14 13:38:04.402561 master-2 kubenswrapper[4762]: I1014 13:38:04.402499 4762 scope.go:117] "RemoveContainer" containerID="86ab3cecfb7678393acadd9dd9fad69de6aa001061b1d2c229088112bafb8fb3" Oct 14 13:38:04.428130 master-2 kubenswrapper[4762]: I1014 13:38:04.428059 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-46645-default-internal-api-0"] Oct 14 13:38:04.435125 master-2 kubenswrapper[4762]: I1014 13:38:04.434326 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-46645-default-internal-api-0"] Oct 14 13:38:04.455883 master-2 kubenswrapper[4762]: I1014 13:38:04.455821 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-46645-default-internal-api-0"] Oct 14 13:38:04.456700 master-2 kubenswrapper[4762]: E1014 13:38:04.456327 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbc23c12-5074-4846-8975-0ce11de825bc" containerName="glance-httpd" Oct 14 13:38:04.456700 master-2 kubenswrapper[4762]: I1014 13:38:04.456366 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbc23c12-5074-4846-8975-0ce11de825bc" 
containerName="glance-httpd" Oct 14 13:38:04.456700 master-2 kubenswrapper[4762]: E1014 13:38:04.456381 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbc23c12-5074-4846-8975-0ce11de825bc" containerName="glance-log" Oct 14 13:38:04.456700 master-2 kubenswrapper[4762]: I1014 13:38:04.456388 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbc23c12-5074-4846-8975-0ce11de825bc" containerName="glance-log" Oct 14 13:38:04.456700 master-2 kubenswrapper[4762]: I1014 13:38:04.456516 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbc23c12-5074-4846-8975-0ce11de825bc" containerName="glance-log" Oct 14 13:38:04.456700 master-2 kubenswrapper[4762]: I1014 13:38:04.456526 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbc23c12-5074-4846-8975-0ce11de825bc" containerName="glance-httpd" Oct 14 13:38:04.457674 master-2 kubenswrapper[4762]: I1014 13:38:04.457642 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:38:04.460928 master-2 kubenswrapper[4762]: I1014 13:38:04.460581 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 14 13:38:04.461022 master-2 kubenswrapper[4762]: I1014 13:38:04.460987 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-46645-default-internal-config-data" Oct 14 13:38:04.479972 master-2 kubenswrapper[4762]: I1014 13:38:04.479628 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-46645-default-internal-api-0"] Oct 14 13:38:04.616975 master-2 kubenswrapper[4762]: I1014 13:38:04.616745 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bce4bd8-dcdc-4f5e-9b58-b698c97de5a2-config-data\") pod \"glance-46645-default-internal-api-0\" (UID: \"3bce4bd8-dcdc-4f5e-9b58-b698c97de5a2\") " pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:38:04.616975 master-2 kubenswrapper[4762]: I1014 13:38:04.616959 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kj6r8\" (UniqueName: \"kubernetes.io/projected/3bce4bd8-dcdc-4f5e-9b58-b698c97de5a2-kube-api-access-kj6r8\") pod \"glance-46645-default-internal-api-0\" (UID: \"3bce4bd8-dcdc-4f5e-9b58-b698c97de5a2\") " pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:38:04.617412 master-2 kubenswrapper[4762]: I1014 13:38:04.617062 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bce4bd8-dcdc-4f5e-9b58-b698c97de5a2-logs\") pod \"glance-46645-default-internal-api-0\" (UID: \"3bce4bd8-dcdc-4f5e-9b58-b698c97de5a2\") " pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:38:04.617412 master-2 kubenswrapper[4762]: I1014 13:38:04.617186 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3bce4bd8-dcdc-4f5e-9b58-b698c97de5a2-httpd-run\") pod \"glance-46645-default-internal-api-0\" (UID: \"3bce4bd8-dcdc-4f5e-9b58-b698c97de5a2\") " pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:38:04.617412 master-2 kubenswrapper[4762]: I1014 13:38:04.617223 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/3bce4bd8-dcdc-4f5e-9b58-b698c97de5a2-internal-tls-certs\") pod \"glance-46645-default-internal-api-0\" (UID: \"3bce4bd8-dcdc-4f5e-9b58-b698c97de5a2\") " pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:38:04.617532 master-2 kubenswrapper[4762]: I1014 13:38:04.617430 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3bce4bd8-dcdc-4f5e-9b58-b698c97de5a2-scripts\") pod \"glance-46645-default-internal-api-0\" (UID: \"3bce4bd8-dcdc-4f5e-9b58-b698c97de5a2\") " pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:38:04.617532 master-2 kubenswrapper[4762]: I1014 13:38:04.617473 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ebefd099-2104-4e7d-ab98-cff57fdf6cf2\" (UniqueName: \"kubernetes.io/csi/topolvm.io^a7e8f86a-245f-40eb-b95a-9113ac1b399e\") pod \"glance-46645-default-internal-api-0\" (UID: \"3bce4bd8-dcdc-4f5e-9b58-b698c97de5a2\") " pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:38:04.617652 master-2 kubenswrapper[4762]: I1014 13:38:04.617622 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bce4bd8-dcdc-4f5e-9b58-b698c97de5a2-combined-ca-bundle\") pod \"glance-46645-default-internal-api-0\" (UID: \"3bce4bd8-dcdc-4f5e-9b58-b698c97de5a2\") " pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:38:04.718913 master-2 kubenswrapper[4762]: I1014 13:38:04.718824 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bce4bd8-dcdc-4f5e-9b58-b698c97de5a2-config-data\") pod \"glance-46645-default-internal-api-0\" (UID: \"3bce4bd8-dcdc-4f5e-9b58-b698c97de5a2\") " pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:38:04.718913 master-2 kubenswrapper[4762]: I1014 13:38:04.718911 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kj6r8\" (UniqueName: \"kubernetes.io/projected/3bce4bd8-dcdc-4f5e-9b58-b698c97de5a2-kube-api-access-kj6r8\") pod \"glance-46645-default-internal-api-0\" (UID: \"3bce4bd8-dcdc-4f5e-9b58-b698c97de5a2\") " pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:38:04.718913 master-2 kubenswrapper[4762]: I1014 13:38:04.718945 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bce4bd8-dcdc-4f5e-9b58-b698c97de5a2-logs\") pod \"glance-46645-default-internal-api-0\" (UID: \"3bce4bd8-dcdc-4f5e-9b58-b698c97de5a2\") " pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:38:04.719421 master-2 kubenswrapper[4762]: I1014 13:38:04.718971 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3bce4bd8-dcdc-4f5e-9b58-b698c97de5a2-httpd-run\") pod \"glance-46645-default-internal-api-0\" (UID: \"3bce4bd8-dcdc-4f5e-9b58-b698c97de5a2\") " pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:38:04.719421 master-2 kubenswrapper[4762]: I1014 13:38:04.718986 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bce4bd8-dcdc-4f5e-9b58-b698c97de5a2-internal-tls-certs\") pod \"glance-46645-default-internal-api-0\" (UID: \"3bce4bd8-dcdc-4f5e-9b58-b698c97de5a2\") " 
pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:38:04.719421 master-2 kubenswrapper[4762]: I1014 13:38:04.719038 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3bce4bd8-dcdc-4f5e-9b58-b698c97de5a2-scripts\") pod \"glance-46645-default-internal-api-0\" (UID: \"3bce4bd8-dcdc-4f5e-9b58-b698c97de5a2\") " pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:38:04.719421 master-2 kubenswrapper[4762]: I1014 13:38:04.719061 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ebefd099-2104-4e7d-ab98-cff57fdf6cf2\" (UniqueName: \"kubernetes.io/csi/topolvm.io^a7e8f86a-245f-40eb-b95a-9113ac1b399e\") pod \"glance-46645-default-internal-api-0\" (UID: \"3bce4bd8-dcdc-4f5e-9b58-b698c97de5a2\") " pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:38:04.719421 master-2 kubenswrapper[4762]: I1014 13:38:04.719102 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bce4bd8-dcdc-4f5e-9b58-b698c97de5a2-combined-ca-bundle\") pod \"glance-46645-default-internal-api-0\" (UID: \"3bce4bd8-dcdc-4f5e-9b58-b698c97de5a2\") " pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:38:04.720013 master-2 kubenswrapper[4762]: I1014 13:38:04.719976 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3bce4bd8-dcdc-4f5e-9b58-b698c97de5a2-httpd-run\") pod \"glance-46645-default-internal-api-0\" (UID: \"3bce4bd8-dcdc-4f5e-9b58-b698c97de5a2\") " pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:38:04.722286 master-2 kubenswrapper[4762]: I1014 13:38:04.722245 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bce4bd8-dcdc-4f5e-9b58-b698c97de5a2-combined-ca-bundle\") pod \"glance-46645-default-internal-api-0\" (UID: \"3bce4bd8-dcdc-4f5e-9b58-b698c97de5a2\") " pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:38:04.722618 master-2 kubenswrapper[4762]: I1014 13:38:04.722569 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bce4bd8-dcdc-4f5e-9b58-b698c97de5a2-logs\") pod \"glance-46645-default-internal-api-0\" (UID: \"3bce4bd8-dcdc-4f5e-9b58-b698c97de5a2\") " pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:38:04.722951 master-2 kubenswrapper[4762]: I1014 13:38:04.722905 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3bce4bd8-dcdc-4f5e-9b58-b698c97de5a2-scripts\") pod \"glance-46645-default-internal-api-0\" (UID: \"3bce4bd8-dcdc-4f5e-9b58-b698c97de5a2\") " pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:38:04.723978 master-2 kubenswrapper[4762]: I1014 13:38:04.723909 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bce4bd8-dcdc-4f5e-9b58-b698c97de5a2-config-data\") pod \"glance-46645-default-internal-api-0\" (UID: \"3bce4bd8-dcdc-4f5e-9b58-b698c97de5a2\") " pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:38:04.724306 master-2 kubenswrapper[4762]: I1014 13:38:04.724024 4762 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 14 13:38:04.724306 master-2 kubenswrapper[4762]: I1014 13:38:04.724066 4762 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ebefd099-2104-4e7d-ab98-cff57fdf6cf2\" (UniqueName: \"kubernetes.io/csi/topolvm.io^a7e8f86a-245f-40eb-b95a-9113ac1b399e\") pod \"glance-46645-default-internal-api-0\" (UID: \"3bce4bd8-dcdc-4f5e-9b58-b698c97de5a2\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/85c0fdc6533d16267c0e554f30a4834f2259832a15c3a4c0788a47abd2eca85c/globalmount\"" pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:38:04.726470 master-2 kubenswrapper[4762]: I1014 13:38:04.726433 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bce4bd8-dcdc-4f5e-9b58-b698c97de5a2-internal-tls-certs\") pod \"glance-46645-default-internal-api-0\" (UID: \"3bce4bd8-dcdc-4f5e-9b58-b698c97de5a2\") " pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:38:04.750604 master-2 kubenswrapper[4762]: I1014 13:38:04.750503 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kj6r8\" (UniqueName: \"kubernetes.io/projected/3bce4bd8-dcdc-4f5e-9b58-b698c97de5a2-kube-api-access-kj6r8\") pod \"glance-46645-default-internal-api-0\" (UID: \"3bce4bd8-dcdc-4f5e-9b58-b698c97de5a2\") " pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:38:05.556622 master-2 kubenswrapper[4762]: I1014 13:38:05.556556 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbc23c12-5074-4846-8975-0ce11de825bc" path="/var/lib/kubelet/pods/cbc23c12-5074-4846-8975-0ce11de825bc/volumes" Oct 14 13:38:06.109302 master-2 kubenswrapper[4762]: I1014 13:38:06.109240 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ebefd099-2104-4e7d-ab98-cff57fdf6cf2\" (UniqueName: \"kubernetes.io/csi/topolvm.io^a7e8f86a-245f-40eb-b95a-9113ac1b399e\") pod \"glance-46645-default-internal-api-0\" (UID: \"3bce4bd8-dcdc-4f5e-9b58-b698c97de5a2\") " pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:38:06.277873 master-2 kubenswrapper[4762]: I1014 13:38:06.277774 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:38:07.046412 master-2 kubenswrapper[4762]: W1014 13:38:07.046345 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3bce4bd8_dcdc_4f5e_9b58_b698c97de5a2.slice/crio-7bec3a5250533d7a69dcfd1d6022d987f5459c1d89737efca2ca558f25872ea4 WatchSource:0}: Error finding container 7bec3a5250533d7a69dcfd1d6022d987f5459c1d89737efca2ca558f25872ea4: Status 404 returned error can't find the container with id 7bec3a5250533d7a69dcfd1d6022d987f5459c1d89737efca2ca558f25872ea4 Oct 14 13:38:07.109429 master-2 kubenswrapper[4762]: I1014 13:38:07.109006 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-46645-default-internal-api-0"] Oct 14 13:38:07.389656 master-2 kubenswrapper[4762]: I1014 13:38:07.389473 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-46645-default-internal-api-0" event={"ID":"3bce4bd8-dcdc-4f5e-9b58-b698c97de5a2","Type":"ContainerStarted","Data":"7bec3a5250533d7a69dcfd1d6022d987f5459c1d89737efca2ca558f25872ea4"} Oct 14 13:38:08.398111 master-2 kubenswrapper[4762]: I1014 13:38:08.398041 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-46645-default-internal-api-0" event={"ID":"3bce4bd8-dcdc-4f5e-9b58-b698c97de5a2","Type":"ContainerStarted","Data":"ec3fe38ea14edeb100b45cf061bb2c4201ba760cba980422b9124e63ebc37efd"} Oct 14 13:38:09.422322 master-2 kubenswrapper[4762]: I1014 13:38:09.422252 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-46645-default-internal-api-0" event={"ID":"3bce4bd8-dcdc-4f5e-9b58-b698c97de5a2","Type":"ContainerStarted","Data":"8d3ede35366d376c2f7a855d4d3d659013d5dc38adc7dd40f7bd42251778f37a"} Oct 14 13:38:09.459731 master-2 kubenswrapper[4762]: I1014 13:38:09.459648 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-46645-default-internal-api-0" podStartSLOduration=5.45962339 podStartE2EDuration="5.45962339s" podCreationTimestamp="2025-10-14 13:38:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:38:09.453933093 +0000 UTC m=+1918.698092272" watchObservedRunningTime="2025-10-14 13:38:09.45962339 +0000 UTC m=+1918.703782539" Oct 14 13:38:13.018287 master-2 kubenswrapper[4762]: I1014 13:38:13.018187 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-588df4cd6f-vxqld" Oct 14 13:38:13.117558 master-2 kubenswrapper[4762]: I1014 13:38:13.117115 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b08db4eb-a86d-436c-9251-0326ff980d24-config-data-custom\") pod \"b08db4eb-a86d-436c-9251-0326ff980d24\" (UID: \"b08db4eb-a86d-436c-9251-0326ff980d24\") " Oct 14 13:38:13.117558 master-2 kubenswrapper[4762]: I1014 13:38:13.117215 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b08db4eb-a86d-436c-9251-0326ff980d24-config-data\") pod \"b08db4eb-a86d-436c-9251-0326ff980d24\" (UID: \"b08db4eb-a86d-436c-9251-0326ff980d24\") " Oct 14 13:38:13.117558 master-2 kubenswrapper[4762]: I1014 13:38:13.117431 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzgbm\" (UniqueName: \"kubernetes.io/projected/b08db4eb-a86d-436c-9251-0326ff980d24-kube-api-access-fzgbm\") pod \"b08db4eb-a86d-436c-9251-0326ff980d24\" (UID: \"b08db4eb-a86d-436c-9251-0326ff980d24\") " Oct 14 13:38:13.117558 master-2 kubenswrapper[4762]: I1014 13:38:13.117493 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b08db4eb-a86d-436c-9251-0326ff980d24-combined-ca-bundle\") pod \"b08db4eb-a86d-436c-9251-0326ff980d24\" (UID: \"b08db4eb-a86d-436c-9251-0326ff980d24\") " Oct 14 13:38:13.120427 master-2 kubenswrapper[4762]: I1014 13:38:13.120364 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b08db4eb-a86d-436c-9251-0326ff980d24-kube-api-access-fzgbm" (OuterVolumeSpecName: "kube-api-access-fzgbm") pod "b08db4eb-a86d-436c-9251-0326ff980d24" (UID: "b08db4eb-a86d-436c-9251-0326ff980d24"). InnerVolumeSpecName "kube-api-access-fzgbm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:38:13.122395 master-2 kubenswrapper[4762]: I1014 13:38:13.122338 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b08db4eb-a86d-436c-9251-0326ff980d24-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b08db4eb-a86d-436c-9251-0326ff980d24" (UID: "b08db4eb-a86d-436c-9251-0326ff980d24"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:38:13.153718 master-2 kubenswrapper[4762]: I1014 13:38:13.153648 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b08db4eb-a86d-436c-9251-0326ff980d24-config-data" (OuterVolumeSpecName: "config-data") pod "b08db4eb-a86d-436c-9251-0326ff980d24" (UID: "b08db4eb-a86d-436c-9251-0326ff980d24"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:38:13.156444 master-2 kubenswrapper[4762]: I1014 13:38:13.156409 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b08db4eb-a86d-436c-9251-0326ff980d24-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b08db4eb-a86d-436c-9251-0326ff980d24" (UID: "b08db4eb-a86d-436c-9251-0326ff980d24"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:38:13.226907 master-2 kubenswrapper[4762]: I1014 13:38:13.226813 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzgbm\" (UniqueName: \"kubernetes.io/projected/b08db4eb-a86d-436c-9251-0326ff980d24-kube-api-access-fzgbm\") on node \"master-2\" DevicePath \"\"" Oct 14 13:38:13.226907 master-2 kubenswrapper[4762]: I1014 13:38:13.226904 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b08db4eb-a86d-436c-9251-0326ff980d24-combined-ca-bundle\") on node \"master-2\" DevicePath \"\"" Oct 14 13:38:13.227490 master-2 kubenswrapper[4762]: I1014 13:38:13.226925 4762 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b08db4eb-a86d-436c-9251-0326ff980d24-config-data-custom\") on node \"master-2\" DevicePath \"\"" Oct 14 13:38:13.227490 master-2 kubenswrapper[4762]: I1014 13:38:13.226956 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b08db4eb-a86d-436c-9251-0326ff980d24-config-data\") on node \"master-2\" DevicePath \"\"" Oct 14 13:38:13.477999 master-2 kubenswrapper[4762]: I1014 13:38:13.477840 4762 generic.go:334] "Generic (PLEG): container finished" podID="b08db4eb-a86d-436c-9251-0326ff980d24" containerID="b07c87a64f1c8282ad66e638726d108541926286d29d7a702fcfc3b135bbb347" exitCode=137 Oct 14 13:38:13.477999 master-2 kubenswrapper[4762]: I1014 13:38:13.477923 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-588df4cd6f-vxqld" event={"ID":"b08db4eb-a86d-436c-9251-0326ff980d24","Type":"ContainerDied","Data":"b07c87a64f1c8282ad66e638726d108541926286d29d7a702fcfc3b135bbb347"} Oct 14 13:38:13.478380 master-2 kubenswrapper[4762]: I1014 13:38:13.477997 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-588df4cd6f-vxqld" Oct 14 13:38:13.478380 master-2 kubenswrapper[4762]: I1014 13:38:13.478029 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-588df4cd6f-vxqld" event={"ID":"b08db4eb-a86d-436c-9251-0326ff980d24","Type":"ContainerDied","Data":"71f3ea437892612dbab6b76632320fc3c40478a6f3c9b5e8ffdd505e916227d1"} Oct 14 13:38:13.478380 master-2 kubenswrapper[4762]: I1014 13:38:13.478060 4762 scope.go:117] "RemoveContainer" containerID="b07c87a64f1c8282ad66e638726d108541926286d29d7a702fcfc3b135bbb347" Oct 14 13:38:13.480656 master-2 kubenswrapper[4762]: I1014 13:38:13.480613 4762 generic.go:334] "Generic (PLEG): container finished" podID="41ecfc00-c91d-4ca3-93af-8897072115cc" containerID="9ec45cda7dc1b8ec193a280be686462387e731fd2aca13027de3b37f908dbcb8" exitCode=0 Oct 14 13:38:13.480726 master-2 kubenswrapper[4762]: I1014 13:38:13.480662 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-shgdv" event={"ID":"41ecfc00-c91d-4ca3-93af-8897072115cc","Type":"ContainerDied","Data":"9ec45cda7dc1b8ec193a280be686462387e731fd2aca13027de3b37f908dbcb8"} Oct 14 13:38:13.519937 master-2 kubenswrapper[4762]: I1014 13:38:13.519865 4762 scope.go:117] "RemoveContainer" containerID="b07c87a64f1c8282ad66e638726d108541926286d29d7a702fcfc3b135bbb347" Oct 14 13:38:13.520770 master-2 kubenswrapper[4762]: E1014 13:38:13.520719 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b07c87a64f1c8282ad66e638726d108541926286d29d7a702fcfc3b135bbb347\": container with ID starting with b07c87a64f1c8282ad66e638726d108541926286d29d7a702fcfc3b135bbb347 not found: ID does not exist" containerID="b07c87a64f1c8282ad66e638726d108541926286d29d7a702fcfc3b135bbb347" Oct 14 13:38:13.520878 master-2 kubenswrapper[4762]: I1014 13:38:13.520762 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b07c87a64f1c8282ad66e638726d108541926286d29d7a702fcfc3b135bbb347"} err="failed to get container status \"b07c87a64f1c8282ad66e638726d108541926286d29d7a702fcfc3b135bbb347\": rpc error: code = NotFound desc = could not find container \"b07c87a64f1c8282ad66e638726d108541926286d29d7a702fcfc3b135bbb347\": container with ID starting with b07c87a64f1c8282ad66e638726d108541926286d29d7a702fcfc3b135bbb347 not found: ID does not exist" Oct 14 13:38:13.545736 master-2 kubenswrapper[4762]: I1014 13:38:13.545631 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-588df4cd6f-vxqld"] Oct 14 13:38:13.564140 master-2 kubenswrapper[4762]: I1014 13:38:13.564068 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-588df4cd6f-vxqld"] Oct 14 13:38:14.303234 master-2 kubenswrapper[4762]: I1014 13:38:14.303151 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-pkrrv"] Oct 14 13:38:14.303912 master-2 kubenswrapper[4762]: E1014 13:38:14.303483 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b08db4eb-a86d-436c-9251-0326ff980d24" containerName="heat-cfnapi" Oct 14 13:38:14.303912 master-2 kubenswrapper[4762]: I1014 13:38:14.303497 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="b08db4eb-a86d-436c-9251-0326ff980d24" containerName="heat-cfnapi" Oct 14 13:38:14.303912 master-2 kubenswrapper[4762]: I1014 13:38:14.303710 4762 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b08db4eb-a86d-436c-9251-0326ff980d24" containerName="heat-cfnapi" Oct 14 13:38:14.304484 master-2 kubenswrapper[4762]: I1014 13:38:14.304449 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-pkrrv" Oct 14 13:38:14.307418 master-2 kubenswrapper[4762]: I1014 13:38:14.307366 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Oct 14 13:38:14.307769 master-2 kubenswrapper[4762]: I1014 13:38:14.307701 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Oct 14 13:38:14.322345 master-2 kubenswrapper[4762]: I1014 13:38:14.322294 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-pkrrv"] Oct 14 13:38:14.346570 master-2 kubenswrapper[4762]: I1014 13:38:14.346508 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88e8aee3-10b8-4420-bacc-83d4d9e9e205-combined-ca-bundle\") pod \"aodh-db-sync-pkrrv\" (UID: \"88e8aee3-10b8-4420-bacc-83d4d9e9e205\") " pod="openstack/aodh-db-sync-pkrrv" Oct 14 13:38:14.347051 master-2 kubenswrapper[4762]: I1014 13:38:14.346655 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz6jb\" (UniqueName: \"kubernetes.io/projected/88e8aee3-10b8-4420-bacc-83d4d9e9e205-kube-api-access-zz6jb\") pod \"aodh-db-sync-pkrrv\" (UID: \"88e8aee3-10b8-4420-bacc-83d4d9e9e205\") " pod="openstack/aodh-db-sync-pkrrv" Oct 14 13:38:14.347051 master-2 kubenswrapper[4762]: I1014 13:38:14.346688 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88e8aee3-10b8-4420-bacc-83d4d9e9e205-config-data\") pod \"aodh-db-sync-pkrrv\" (UID: \"88e8aee3-10b8-4420-bacc-83d4d9e9e205\") " pod="openstack/aodh-db-sync-pkrrv" Oct 14 13:38:14.347051 master-2 kubenswrapper[4762]: I1014 13:38:14.346741 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88e8aee3-10b8-4420-bacc-83d4d9e9e205-scripts\") pod \"aodh-db-sync-pkrrv\" (UID: \"88e8aee3-10b8-4420-bacc-83d4d9e9e205\") " pod="openstack/aodh-db-sync-pkrrv" Oct 14 13:38:14.449818 master-2 kubenswrapper[4762]: I1014 13:38:14.449750 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88e8aee3-10b8-4420-bacc-83d4d9e9e205-scripts\") pod \"aodh-db-sync-pkrrv\" (UID: \"88e8aee3-10b8-4420-bacc-83d4d9e9e205\") " pod="openstack/aodh-db-sync-pkrrv" Oct 14 13:38:14.450107 master-2 kubenswrapper[4762]: I1014 13:38:14.449893 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88e8aee3-10b8-4420-bacc-83d4d9e9e205-combined-ca-bundle\") pod \"aodh-db-sync-pkrrv\" (UID: \"88e8aee3-10b8-4420-bacc-83d4d9e9e205\") " pod="openstack/aodh-db-sync-pkrrv" Oct 14 13:38:14.450107 master-2 kubenswrapper[4762]: I1014 13:38:14.450078 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zz6jb\" (UniqueName: \"kubernetes.io/projected/88e8aee3-10b8-4420-bacc-83d4d9e9e205-kube-api-access-zz6jb\") pod \"aodh-db-sync-pkrrv\" (UID: \"88e8aee3-10b8-4420-bacc-83d4d9e9e205\") " pod="openstack/aodh-db-sync-pkrrv" Oct 14 13:38:14.450267 master-2 
kubenswrapper[4762]: I1014 13:38:14.450122 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88e8aee3-10b8-4420-bacc-83d4d9e9e205-config-data\") pod \"aodh-db-sync-pkrrv\" (UID: \"88e8aee3-10b8-4420-bacc-83d4d9e9e205\") " pod="openstack/aodh-db-sync-pkrrv" Oct 14 13:38:14.454460 master-2 kubenswrapper[4762]: I1014 13:38:14.454421 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88e8aee3-10b8-4420-bacc-83d4d9e9e205-combined-ca-bundle\") pod \"aodh-db-sync-pkrrv\" (UID: \"88e8aee3-10b8-4420-bacc-83d4d9e9e205\") " pod="openstack/aodh-db-sync-pkrrv" Oct 14 13:38:14.455412 master-2 kubenswrapper[4762]: I1014 13:38:14.455348 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88e8aee3-10b8-4420-bacc-83d4d9e9e205-scripts\") pod \"aodh-db-sync-pkrrv\" (UID: \"88e8aee3-10b8-4420-bacc-83d4d9e9e205\") " pod="openstack/aodh-db-sync-pkrrv" Oct 14 13:38:14.456860 master-2 kubenswrapper[4762]: I1014 13:38:14.456838 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88e8aee3-10b8-4420-bacc-83d4d9e9e205-config-data\") pod \"aodh-db-sync-pkrrv\" (UID: \"88e8aee3-10b8-4420-bacc-83d4d9e9e205\") " pod="openstack/aodh-db-sync-pkrrv" Oct 14 13:38:14.483336 master-2 kubenswrapper[4762]: I1014 13:38:14.483292 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz6jb\" (UniqueName: \"kubernetes.io/projected/88e8aee3-10b8-4420-bacc-83d4d9e9e205-kube-api-access-zz6jb\") pod \"aodh-db-sync-pkrrv\" (UID: \"88e8aee3-10b8-4420-bacc-83d4d9e9e205\") " pod="openstack/aodh-db-sync-pkrrv" Oct 14 13:38:14.504998 master-2 kubenswrapper[4762]: I1014 13:38:14.504941 4762 generic.go:334] "Generic (PLEG): container finished" podID="d0d4da85-9147-43ff-b96c-81e9d3fffd69" containerID="2120c53f4048f99f1477c2478abbbf6df26150910844b8bebb3bb82702165818" exitCode=0 Oct 14 13:38:14.505263 master-2 kubenswrapper[4762]: I1014 13:38:14.505227 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"d0d4da85-9147-43ff-b96c-81e9d3fffd69","Type":"ContainerDied","Data":"2120c53f4048f99f1477c2478abbbf6df26150910844b8bebb3bb82702165818"} Oct 14 13:38:14.624547 master-2 kubenswrapper[4762]: I1014 13:38:14.624018 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-pkrrv" Oct 14 13:38:14.867194 master-2 kubenswrapper[4762]: I1014 13:38:14.867124 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-shgdv" Oct 14 13:38:14.964404 master-2 kubenswrapper[4762]: I1014 13:38:14.964334 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5x5l\" (UniqueName: \"kubernetes.io/projected/41ecfc00-c91d-4ca3-93af-8897072115cc-kube-api-access-l5x5l\") pod \"41ecfc00-c91d-4ca3-93af-8897072115cc\" (UID: \"41ecfc00-c91d-4ca3-93af-8897072115cc\") " Oct 14 13:38:14.964721 master-2 kubenswrapper[4762]: I1014 13:38:14.964453 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41ecfc00-c91d-4ca3-93af-8897072115cc-config-data\") pod \"41ecfc00-c91d-4ca3-93af-8897072115cc\" (UID: \"41ecfc00-c91d-4ca3-93af-8897072115cc\") " Oct 14 13:38:14.964721 master-2 kubenswrapper[4762]: I1014 13:38:14.964505 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41ecfc00-c91d-4ca3-93af-8897072115cc-combined-ca-bundle\") pod \"41ecfc00-c91d-4ca3-93af-8897072115cc\" (UID: \"41ecfc00-c91d-4ca3-93af-8897072115cc\") " Oct 14 13:38:14.964721 master-2 kubenswrapper[4762]: I1014 13:38:14.964655 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41ecfc00-c91d-4ca3-93af-8897072115cc-scripts\") pod \"41ecfc00-c91d-4ca3-93af-8897072115cc\" (UID: \"41ecfc00-c91d-4ca3-93af-8897072115cc\") " Oct 14 13:38:14.970542 master-2 kubenswrapper[4762]: I1014 13:38:14.970418 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41ecfc00-c91d-4ca3-93af-8897072115cc-scripts" (OuterVolumeSpecName: "scripts") pod "41ecfc00-c91d-4ca3-93af-8897072115cc" (UID: "41ecfc00-c91d-4ca3-93af-8897072115cc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:38:14.970542 master-2 kubenswrapper[4762]: I1014 13:38:14.970491 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41ecfc00-c91d-4ca3-93af-8897072115cc-kube-api-access-l5x5l" (OuterVolumeSpecName: "kube-api-access-l5x5l") pod "41ecfc00-c91d-4ca3-93af-8897072115cc" (UID: "41ecfc00-c91d-4ca3-93af-8897072115cc"). InnerVolumeSpecName "kube-api-access-l5x5l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:38:14.991447 master-2 kubenswrapper[4762]: I1014 13:38:14.991357 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41ecfc00-c91d-4ca3-93af-8897072115cc-config-data" (OuterVolumeSpecName: "config-data") pod "41ecfc00-c91d-4ca3-93af-8897072115cc" (UID: "41ecfc00-c91d-4ca3-93af-8897072115cc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:38:15.000242 master-2 kubenswrapper[4762]: I1014 13:38:14.998132 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41ecfc00-c91d-4ca3-93af-8897072115cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "41ecfc00-c91d-4ca3-93af-8897072115cc" (UID: "41ecfc00-c91d-4ca3-93af-8897072115cc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:38:15.066112 master-2 kubenswrapper[4762]: I1014 13:38:15.066057 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41ecfc00-c91d-4ca3-93af-8897072115cc-combined-ca-bundle\") on node \"master-2\" DevicePath \"\"" Oct 14 13:38:15.066112 master-2 kubenswrapper[4762]: I1014 13:38:15.066100 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41ecfc00-c91d-4ca3-93af-8897072115cc-scripts\") on node \"master-2\" DevicePath \"\"" Oct 14 13:38:15.066112 master-2 kubenswrapper[4762]: I1014 13:38:15.066110 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5x5l\" (UniqueName: \"kubernetes.io/projected/41ecfc00-c91d-4ca3-93af-8897072115cc-kube-api-access-l5x5l\") on node \"master-2\" DevicePath \"\"" Oct 14 13:38:15.066112 master-2 kubenswrapper[4762]: I1014 13:38:15.066120 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41ecfc00-c91d-4ca3-93af-8897072115cc-config-data\") on node \"master-2\" DevicePath \"\"" Oct 14 13:38:15.085851 master-2 kubenswrapper[4762]: I1014 13:38:15.085796 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-pkrrv"] Oct 14 13:38:15.090657 master-2 kubenswrapper[4762]: W1014 13:38:15.089181 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88e8aee3_10b8_4420_bacc_83d4d9e9e205.slice/crio-d9cb86e947d610415355c33f44993a1541156a4c29ef109f377c9d03f1e97836 WatchSource:0}: Error finding container d9cb86e947d610415355c33f44993a1541156a4c29ef109f377c9d03f1e97836: Status 404 returned error can't find the container with id d9cb86e947d610415355c33f44993a1541156a4c29ef109f377c9d03f1e97836 Oct 14 13:38:15.522447 master-2 kubenswrapper[4762]: I1014 13:38:15.522405 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-pkrrv" event={"ID":"88e8aee3-10b8-4420-bacc-83d4d9e9e205","Type":"ContainerStarted","Data":"d9cb86e947d610415355c33f44993a1541156a4c29ef109f377c9d03f1e97836"} Oct 14 13:38:15.524052 master-2 kubenswrapper[4762]: I1014 13:38:15.524018 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-shgdv" Oct 14 13:38:15.524122 master-2 kubenswrapper[4762]: I1014 13:38:15.524024 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-shgdv" event={"ID":"41ecfc00-c91d-4ca3-93af-8897072115cc","Type":"ContainerDied","Data":"177774d9196c59941c0f748e38cb82cd6b530e5ed0138049e373d8b684071062"} Oct 14 13:38:15.524122 master-2 kubenswrapper[4762]: I1014 13:38:15.524082 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="177774d9196c59941c0f748e38cb82cd6b530e5ed0138049e373d8b684071062" Oct 14 13:38:15.527586 master-2 kubenswrapper[4762]: I1014 13:38:15.527515 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"d0d4da85-9147-43ff-b96c-81e9d3fffd69","Type":"ContainerStarted","Data":"012e2b4af20549c5ad753b5296cdb74d8b3acb8561c0957215892aff7908ac54"} Oct 14 13:38:15.527586 master-2 kubenswrapper[4762]: I1014 13:38:15.527563 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"d0d4da85-9147-43ff-b96c-81e9d3fffd69","Type":"ContainerStarted","Data":"c5f88a9c0f7810c8104ea3fe094f5b21505f55022bc32870c0b3cb364f0388cd"} Oct 14 13:38:15.568550 master-2 kubenswrapper[4762]: I1014 13:38:15.568365 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b08db4eb-a86d-436c-9251-0326ff980d24" path="/var/lib/kubelet/pods/b08db4eb-a86d-436c-9251-0326ff980d24/volumes" Oct 14 13:38:15.764779 master-2 kubenswrapper[4762]: I1014 13:38:15.764728 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 14 13:38:15.765431 master-2 kubenswrapper[4762]: E1014 13:38:15.765409 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41ecfc00-c91d-4ca3-93af-8897072115cc" containerName="nova-cell0-conductor-db-sync" Oct 14 13:38:15.765540 master-2 kubenswrapper[4762]: I1014 13:38:15.765526 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="41ecfc00-c91d-4ca3-93af-8897072115cc" containerName="nova-cell0-conductor-db-sync" Oct 14 13:38:15.765814 master-2 kubenswrapper[4762]: I1014 13:38:15.765796 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="41ecfc00-c91d-4ca3-93af-8897072115cc" containerName="nova-cell0-conductor-db-sync" Oct 14 13:38:15.766685 master-2 kubenswrapper[4762]: I1014 13:38:15.766664 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 14 13:38:15.769419 master-2 kubenswrapper[4762]: I1014 13:38:15.769393 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 14 13:38:15.792145 master-2 kubenswrapper[4762]: I1014 13:38:15.792077 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 14 13:38:15.883876 master-2 kubenswrapper[4762]: I1014 13:38:15.883826 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98bfb81e-57cc-4d63-a98d-2d80924d0ae4-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"98bfb81e-57cc-4d63-a98d-2d80924d0ae4\") " pod="openstack/nova-cell0-conductor-0" Oct 14 13:38:15.884133 master-2 kubenswrapper[4762]: I1014 13:38:15.884016 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98bfb81e-57cc-4d63-a98d-2d80924d0ae4-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"98bfb81e-57cc-4d63-a98d-2d80924d0ae4\") " pod="openstack/nova-cell0-conductor-0" Oct 14 13:38:15.884133 master-2 kubenswrapper[4762]: I1014 13:38:15.884056 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbjqx\" (UniqueName: \"kubernetes.io/projected/98bfb81e-57cc-4d63-a98d-2d80924d0ae4-kube-api-access-cbjqx\") pod \"nova-cell0-conductor-0\" (UID: \"98bfb81e-57cc-4d63-a98d-2d80924d0ae4\") " pod="openstack/nova-cell0-conductor-0" Oct 14 13:38:15.985692 master-2 kubenswrapper[4762]: I1014 13:38:15.985618 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbjqx\" (UniqueName: \"kubernetes.io/projected/98bfb81e-57cc-4d63-a98d-2d80924d0ae4-kube-api-access-cbjqx\") pod \"nova-cell0-conductor-0\" (UID: \"98bfb81e-57cc-4d63-a98d-2d80924d0ae4\") " pod="openstack/nova-cell0-conductor-0" Oct 14 13:38:15.985692 master-2 kubenswrapper[4762]: I1014 13:38:15.985710 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98bfb81e-57cc-4d63-a98d-2d80924d0ae4-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"98bfb81e-57cc-4d63-a98d-2d80924d0ae4\") " pod="openstack/nova-cell0-conductor-0" Oct 14 13:38:15.986242 master-2 kubenswrapper[4762]: I1014 13:38:15.985898 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98bfb81e-57cc-4d63-a98d-2d80924d0ae4-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"98bfb81e-57cc-4d63-a98d-2d80924d0ae4\") " pod="openstack/nova-cell0-conductor-0" Oct 14 13:38:15.990340 master-2 kubenswrapper[4762]: I1014 13:38:15.990292 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98bfb81e-57cc-4d63-a98d-2d80924d0ae4-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"98bfb81e-57cc-4d63-a98d-2d80924d0ae4\") " pod="openstack/nova-cell0-conductor-0" Oct 14 13:38:16.004137 master-2 kubenswrapper[4762]: I1014 13:38:15.991484 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98bfb81e-57cc-4d63-a98d-2d80924d0ae4-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"98bfb81e-57cc-4d63-a98d-2d80924d0ae4\") " 
pod="openstack/nova-cell0-conductor-0" Oct 14 13:38:16.011825 master-2 kubenswrapper[4762]: I1014 13:38:16.011747 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbjqx\" (UniqueName: \"kubernetes.io/projected/98bfb81e-57cc-4d63-a98d-2d80924d0ae4-kube-api-access-cbjqx\") pod \"nova-cell0-conductor-0\" (UID: \"98bfb81e-57cc-4d63-a98d-2d80924d0ae4\") " pod="openstack/nova-cell0-conductor-0" Oct 14 13:38:16.090044 master-2 kubenswrapper[4762]: I1014 13:38:16.089882 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 14 13:38:16.278328 master-2 kubenswrapper[4762]: I1014 13:38:16.278257 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:38:16.278328 master-2 kubenswrapper[4762]: I1014 13:38:16.278336 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:38:16.312638 master-2 kubenswrapper[4762]: I1014 13:38:16.312574 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:38:16.327899 master-2 kubenswrapper[4762]: I1014 13:38:16.327835 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:38:16.547989 master-2 kubenswrapper[4762]: I1014 13:38:16.547811 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"d0d4da85-9147-43ff-b96c-81e9d3fffd69","Type":"ContainerStarted","Data":"d13baaea722830bc8d7bc3fc2b5201972ca4f9187dbc8eef446e60d2aa7c6d7c"} Oct 14 13:38:16.548996 master-2 kubenswrapper[4762]: I1014 13:38:16.548425 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:38:16.548996 master-2 kubenswrapper[4762]: I1014 13:38:16.548570 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:38:16.572893 master-2 kubenswrapper[4762]: I1014 13:38:16.558260 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 14 13:38:16.604675 master-2 kubenswrapper[4762]: I1014 13:38:16.598464 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-conductor-0" podStartSLOduration=57.038877952 podStartE2EDuration="1m40.598434501s" podCreationTimestamp="2025-10-14 13:36:36 +0000 UTC" firstStartedPulling="2025-10-14 13:36:49.027839839 +0000 UTC m=+1838.271998998" lastFinishedPulling="2025-10-14 13:37:32.587396388 +0000 UTC m=+1881.831555547" observedRunningTime="2025-10-14 13:38:16.58877507 +0000 UTC m=+1925.832934239" watchObservedRunningTime="2025-10-14 13:38:16.598434501 +0000 UTC m=+1925.842593660" Oct 14 13:38:17.584318 master-2 kubenswrapper[4762]: I1014 13:38:17.583957 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"98bfb81e-57cc-4d63-a98d-2d80924d0ae4","Type":"ContainerStarted","Data":"bf1aa3b57894c19e41a15e361e6541bf4126fc09e0aba1d228190c5f69dcacba"} Oct 14 13:38:17.584318 master-2 kubenswrapper[4762]: I1014 13:38:17.584002 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" 
event={"ID":"98bfb81e-57cc-4d63-a98d-2d80924d0ae4","Type":"ContainerStarted","Data":"a63ca98a5aa1df0321533ab01f694675b5ff987e4243c8a936bc4f3ebf91d5f9"} Oct 14 13:38:17.584318 master-2 kubenswrapper[4762]: I1014 13:38:17.584016 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-conductor-0" Oct 14 13:38:17.584318 master-2 kubenswrapper[4762]: I1014 13:38:17.584027 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-conductor-0" Oct 14 13:38:17.584318 master-2 kubenswrapper[4762]: I1014 13:38:17.584035 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-conductor-0" Oct 14 13:38:17.584318 master-2 kubenswrapper[4762]: I1014 13:38:17.584055 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 14 13:38:17.610113 master-2 kubenswrapper[4762]: I1014 13:38:17.610026 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.610006518 podStartE2EDuration="2.610006518s" podCreationTimestamp="2025-10-14 13:38:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:38:17.608103828 +0000 UTC m=+1926.852262987" watchObservedRunningTime="2025-10-14 13:38:17.610006518 +0000 UTC m=+1926.854165677" Oct 14 13:38:18.508871 master-2 kubenswrapper[4762]: I1014 13:38:18.508736 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:38:18.592174 master-2 kubenswrapper[4762]: I1014 13:38:18.591949 4762 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 14 13:38:18.628046 master-2 kubenswrapper[4762]: I1014 13:38:18.627988 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-46645-default-internal-api-0" Oct 14 13:38:18.633722 master-2 kubenswrapper[4762]: I1014 13:38:18.633668 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-conductor-0" Oct 14 13:38:19.626425 master-2 kubenswrapper[4762]: I1014 13:38:19.626359 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-conductor-0" Oct 14 13:38:20.611774 master-2 kubenswrapper[4762]: I1014 13:38:20.611707 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-pkrrv" event={"ID":"88e8aee3-10b8-4420-bacc-83d4d9e9e205","Type":"ContainerStarted","Data":"dc7715cef141a98b2ad6e2582578b153414d01e201310b6d25f0c662df4f5a0d"} Oct 14 13:38:20.615913 master-2 kubenswrapper[4762]: I1014 13:38:20.615856 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-conductor-0" Oct 14 13:38:20.648350 master-2 kubenswrapper[4762]: I1014 13:38:20.648263 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-pkrrv" podStartSLOduration=1.978851436 podStartE2EDuration="6.648239449s" podCreationTimestamp="2025-10-14 13:38:14 +0000 UTC" firstStartedPulling="2025-10-14 13:38:15.091579243 +0000 UTC m=+1924.335738402" lastFinishedPulling="2025-10-14 13:38:19.760967256 +0000 UTC m=+1929.005126415" observedRunningTime="2025-10-14 13:38:20.641730935 +0000 UTC m=+1929.885890124" watchObservedRunningTime="2025-10-14 13:38:20.648239449 +0000 UTC m=+1929.892398608" Oct 14 13:38:21.116746 master-2 kubenswrapper[4762]: 
I1014 13:38:21.116683 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 14 13:38:21.912973 master-2 kubenswrapper[4762]: I1014 13:38:21.912892 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-99b9d"] Oct 14 13:38:21.914385 master-2 kubenswrapper[4762]: I1014 13:38:21.914359 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-99b9d" Oct 14 13:38:21.917286 master-2 kubenswrapper[4762]: I1014 13:38:21.917248 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 14 13:38:21.917900 master-2 kubenswrapper[4762]: I1014 13:38:21.917870 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 14 13:38:21.936520 master-2 kubenswrapper[4762]: I1014 13:38:21.936422 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-99b9d"] Oct 14 13:38:22.017371 master-2 kubenswrapper[4762]: I1014 13:38:22.017290 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ea4e416-c6ee-4940-a9f8-2b2265d16336-scripts\") pod \"nova-cell0-cell-mapping-99b9d\" (UID: \"5ea4e416-c6ee-4940-a9f8-2b2265d16336\") " pod="openstack/nova-cell0-cell-mapping-99b9d" Oct 14 13:38:22.017845 master-2 kubenswrapper[4762]: I1014 13:38:22.017766 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mq2qx\" (UniqueName: \"kubernetes.io/projected/5ea4e416-c6ee-4940-a9f8-2b2265d16336-kube-api-access-mq2qx\") pod \"nova-cell0-cell-mapping-99b9d\" (UID: \"5ea4e416-c6ee-4940-a9f8-2b2265d16336\") " pod="openstack/nova-cell0-cell-mapping-99b9d" Oct 14 13:38:22.018031 master-2 kubenswrapper[4762]: I1014 13:38:22.018003 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ea4e416-c6ee-4940-a9f8-2b2265d16336-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-99b9d\" (UID: \"5ea4e416-c6ee-4940-a9f8-2b2265d16336\") " pod="openstack/nova-cell0-cell-mapping-99b9d" Oct 14 13:38:22.018236 master-2 kubenswrapper[4762]: I1014 13:38:22.018204 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ea4e416-c6ee-4940-a9f8-2b2265d16336-config-data\") pod \"nova-cell0-cell-mapping-99b9d\" (UID: \"5ea4e416-c6ee-4940-a9f8-2b2265d16336\") " pod="openstack/nova-cell0-cell-mapping-99b9d" Oct 14 13:38:22.120091 master-2 kubenswrapper[4762]: I1014 13:38:22.120020 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ea4e416-c6ee-4940-a9f8-2b2265d16336-scripts\") pod \"nova-cell0-cell-mapping-99b9d\" (UID: \"5ea4e416-c6ee-4940-a9f8-2b2265d16336\") " pod="openstack/nova-cell0-cell-mapping-99b9d" Oct 14 13:38:22.120452 master-2 kubenswrapper[4762]: I1014 13:38:22.120436 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mq2qx\" (UniqueName: \"kubernetes.io/projected/5ea4e416-c6ee-4940-a9f8-2b2265d16336-kube-api-access-mq2qx\") pod \"nova-cell0-cell-mapping-99b9d\" (UID: \"5ea4e416-c6ee-4940-a9f8-2b2265d16336\") " pod="openstack/nova-cell0-cell-mapping-99b9d" Oct 14 
13:38:22.120676 master-2 kubenswrapper[4762]: I1014 13:38:22.120661 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ea4e416-c6ee-4940-a9f8-2b2265d16336-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-99b9d\" (UID: \"5ea4e416-c6ee-4940-a9f8-2b2265d16336\") " pod="openstack/nova-cell0-cell-mapping-99b9d" Oct 14 13:38:22.121316 master-2 kubenswrapper[4762]: I1014 13:38:22.121297 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ea4e416-c6ee-4940-a9f8-2b2265d16336-config-data\") pod \"nova-cell0-cell-mapping-99b9d\" (UID: \"5ea4e416-c6ee-4940-a9f8-2b2265d16336\") " pod="openstack/nova-cell0-cell-mapping-99b9d" Oct 14 13:38:22.126965 master-2 kubenswrapper[4762]: I1014 13:38:22.126932 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ea4e416-c6ee-4940-a9f8-2b2265d16336-scripts\") pod \"nova-cell0-cell-mapping-99b9d\" (UID: \"5ea4e416-c6ee-4940-a9f8-2b2265d16336\") " pod="openstack/nova-cell0-cell-mapping-99b9d" Oct 14 13:38:22.127238 master-2 kubenswrapper[4762]: I1014 13:38:22.127198 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ea4e416-c6ee-4940-a9f8-2b2265d16336-config-data\") pod \"nova-cell0-cell-mapping-99b9d\" (UID: \"5ea4e416-c6ee-4940-a9f8-2b2265d16336\") " pod="openstack/nova-cell0-cell-mapping-99b9d" Oct 14 13:38:22.130250 master-2 kubenswrapper[4762]: I1014 13:38:22.130204 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ea4e416-c6ee-4940-a9f8-2b2265d16336-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-99b9d\" (UID: \"5ea4e416-c6ee-4940-a9f8-2b2265d16336\") " pod="openstack/nova-cell0-cell-mapping-99b9d" Oct 14 13:38:22.144733 master-2 kubenswrapper[4762]: I1014 13:38:22.144698 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mq2qx\" (UniqueName: \"kubernetes.io/projected/5ea4e416-c6ee-4940-a9f8-2b2265d16336-kube-api-access-mq2qx\") pod \"nova-cell0-cell-mapping-99b9d\" (UID: \"5ea4e416-c6ee-4940-a9f8-2b2265d16336\") " pod="openstack/nova-cell0-cell-mapping-99b9d" Oct 14 13:38:22.232235 master-2 kubenswrapper[4762]: I1014 13:38:22.232081 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-99b9d" Oct 14 13:38:22.270788 master-2 kubenswrapper[4762]: I1014 13:38:22.270736 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 14 13:38:22.272930 master-2 kubenswrapper[4762]: I1014 13:38:22.272904 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 14 13:38:22.276870 master-2 kubenswrapper[4762]: I1014 13:38:22.276836 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 14 13:38:22.298706 master-2 kubenswrapper[4762]: I1014 13:38:22.296383 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 14 13:38:22.385072 master-2 kubenswrapper[4762]: I1014 13:38:22.385020 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 14 13:38:22.386736 master-2 kubenswrapper[4762]: I1014 13:38:22.386718 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 14 13:38:22.394614 master-2 kubenswrapper[4762]: I1014 13:38:22.394566 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 14 13:38:22.427248 master-2 kubenswrapper[4762]: I1014 13:38:22.416033 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 14 13:38:22.436583 master-2 kubenswrapper[4762]: I1014 13:38:22.436532 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a37f47c-bdf2-4fc0-89a6-622a690a7a41-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7a37f47c-bdf2-4fc0-89a6-622a690a7a41\") " pod="openstack/nova-api-0" Oct 14 13:38:22.436687 master-2 kubenswrapper[4762]: I1014 13:38:22.436594 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xv8kj\" (UniqueName: \"kubernetes.io/projected/7a37f47c-bdf2-4fc0-89a6-622a690a7a41-kube-api-access-xv8kj\") pod \"nova-api-0\" (UID: \"7a37f47c-bdf2-4fc0-89a6-622a690a7a41\") " pod="openstack/nova-api-0" Oct 14 13:38:22.436806 master-2 kubenswrapper[4762]: I1014 13:38:22.436756 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a37f47c-bdf2-4fc0-89a6-622a690a7a41-config-data\") pod \"nova-api-0\" (UID: \"7a37f47c-bdf2-4fc0-89a6-622a690a7a41\") " pod="openstack/nova-api-0" Oct 14 13:38:22.436852 master-2 kubenswrapper[4762]: I1014 13:38:22.436842 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a37f47c-bdf2-4fc0-89a6-622a690a7a41-logs\") pod \"nova-api-0\" (UID: \"7a37f47c-bdf2-4fc0-89a6-622a690a7a41\") " pod="openstack/nova-api-0" Oct 14 13:38:22.539354 master-2 kubenswrapper[4762]: I1014 13:38:22.539191 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a37f47c-bdf2-4fc0-89a6-622a690a7a41-logs\") pod \"nova-api-0\" (UID: \"7a37f47c-bdf2-4fc0-89a6-622a690a7a41\") " pod="openstack/nova-api-0" Oct 14 13:38:22.539354 master-2 kubenswrapper[4762]: I1014 13:38:22.539335 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/734b4493-b9c4-457d-a3d2-9d751679cd45-config-data\") pod \"nova-scheduler-0\" (UID: \"734b4493-b9c4-457d-a3d2-9d751679cd45\") " pod="openstack/nova-scheduler-0" Oct 14 13:38:22.539651 master-2 kubenswrapper[4762]: I1014 13:38:22.539566 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a37f47c-bdf2-4fc0-89a6-622a690a7a41-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7a37f47c-bdf2-4fc0-89a6-622a690a7a41\") " pod="openstack/nova-api-0" Oct 14 13:38:22.539651 master-2 kubenswrapper[4762]: I1014 13:38:22.539613 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zldv9\" (UniqueName: \"kubernetes.io/projected/734b4493-b9c4-457d-a3d2-9d751679cd45-kube-api-access-zldv9\") pod \"nova-scheduler-0\" (UID: \"734b4493-b9c4-457d-a3d2-9d751679cd45\") " pod="openstack/nova-scheduler-0" Oct 14 13:38:22.539755 master-2 kubenswrapper[4762]: I1014 13:38:22.539652 4762 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xv8kj\" (UniqueName: \"kubernetes.io/projected/7a37f47c-bdf2-4fc0-89a6-622a690a7a41-kube-api-access-xv8kj\") pod \"nova-api-0\" (UID: \"7a37f47c-bdf2-4fc0-89a6-622a690a7a41\") " pod="openstack/nova-api-0" Oct 14 13:38:22.539755 master-2 kubenswrapper[4762]: I1014 13:38:22.539689 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/734b4493-b9c4-457d-a3d2-9d751679cd45-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"734b4493-b9c4-457d-a3d2-9d751679cd45\") " pod="openstack/nova-scheduler-0" Oct 14 13:38:22.539755 master-2 kubenswrapper[4762]: I1014 13:38:22.539728 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a37f47c-bdf2-4fc0-89a6-622a690a7a41-config-data\") pod \"nova-api-0\" (UID: \"7a37f47c-bdf2-4fc0-89a6-622a690a7a41\") " pod="openstack/nova-api-0" Oct 14 13:38:22.539889 master-2 kubenswrapper[4762]: I1014 13:38:22.539762 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a37f47c-bdf2-4fc0-89a6-622a690a7a41-logs\") pod \"nova-api-0\" (UID: \"7a37f47c-bdf2-4fc0-89a6-622a690a7a41\") " pod="openstack/nova-api-0" Oct 14 13:38:22.543917 master-2 kubenswrapper[4762]: I1014 13:38:22.543858 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a37f47c-bdf2-4fc0-89a6-622a690a7a41-config-data\") pod \"nova-api-0\" (UID: \"7a37f47c-bdf2-4fc0-89a6-622a690a7a41\") " pod="openstack/nova-api-0" Oct 14 13:38:22.544764 master-2 kubenswrapper[4762]: I1014 13:38:22.544715 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a37f47c-bdf2-4fc0-89a6-622a690a7a41-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7a37f47c-bdf2-4fc0-89a6-622a690a7a41\") " pod="openstack/nova-api-0" Oct 14 13:38:22.643545 master-2 kubenswrapper[4762]: I1014 13:38:22.643436 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zldv9\" (UniqueName: \"kubernetes.io/projected/734b4493-b9c4-457d-a3d2-9d751679cd45-kube-api-access-zldv9\") pod \"nova-scheduler-0\" (UID: \"734b4493-b9c4-457d-a3d2-9d751679cd45\") " pod="openstack/nova-scheduler-0" Oct 14 13:38:22.643545 master-2 kubenswrapper[4762]: I1014 13:38:22.643531 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/734b4493-b9c4-457d-a3d2-9d751679cd45-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"734b4493-b9c4-457d-a3d2-9d751679cd45\") " pod="openstack/nova-scheduler-0" Oct 14 13:38:22.643852 master-2 kubenswrapper[4762]: I1014 13:38:22.643617 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/734b4493-b9c4-457d-a3d2-9d751679cd45-config-data\") pod \"nova-scheduler-0\" (UID: \"734b4493-b9c4-457d-a3d2-9d751679cd45\") " pod="openstack/nova-scheduler-0" Oct 14 13:38:22.652890 master-2 kubenswrapper[4762]: I1014 13:38:22.648979 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/734b4493-b9c4-457d-a3d2-9d751679cd45-config-data\") pod \"nova-scheduler-0\" (UID: \"734b4493-b9c4-457d-a3d2-9d751679cd45\") " 
pod="openstack/nova-scheduler-0" Oct 14 13:38:22.654086 master-2 kubenswrapper[4762]: I1014 13:38:22.653998 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/734b4493-b9c4-457d-a3d2-9d751679cd45-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"734b4493-b9c4-457d-a3d2-9d751679cd45\") " pod="openstack/nova-scheduler-0" Oct 14 13:38:22.669207 master-2 kubenswrapper[4762]: I1014 13:38:22.669053 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xv8kj\" (UniqueName: \"kubernetes.io/projected/7a37f47c-bdf2-4fc0-89a6-622a690a7a41-kube-api-access-xv8kj\") pod \"nova-api-0\" (UID: \"7a37f47c-bdf2-4fc0-89a6-622a690a7a41\") " pod="openstack/nova-api-0" Oct 14 13:38:22.684386 master-2 kubenswrapper[4762]: I1014 13:38:22.684311 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zldv9\" (UniqueName: \"kubernetes.io/projected/734b4493-b9c4-457d-a3d2-9d751679cd45-kube-api-access-zldv9\") pod \"nova-scheduler-0\" (UID: \"734b4493-b9c4-457d-a3d2-9d751679cd45\") " pod="openstack/nova-scheduler-0" Oct 14 13:38:22.719483 master-2 kubenswrapper[4762]: I1014 13:38:22.719142 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 14 13:38:22.733635 master-2 kubenswrapper[4762]: I1014 13:38:22.730717 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 14 13:38:22.733635 master-2 kubenswrapper[4762]: I1014 13:38:22.733536 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 14 13:38:22.737916 master-2 kubenswrapper[4762]: I1014 13:38:22.737625 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 14 13:38:22.761189 master-2 kubenswrapper[4762]: I1014 13:38:22.761113 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 14 13:38:22.823114 master-2 kubenswrapper[4762]: I1014 13:38:22.823043 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-99b9d"] Oct 14 13:38:22.847490 master-2 kubenswrapper[4762]: I1014 13:38:22.847415 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7a04057-57ba-4fce-9a2a-f09e7333ddd9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b7a04057-57ba-4fce-9a2a-f09e7333ddd9\") " pod="openstack/nova-metadata-0" Oct 14 13:38:22.847707 master-2 kubenswrapper[4762]: I1014 13:38:22.847503 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7a04057-57ba-4fce-9a2a-f09e7333ddd9-config-data\") pod \"nova-metadata-0\" (UID: \"b7a04057-57ba-4fce-9a2a-f09e7333ddd9\") " pod="openstack/nova-metadata-0" Oct 14 13:38:22.847707 master-2 kubenswrapper[4762]: I1014 13:38:22.847586 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7a04057-57ba-4fce-9a2a-f09e7333ddd9-logs\") pod \"nova-metadata-0\" (UID: \"b7a04057-57ba-4fce-9a2a-f09e7333ddd9\") " pod="openstack/nova-metadata-0" Oct 14 13:38:22.847707 master-2 kubenswrapper[4762]: I1014 13:38:22.847638 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-lb65w\" (UniqueName: \"kubernetes.io/projected/b7a04057-57ba-4fce-9a2a-f09e7333ddd9-kube-api-access-lb65w\") pod \"nova-metadata-0\" (UID: \"b7a04057-57ba-4fce-9a2a-f09e7333ddd9\") " pod="openstack/nova-metadata-0" Oct 14 13:38:22.905790 master-2 kubenswrapper[4762]: I1014 13:38:22.905738 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6cd59f759-z4zsj"] Oct 14 13:38:22.907544 master-2 kubenswrapper[4762]: I1014 13:38:22.907508 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cd59f759-z4zsj" Oct 14 13:38:22.909854 master-2 kubenswrapper[4762]: I1014 13:38:22.909769 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 14 13:38:22.909854 master-2 kubenswrapper[4762]: I1014 13:38:22.909956 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 14 13:38:22.909854 master-2 kubenswrapper[4762]: I1014 13:38:22.910075 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 14 13:38:22.909854 master-2 kubenswrapper[4762]: I1014 13:38:22.910142 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Oct 14 13:38:22.909854 master-2 kubenswrapper[4762]: I1014 13:38:22.910903 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 14 13:38:22.924568 master-2 kubenswrapper[4762]: I1014 13:38:22.924500 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cd59f759-z4zsj"] Oct 14 13:38:22.949297 master-2 kubenswrapper[4762]: I1014 13:38:22.949247 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7a04057-57ba-4fce-9a2a-f09e7333ddd9-logs\") pod \"nova-metadata-0\" (UID: \"b7a04057-57ba-4fce-9a2a-f09e7333ddd9\") " pod="openstack/nova-metadata-0" Oct 14 13:38:22.949394 master-2 kubenswrapper[4762]: I1014 13:38:22.949350 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lb65w\" (UniqueName: \"kubernetes.io/projected/b7a04057-57ba-4fce-9a2a-f09e7333ddd9-kube-api-access-lb65w\") pod \"nova-metadata-0\" (UID: \"b7a04057-57ba-4fce-9a2a-f09e7333ddd9\") " pod="openstack/nova-metadata-0" Oct 14 13:38:22.949542 master-2 kubenswrapper[4762]: I1014 13:38:22.949519 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7a04057-57ba-4fce-9a2a-f09e7333ddd9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b7a04057-57ba-4fce-9a2a-f09e7333ddd9\") " pod="openstack/nova-metadata-0" Oct 14 13:38:22.949634 master-2 kubenswrapper[4762]: I1014 13:38:22.949611 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7a04057-57ba-4fce-9a2a-f09e7333ddd9-config-data\") pod \"nova-metadata-0\" (UID: \"b7a04057-57ba-4fce-9a2a-f09e7333ddd9\") " pod="openstack/nova-metadata-0" Oct 14 13:38:22.951196 master-2 kubenswrapper[4762]: I1014 13:38:22.951139 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7a04057-57ba-4fce-9a2a-f09e7333ddd9-logs\") pod \"nova-metadata-0\" (UID: \"b7a04057-57ba-4fce-9a2a-f09e7333ddd9\") " pod="openstack/nova-metadata-0" Oct 14 13:38:22.957770 master-2 kubenswrapper[4762]: I1014 13:38:22.957734 4762 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7a04057-57ba-4fce-9a2a-f09e7333ddd9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b7a04057-57ba-4fce-9a2a-f09e7333ddd9\") " pod="openstack/nova-metadata-0" Oct 14 13:38:22.962369 master-2 kubenswrapper[4762]: I1014 13:38:22.961139 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7a04057-57ba-4fce-9a2a-f09e7333ddd9-config-data\") pod \"nova-metadata-0\" (UID: \"b7a04057-57ba-4fce-9a2a-f09e7333ddd9\") " pod="openstack/nova-metadata-0" Oct 14 13:38:22.967800 master-2 kubenswrapper[4762]: I1014 13:38:22.967723 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 14 13:38:22.981359 master-2 kubenswrapper[4762]: I1014 13:38:22.981313 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lb65w\" (UniqueName: \"kubernetes.io/projected/b7a04057-57ba-4fce-9a2a-f09e7333ddd9-kube-api-access-lb65w\") pod \"nova-metadata-0\" (UID: \"b7a04057-57ba-4fce-9a2a-f09e7333ddd9\") " pod="openstack/nova-metadata-0" Oct 14 13:38:23.051931 master-2 kubenswrapper[4762]: I1014 13:38:23.051736 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a-ovsdbserver-sb\") pod \"dnsmasq-dns-6cd59f759-z4zsj\" (UID: \"7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a\") " pod="openstack/dnsmasq-dns-6cd59f759-z4zsj" Oct 14 13:38:23.051931 master-2 kubenswrapper[4762]: I1014 13:38:23.051850 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a-config\") pod \"dnsmasq-dns-6cd59f759-z4zsj\" (UID: \"7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a\") " pod="openstack/dnsmasq-dns-6cd59f759-z4zsj" Oct 14 13:38:23.051931 master-2 kubenswrapper[4762]: I1014 13:38:23.051919 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a-dns-swift-storage-0\") pod \"dnsmasq-dns-6cd59f759-z4zsj\" (UID: \"7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a\") " pod="openstack/dnsmasq-dns-6cd59f759-z4zsj" Oct 14 13:38:23.052091 master-2 kubenswrapper[4762]: I1014 13:38:23.051942 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a-ovsdbserver-nb\") pod \"dnsmasq-dns-6cd59f759-z4zsj\" (UID: \"7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a\") " pod="openstack/dnsmasq-dns-6cd59f759-z4zsj" Oct 14 13:38:23.052091 master-2 kubenswrapper[4762]: I1014 13:38:23.052063 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a-dns-svc\") pod \"dnsmasq-dns-6cd59f759-z4zsj\" (UID: \"7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a\") " pod="openstack/dnsmasq-dns-6cd59f759-z4zsj" Oct 14 13:38:23.052194 master-2 kubenswrapper[4762]: I1014 13:38:23.052137 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h78r5\" (UniqueName: 
\"kubernetes.io/projected/7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a-kube-api-access-h78r5\") pod \"dnsmasq-dns-6cd59f759-z4zsj\" (UID: \"7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a\") " pod="openstack/dnsmasq-dns-6cd59f759-z4zsj" Oct 14 13:38:23.083765 master-2 kubenswrapper[4762]: I1014 13:38:23.082625 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 14 13:38:23.092474 master-2 kubenswrapper[4762]: I1014 13:38:23.092419 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-x2knq"] Oct 14 13:38:23.093514 master-2 kubenswrapper[4762]: I1014 13:38:23.093480 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-x2knq" Oct 14 13:38:23.095753 master-2 kubenswrapper[4762]: I1014 13:38:23.095726 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 14 13:38:23.096108 master-2 kubenswrapper[4762]: I1014 13:38:23.096068 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 14 13:38:23.154255 master-2 kubenswrapper[4762]: I1014 13:38:23.154116 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-x2knq"] Oct 14 13:38:23.154255 master-2 kubenswrapper[4762]: I1014 13:38:23.154230 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a-dns-swift-storage-0\") pod \"dnsmasq-dns-6cd59f759-z4zsj\" (UID: \"7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a\") " pod="openstack/dnsmasq-dns-6cd59f759-z4zsj" Oct 14 13:38:23.154517 master-2 kubenswrapper[4762]: I1014 13:38:23.154332 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a-ovsdbserver-nb\") pod \"dnsmasq-dns-6cd59f759-z4zsj\" (UID: \"7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a\") " pod="openstack/dnsmasq-dns-6cd59f759-z4zsj" Oct 14 13:38:23.154517 master-2 kubenswrapper[4762]: I1014 13:38:23.154406 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6bc42a1-2444-47ab-8922-267ae995d2cc-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-x2knq\" (UID: \"e6bc42a1-2444-47ab-8922-267ae995d2cc\") " pod="openstack/nova-cell1-conductor-db-sync-x2knq" Oct 14 13:38:23.154517 master-2 kubenswrapper[4762]: I1014 13:38:23.154462 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a-dns-svc\") pod \"dnsmasq-dns-6cd59f759-z4zsj\" (UID: \"7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a\") " pod="openstack/dnsmasq-dns-6cd59f759-z4zsj" Oct 14 13:38:23.154517 master-2 kubenswrapper[4762]: I1014 13:38:23.154500 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6bc42a1-2444-47ab-8922-267ae995d2cc-config-data\") pod \"nova-cell1-conductor-db-sync-x2knq\" (UID: \"e6bc42a1-2444-47ab-8922-267ae995d2cc\") " pod="openstack/nova-cell1-conductor-db-sync-x2knq" Oct 14 13:38:23.154642 master-2 kubenswrapper[4762]: I1014 13:38:23.154527 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-h78r5\" (UniqueName: \"kubernetes.io/projected/7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a-kube-api-access-h78r5\") pod \"dnsmasq-dns-6cd59f759-z4zsj\" (UID: \"7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a\") " pod="openstack/dnsmasq-dns-6cd59f759-z4zsj" Oct 14 13:38:23.154642 master-2 kubenswrapper[4762]: I1014 13:38:23.154571 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xj6d\" (UniqueName: \"kubernetes.io/projected/e6bc42a1-2444-47ab-8922-267ae995d2cc-kube-api-access-8xj6d\") pod \"nova-cell1-conductor-db-sync-x2knq\" (UID: \"e6bc42a1-2444-47ab-8922-267ae995d2cc\") " pod="openstack/nova-cell1-conductor-db-sync-x2knq" Oct 14 13:38:23.154642 master-2 kubenswrapper[4762]: I1014 13:38:23.154597 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a-ovsdbserver-sb\") pod \"dnsmasq-dns-6cd59f759-z4zsj\" (UID: \"7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a\") " pod="openstack/dnsmasq-dns-6cd59f759-z4zsj" Oct 14 13:38:23.154642 master-2 kubenswrapper[4762]: I1014 13:38:23.154632 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6bc42a1-2444-47ab-8922-267ae995d2cc-scripts\") pod \"nova-cell1-conductor-db-sync-x2knq\" (UID: \"e6bc42a1-2444-47ab-8922-267ae995d2cc\") " pod="openstack/nova-cell1-conductor-db-sync-x2knq" Oct 14 13:38:23.154760 master-2 kubenswrapper[4762]: I1014 13:38:23.154676 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a-config\") pod \"dnsmasq-dns-6cd59f759-z4zsj\" (UID: \"7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a\") " pod="openstack/dnsmasq-dns-6cd59f759-z4zsj" Oct 14 13:38:23.156974 master-2 kubenswrapper[4762]: I1014 13:38:23.155699 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a-dns-swift-storage-0\") pod \"dnsmasq-dns-6cd59f759-z4zsj\" (UID: \"7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a\") " pod="openstack/dnsmasq-dns-6cd59f759-z4zsj" Oct 14 13:38:23.156974 master-2 kubenswrapper[4762]: I1014 13:38:23.155887 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a-config\") pod \"dnsmasq-dns-6cd59f759-z4zsj\" (UID: \"7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a\") " pod="openstack/dnsmasq-dns-6cd59f759-z4zsj" Oct 14 13:38:23.156974 master-2 kubenswrapper[4762]: I1014 13:38:23.156604 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a-ovsdbserver-nb\") pod \"dnsmasq-dns-6cd59f759-z4zsj\" (UID: \"7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a\") " pod="openstack/dnsmasq-dns-6cd59f759-z4zsj" Oct 14 13:38:23.156974 master-2 kubenswrapper[4762]: I1014 13:38:23.156699 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a-dns-svc\") pod \"dnsmasq-dns-6cd59f759-z4zsj\" (UID: \"7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a\") " pod="openstack/dnsmasq-dns-6cd59f759-z4zsj" Oct 14 13:38:23.157269 master-2 kubenswrapper[4762]: I1014 13:38:23.157236 4762 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a-ovsdbserver-sb\") pod \"dnsmasq-dns-6cd59f759-z4zsj\" (UID: \"7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a\") " pod="openstack/dnsmasq-dns-6cd59f759-z4zsj" Oct 14 13:38:23.178248 master-2 kubenswrapper[4762]: I1014 13:38:23.178197 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h78r5\" (UniqueName: \"kubernetes.io/projected/7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a-kube-api-access-h78r5\") pod \"dnsmasq-dns-6cd59f759-z4zsj\" (UID: \"7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a\") " pod="openstack/dnsmasq-dns-6cd59f759-z4zsj" Oct 14 13:38:23.250190 master-2 kubenswrapper[4762]: I1014 13:38:23.250116 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 14 13:38:23.257095 master-2 kubenswrapper[4762]: I1014 13:38:23.254938 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cd59f759-z4zsj" Oct 14 13:38:23.257095 master-2 kubenswrapper[4762]: I1014 13:38:23.256409 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6bc42a1-2444-47ab-8922-267ae995d2cc-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-x2knq\" (UID: \"e6bc42a1-2444-47ab-8922-267ae995d2cc\") " pod="openstack/nova-cell1-conductor-db-sync-x2knq" Oct 14 13:38:23.257095 master-2 kubenswrapper[4762]: I1014 13:38:23.256478 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6bc42a1-2444-47ab-8922-267ae995d2cc-config-data\") pod \"nova-cell1-conductor-db-sync-x2knq\" (UID: \"e6bc42a1-2444-47ab-8922-267ae995d2cc\") " pod="openstack/nova-cell1-conductor-db-sync-x2knq" Oct 14 13:38:23.257095 master-2 kubenswrapper[4762]: I1014 13:38:23.256517 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xj6d\" (UniqueName: \"kubernetes.io/projected/e6bc42a1-2444-47ab-8922-267ae995d2cc-kube-api-access-8xj6d\") pod \"nova-cell1-conductor-db-sync-x2knq\" (UID: \"e6bc42a1-2444-47ab-8922-267ae995d2cc\") " pod="openstack/nova-cell1-conductor-db-sync-x2knq" Oct 14 13:38:23.257095 master-2 kubenswrapper[4762]: I1014 13:38:23.256544 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6bc42a1-2444-47ab-8922-267ae995d2cc-scripts\") pod \"nova-cell1-conductor-db-sync-x2knq\" (UID: \"e6bc42a1-2444-47ab-8922-267ae995d2cc\") " pod="openstack/nova-cell1-conductor-db-sync-x2knq" Oct 14 13:38:23.267474 master-2 kubenswrapper[4762]: I1014 13:38:23.265709 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6bc42a1-2444-47ab-8922-267ae995d2cc-scripts\") pod \"nova-cell1-conductor-db-sync-x2knq\" (UID: \"e6bc42a1-2444-47ab-8922-267ae995d2cc\") " pod="openstack/nova-cell1-conductor-db-sync-x2knq" Oct 14 13:38:23.273289 master-2 kubenswrapper[4762]: I1014 13:38:23.270485 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6bc42a1-2444-47ab-8922-267ae995d2cc-config-data\") pod \"nova-cell1-conductor-db-sync-x2knq\" (UID: \"e6bc42a1-2444-47ab-8922-267ae995d2cc\") " pod="openstack/nova-cell1-conductor-db-sync-x2knq" Oct 14 13:38:23.279076 master-2 kubenswrapper[4762]: I1014 13:38:23.278676 
4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6bc42a1-2444-47ab-8922-267ae995d2cc-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-x2knq\" (UID: \"e6bc42a1-2444-47ab-8922-267ae995d2cc\") " pod="openstack/nova-cell1-conductor-db-sync-x2knq" Oct 14 13:38:23.295081 master-2 kubenswrapper[4762]: I1014 13:38:23.295028 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xj6d\" (UniqueName: \"kubernetes.io/projected/e6bc42a1-2444-47ab-8922-267ae995d2cc-kube-api-access-8xj6d\") pod \"nova-cell1-conductor-db-sync-x2knq\" (UID: \"e6bc42a1-2444-47ab-8922-267ae995d2cc\") " pod="openstack/nova-cell1-conductor-db-sync-x2knq" Oct 14 13:38:23.479934 master-2 kubenswrapper[4762]: I1014 13:38:23.479875 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 14 13:38:23.486743 master-2 kubenswrapper[4762]: W1014 13:38:23.486677 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a37f47c_bdf2_4fc0_89a6_622a690a7a41.slice/crio-d5de9068b3772adbdeee5272078590424cbc4f6bdbaebb5f7ed07df02ee4baf6 WatchSource:0}: Error finding container d5de9068b3772adbdeee5272078590424cbc4f6bdbaebb5f7ed07df02ee4baf6: Status 404 returned error can't find the container with id d5de9068b3772adbdeee5272078590424cbc4f6bdbaebb5f7ed07df02ee4baf6 Oct 14 13:38:23.528748 master-2 kubenswrapper[4762]: I1014 13:38:23.528671 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-x2knq" Oct 14 13:38:23.647237 master-2 kubenswrapper[4762]: I1014 13:38:23.647186 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 14 13:38:23.723284 master-2 kubenswrapper[4762]: I1014 13:38:23.721637 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"734b4493-b9c4-457d-a3d2-9d751679cd45","Type":"ContainerStarted","Data":"27a8c9c706a539b4ce2e55e7f67be3108d48dbff3e6f0ace74560bb1bcd9f957"} Oct 14 13:38:23.725384 master-2 kubenswrapper[4762]: I1014 13:38:23.725348 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-99b9d" event={"ID":"5ea4e416-c6ee-4940-a9f8-2b2265d16336","Type":"ContainerStarted","Data":"641aabad2340dab2ce0051c53eef6dcf2a4b14b69c84d35539c817f74acfb1ac"} Oct 14 13:38:23.725384 master-2 kubenswrapper[4762]: I1014 13:38:23.725386 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-99b9d" event={"ID":"5ea4e416-c6ee-4940-a9f8-2b2265d16336","Type":"ContainerStarted","Data":"af63048068679994190947a5cd01aeaa22ebfefa4c7d5d56391d2e2f5115d6e2"} Oct 14 13:38:23.728529 master-2 kubenswrapper[4762]: I1014 13:38:23.727992 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7a37f47c-bdf2-4fc0-89a6-622a690a7a41","Type":"ContainerStarted","Data":"d5de9068b3772adbdeee5272078590424cbc4f6bdbaebb5f7ed07df02ee4baf6"} Oct 14 13:38:23.741271 master-2 kubenswrapper[4762]: I1014 13:38:23.741008 4762 generic.go:334] "Generic (PLEG): container finished" podID="88e8aee3-10b8-4420-bacc-83d4d9e9e205" containerID="dc7715cef141a98b2ad6e2582578b153414d01e201310b6d25f0c662df4f5a0d" exitCode=0 Oct 14 13:38:23.741271 master-2 kubenswrapper[4762]: I1014 13:38:23.741113 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-pkrrv" 
event={"ID":"88e8aee3-10b8-4420-bacc-83d4d9e9e205","Type":"ContainerDied","Data":"dc7715cef141a98b2ad6e2582578b153414d01e201310b6d25f0c662df4f5a0d"} Oct 14 13:38:23.745915 master-2 kubenswrapper[4762]: I1014 13:38:23.745828 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b7a04057-57ba-4fce-9a2a-f09e7333ddd9","Type":"ContainerStarted","Data":"75d8f3ded3fc825a61198aaa67a506bfde480ac8f25f1d79f648b0577ccfe985"} Oct 14 13:38:23.768743 master-2 kubenswrapper[4762]: I1014 13:38:23.768629 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-99b9d" podStartSLOduration=2.768600209 podStartE2EDuration="2.768600209s" podCreationTimestamp="2025-10-14 13:38:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:38:23.762201619 +0000 UTC m=+1933.006360808" watchObservedRunningTime="2025-10-14 13:38:23.768600209 +0000 UTC m=+1933.012759378" Oct 14 13:38:23.782026 master-2 kubenswrapper[4762]: I1014 13:38:23.781972 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cd59f759-z4zsj"] Oct 14 13:38:24.011879 master-2 kubenswrapper[4762]: W1014 13:38:24.011835 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6bc42a1_2444_47ab_8922_267ae995d2cc.slice/crio-852ae745ecbf2a896257ab82a8dbcbc381e475c351ea3980de4021a7d7496a70 WatchSource:0}: Error finding container 852ae745ecbf2a896257ab82a8dbcbc381e475c351ea3980de4021a7d7496a70: Status 404 returned error can't find the container with id 852ae745ecbf2a896257ab82a8dbcbc381e475c351ea3980de4021a7d7496a70 Oct 14 13:38:24.014815 master-2 kubenswrapper[4762]: I1014 13:38:24.014779 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-x2knq"] Oct 14 13:38:24.756955 master-2 kubenswrapper[4762]: I1014 13:38:24.756871 4762 generic.go:334] "Generic (PLEG): container finished" podID="7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a" containerID="66ffe6d4008981f5a6d3d682e9122f4a407083db4cf80d404e86dedf6b914ad1" exitCode=0 Oct 14 13:38:24.756955 master-2 kubenswrapper[4762]: I1014 13:38:24.756955 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cd59f759-z4zsj" event={"ID":"7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a","Type":"ContainerDied","Data":"66ffe6d4008981f5a6d3d682e9122f4a407083db4cf80d404e86dedf6b914ad1"} Oct 14 13:38:24.757292 master-2 kubenswrapper[4762]: I1014 13:38:24.756982 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cd59f759-z4zsj" event={"ID":"7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a","Type":"ContainerStarted","Data":"3a822dde7cff55ffa8d510ac7b5544a360075a584e1aebb4e5037c12501f063b"} Oct 14 13:38:24.758919 master-2 kubenswrapper[4762]: I1014 13:38:24.758850 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-x2knq" event={"ID":"e6bc42a1-2444-47ab-8922-267ae995d2cc","Type":"ContainerStarted","Data":"e9867cecc44d38717730048ad6937f254f9da5ca955e38dc5fa42ae8488d1044"} Oct 14 13:38:24.758919 master-2 kubenswrapper[4762]: I1014 13:38:24.758901 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-x2knq" event={"ID":"e6bc42a1-2444-47ab-8922-267ae995d2cc","Type":"ContainerStarted","Data":"852ae745ecbf2a896257ab82a8dbcbc381e475c351ea3980de4021a7d7496a70"} Oct 14 
13:38:26.026308 master-2 kubenswrapper[4762]: I1014 13:38:26.026216 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-pkrrv" Oct 14 13:38:26.105203 master-2 kubenswrapper[4762]: I1014 13:38:26.102302 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-x2knq" podStartSLOduration=3.102280104 podStartE2EDuration="3.102280104s" podCreationTimestamp="2025-10-14 13:38:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:38:24.815496807 +0000 UTC m=+1934.059655976" watchObservedRunningTime="2025-10-14 13:38:26.102280104 +0000 UTC m=+1935.346439263" Oct 14 13:38:26.133638 master-2 kubenswrapper[4762]: I1014 13:38:26.133471 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88e8aee3-10b8-4420-bacc-83d4d9e9e205-combined-ca-bundle\") pod \"88e8aee3-10b8-4420-bacc-83d4d9e9e205\" (UID: \"88e8aee3-10b8-4420-bacc-83d4d9e9e205\") " Oct 14 13:38:26.133638 master-2 kubenswrapper[4762]: I1014 13:38:26.133552 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88e8aee3-10b8-4420-bacc-83d4d9e9e205-config-data\") pod \"88e8aee3-10b8-4420-bacc-83d4d9e9e205\" (UID: \"88e8aee3-10b8-4420-bacc-83d4d9e9e205\") " Oct 14 13:38:26.133928 master-2 kubenswrapper[4762]: I1014 13:38:26.133827 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zz6jb\" (UniqueName: \"kubernetes.io/projected/88e8aee3-10b8-4420-bacc-83d4d9e9e205-kube-api-access-zz6jb\") pod \"88e8aee3-10b8-4420-bacc-83d4d9e9e205\" (UID: \"88e8aee3-10b8-4420-bacc-83d4d9e9e205\") " Oct 14 13:38:26.133984 master-2 kubenswrapper[4762]: I1014 13:38:26.133934 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88e8aee3-10b8-4420-bacc-83d4d9e9e205-scripts\") pod \"88e8aee3-10b8-4420-bacc-83d4d9e9e205\" (UID: \"88e8aee3-10b8-4420-bacc-83d4d9e9e205\") " Oct 14 13:38:26.141647 master-2 kubenswrapper[4762]: I1014 13:38:26.141594 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88e8aee3-10b8-4420-bacc-83d4d9e9e205-kube-api-access-zz6jb" (OuterVolumeSpecName: "kube-api-access-zz6jb") pod "88e8aee3-10b8-4420-bacc-83d4d9e9e205" (UID: "88e8aee3-10b8-4420-bacc-83d4d9e9e205"). InnerVolumeSpecName "kube-api-access-zz6jb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:38:26.141759 master-2 kubenswrapper[4762]: I1014 13:38:26.141722 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88e8aee3-10b8-4420-bacc-83d4d9e9e205-scripts" (OuterVolumeSpecName: "scripts") pod "88e8aee3-10b8-4420-bacc-83d4d9e9e205" (UID: "88e8aee3-10b8-4420-bacc-83d4d9e9e205"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:38:26.173690 master-2 kubenswrapper[4762]: I1014 13:38:26.173594 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88e8aee3-10b8-4420-bacc-83d4d9e9e205-config-data" (OuterVolumeSpecName: "config-data") pod "88e8aee3-10b8-4420-bacc-83d4d9e9e205" (UID: "88e8aee3-10b8-4420-bacc-83d4d9e9e205"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:38:26.236692 master-2 kubenswrapper[4762]: I1014 13:38:26.236620 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zz6jb\" (UniqueName: \"kubernetes.io/projected/88e8aee3-10b8-4420-bacc-83d4d9e9e205-kube-api-access-zz6jb\") on node \"master-2\" DevicePath \"\"" Oct 14 13:38:26.236692 master-2 kubenswrapper[4762]: I1014 13:38:26.236683 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88e8aee3-10b8-4420-bacc-83d4d9e9e205-scripts\") on node \"master-2\" DevicePath \"\"" Oct 14 13:38:26.236692 master-2 kubenswrapper[4762]: I1014 13:38:26.236694 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88e8aee3-10b8-4420-bacc-83d4d9e9e205-config-data\") on node \"master-2\" DevicePath \"\"" Oct 14 13:38:26.246305 master-2 kubenswrapper[4762]: I1014 13:38:26.246238 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88e8aee3-10b8-4420-bacc-83d4d9e9e205-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88e8aee3-10b8-4420-bacc-83d4d9e9e205" (UID: "88e8aee3-10b8-4420-bacc-83d4d9e9e205"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:38:26.338985 master-2 kubenswrapper[4762]: I1014 13:38:26.338925 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88e8aee3-10b8-4420-bacc-83d4d9e9e205-combined-ca-bundle\") on node \"master-2\" DevicePath \"\"" Oct 14 13:38:26.780373 master-2 kubenswrapper[4762]: I1014 13:38:26.780322 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"734b4493-b9c4-457d-a3d2-9d751679cd45","Type":"ContainerStarted","Data":"cfdb64ed1e3c2b65b2aab169475a3176355a9f3b82ef697ed384382bc4f9b8d8"} Oct 14 13:38:26.783650 master-2 kubenswrapper[4762]: I1014 13:38:26.783609 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7a37f47c-bdf2-4fc0-89a6-622a690a7a41","Type":"ContainerStarted","Data":"fd38518a8d913bd9973e5fad2f3a437b01d5f23e0b0f1fffbdd8b8b2c14b7876"} Oct 14 13:38:26.783650 master-2 kubenswrapper[4762]: I1014 13:38:26.783641 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7a37f47c-bdf2-4fc0-89a6-622a690a7a41","Type":"ContainerStarted","Data":"962acd3ff0b6ad48e4b9d2b9df75fe1ea84024e2516befa75202ec1546a88f25"} Oct 14 13:38:26.785852 master-2 kubenswrapper[4762]: I1014 13:38:26.785821 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-pkrrv" event={"ID":"88e8aee3-10b8-4420-bacc-83d4d9e9e205","Type":"ContainerDied","Data":"d9cb86e947d610415355c33f44993a1541156a4c29ef109f377c9d03f1e97836"} Oct 14 13:38:26.785852 master-2 kubenswrapper[4762]: I1014 13:38:26.785846 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9cb86e947d610415355c33f44993a1541156a4c29ef109f377c9d03f1e97836" Oct 14 13:38:26.786025 master-2 kubenswrapper[4762]: I1014 13:38:26.785890 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-pkrrv" Oct 14 13:38:26.788821 master-2 kubenswrapper[4762]: I1014 13:38:26.788787 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b7a04057-57ba-4fce-9a2a-f09e7333ddd9","Type":"ContainerStarted","Data":"37f58c4430902f7cc815a2535a71deb9ecb0506e716fc8179301f3bac2def25f"} Oct 14 13:38:26.788821 master-2 kubenswrapper[4762]: I1014 13:38:26.788813 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b7a04057-57ba-4fce-9a2a-f09e7333ddd9","Type":"ContainerStarted","Data":"dc764dd0b9815f8afba97ac6c1fa5f82a8f91bdd751d70c869bf014f9115ce83"} Oct 14 13:38:26.792835 master-2 kubenswrapper[4762]: I1014 13:38:26.792802 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cd59f759-z4zsj" event={"ID":"7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a","Type":"ContainerStarted","Data":"e649a3fec1dc85953f46e25f000d9eab0e7648dd98e9d3e9e8accbeaefab5ce9"} Oct 14 13:38:26.793346 master-2 kubenswrapper[4762]: I1014 13:38:26.793321 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6cd59f759-z4zsj" Oct 14 13:38:27.055255 master-2 kubenswrapper[4762]: I1014 13:38:27.054460 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.401592402 podStartE2EDuration="5.054438098s" podCreationTimestamp="2025-10-14 13:38:22 +0000 UTC" firstStartedPulling="2025-10-14 13:38:23.298859844 +0000 UTC m=+1932.543018993" lastFinishedPulling="2025-10-14 13:38:25.95170553 +0000 UTC m=+1935.195864689" observedRunningTime="2025-10-14 13:38:26.840731777 +0000 UTC m=+1936.084890966" watchObservedRunningTime="2025-10-14 13:38:27.054438098 +0000 UTC m=+1936.298597267" Oct 14 13:38:27.060792 master-2 kubenswrapper[4762]: I1014 13:38:27.060422 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6cd59f759-z4zsj" podStartSLOduration=5.060402654 podStartE2EDuration="5.060402654s" podCreationTimestamp="2025-10-14 13:38:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:38:26.882296692 +0000 UTC m=+1936.126455861" watchObservedRunningTime="2025-10-14 13:38:27.060402654 +0000 UTC m=+1936.304561813" Oct 14 13:38:27.484587 master-2 kubenswrapper[4762]: I1014 13:38:27.484503 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.172666213 podStartE2EDuration="5.484476046s" podCreationTimestamp="2025-10-14 13:38:22 +0000 UTC" firstStartedPulling="2025-10-14 13:38:23.642859589 +0000 UTC m=+1932.887018748" lastFinishedPulling="2025-10-14 13:38:25.954669422 +0000 UTC m=+1935.198828581" observedRunningTime="2025-10-14 13:38:27.38388743 +0000 UTC m=+1936.628046619" watchObservedRunningTime="2025-10-14 13:38:27.484476046 +0000 UTC m=+1936.728635215" Oct 14 13:38:27.491142 master-2 kubenswrapper[4762]: I1014 13:38:27.491025 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.024275415 podStartE2EDuration="5.491003359s" podCreationTimestamp="2025-10-14 13:38:22 +0000 UTC" firstStartedPulling="2025-10-14 13:38:23.488561948 +0000 UTC m=+1932.732721107" lastFinishedPulling="2025-10-14 13:38:25.955289892 +0000 UTC m=+1935.199449051" observedRunningTime="2025-10-14 13:38:27.484672722 +0000 UTC 
m=+1936.728831921" watchObservedRunningTime="2025-10-14 13:38:27.491003359 +0000 UTC m=+1936.735162518" Oct 14 13:38:27.719750 master-2 kubenswrapper[4762]: I1014 13:38:27.719682 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 14 13:38:28.083846 master-2 kubenswrapper[4762]: I1014 13:38:28.083782 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 14 13:38:28.084593 master-2 kubenswrapper[4762]: I1014 13:38:28.084575 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 14 13:38:28.171238 master-2 kubenswrapper[4762]: I1014 13:38:28.171082 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 14 13:38:30.831042 master-2 kubenswrapper[4762]: I1014 13:38:30.830988 4762 generic.go:334] "Generic (PLEG): container finished" podID="5ea4e416-c6ee-4940-a9f8-2b2265d16336" containerID="641aabad2340dab2ce0051c53eef6dcf2a4b14b69c84d35539c817f74acfb1ac" exitCode=0 Oct 14 13:38:30.831604 master-2 kubenswrapper[4762]: I1014 13:38:30.831045 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-99b9d" event={"ID":"5ea4e416-c6ee-4940-a9f8-2b2265d16336","Type":"ContainerDied","Data":"641aabad2340dab2ce0051c53eef6dcf2a4b14b69c84d35539c817f74acfb1ac"} Oct 14 13:38:32.310697 master-2 kubenswrapper[4762]: I1014 13:38:32.310613 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-99b9d" Oct 14 13:38:32.415583 master-2 kubenswrapper[4762]: I1014 13:38:32.415512 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ea4e416-c6ee-4940-a9f8-2b2265d16336-scripts\") pod \"5ea4e416-c6ee-4940-a9f8-2b2265d16336\" (UID: \"5ea4e416-c6ee-4940-a9f8-2b2265d16336\") " Oct 14 13:38:32.415947 master-2 kubenswrapper[4762]: I1014 13:38:32.415692 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ea4e416-c6ee-4940-a9f8-2b2265d16336-config-data\") pod \"5ea4e416-c6ee-4940-a9f8-2b2265d16336\" (UID: \"5ea4e416-c6ee-4940-a9f8-2b2265d16336\") " Oct 14 13:38:32.415947 master-2 kubenswrapper[4762]: I1014 13:38:32.415769 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ea4e416-c6ee-4940-a9f8-2b2265d16336-combined-ca-bundle\") pod \"5ea4e416-c6ee-4940-a9f8-2b2265d16336\" (UID: \"5ea4e416-c6ee-4940-a9f8-2b2265d16336\") " Oct 14 13:38:32.415947 master-2 kubenswrapper[4762]: I1014 13:38:32.415878 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mq2qx\" (UniqueName: \"kubernetes.io/projected/5ea4e416-c6ee-4940-a9f8-2b2265d16336-kube-api-access-mq2qx\") pod \"5ea4e416-c6ee-4940-a9f8-2b2265d16336\" (UID: \"5ea4e416-c6ee-4940-a9f8-2b2265d16336\") " Oct 14 13:38:32.419187 master-2 kubenswrapper[4762]: I1014 13:38:32.418912 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ea4e416-c6ee-4940-a9f8-2b2265d16336-scripts" (OuterVolumeSpecName: "scripts") pod "5ea4e416-c6ee-4940-a9f8-2b2265d16336" (UID: "5ea4e416-c6ee-4940-a9f8-2b2265d16336"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:38:32.421481 master-2 kubenswrapper[4762]: I1014 13:38:32.421311 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ea4e416-c6ee-4940-a9f8-2b2265d16336-kube-api-access-mq2qx" (OuterVolumeSpecName: "kube-api-access-mq2qx") pod "5ea4e416-c6ee-4940-a9f8-2b2265d16336" (UID: "5ea4e416-c6ee-4940-a9f8-2b2265d16336"). InnerVolumeSpecName "kube-api-access-mq2qx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:38:32.446241 master-2 kubenswrapper[4762]: I1014 13:38:32.444718 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ea4e416-c6ee-4940-a9f8-2b2265d16336-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5ea4e416-c6ee-4940-a9f8-2b2265d16336" (UID: "5ea4e416-c6ee-4940-a9f8-2b2265d16336"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:38:32.447086 master-2 kubenswrapper[4762]: I1014 13:38:32.446996 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ea4e416-c6ee-4940-a9f8-2b2265d16336-config-data" (OuterVolumeSpecName: "config-data") pod "5ea4e416-c6ee-4940-a9f8-2b2265d16336" (UID: "5ea4e416-c6ee-4940-a9f8-2b2265d16336"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:38:32.518630 master-2 kubenswrapper[4762]: I1014 13:38:32.518550 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ea4e416-c6ee-4940-a9f8-2b2265d16336-scripts\") on node \"master-2\" DevicePath \"\"" Oct 14 13:38:32.518630 master-2 kubenswrapper[4762]: I1014 13:38:32.518619 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ea4e416-c6ee-4940-a9f8-2b2265d16336-config-data\") on node \"master-2\" DevicePath \"\"" Oct 14 13:38:32.518630 master-2 kubenswrapper[4762]: I1014 13:38:32.518633 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ea4e416-c6ee-4940-a9f8-2b2265d16336-combined-ca-bundle\") on node \"master-2\" DevicePath \"\"" Oct 14 13:38:32.518892 master-2 kubenswrapper[4762]: I1014 13:38:32.518650 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mq2qx\" (UniqueName: \"kubernetes.io/projected/5ea4e416-c6ee-4940-a9f8-2b2265d16336-kube-api-access-mq2qx\") on node \"master-2\" DevicePath \"\"" Oct 14 13:38:32.720912 master-2 kubenswrapper[4762]: I1014 13:38:32.720729 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 14 13:38:32.751681 master-2 kubenswrapper[4762]: I1014 13:38:32.751611 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 14 13:38:32.855648 master-2 kubenswrapper[4762]: I1014 13:38:32.855510 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-99b9d" event={"ID":"5ea4e416-c6ee-4940-a9f8-2b2265d16336","Type":"ContainerDied","Data":"af63048068679994190947a5cd01aeaa22ebfefa4c7d5d56391d2e2f5115d6e2"} Oct 14 13:38:32.855648 master-2 kubenswrapper[4762]: I1014 13:38:32.855600 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af63048068679994190947a5cd01aeaa22ebfefa4c7d5d56391d2e2f5115d6e2" Oct 14 13:38:32.855648 master-2 kubenswrapper[4762]: I1014 
13:38:32.855536 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-99b9d" Oct 14 13:38:32.891021 master-2 kubenswrapper[4762]: I1014 13:38:32.890534 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 14 13:38:32.968729 master-2 kubenswrapper[4762]: I1014 13:38:32.968658 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 14 13:38:32.968729 master-2 kubenswrapper[4762]: I1014 13:38:32.968739 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 14 13:38:33.083119 master-2 kubenswrapper[4762]: I1014 13:38:33.082919 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 14 13:38:33.083119 master-2 kubenswrapper[4762]: I1014 13:38:33.082996 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 14 13:38:33.257682 master-2 kubenswrapper[4762]: I1014 13:38:33.257331 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6cd59f759-z4zsj" Oct 14 13:38:34.051921 master-2 kubenswrapper[4762]: I1014 13:38:34.051828 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7a37f47c-bdf2-4fc0-89a6-622a690a7a41" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.129.0.158:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 14 13:38:34.051921 master-2 kubenswrapper[4762]: I1014 13:38:34.051901 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7a37f47c-bdf2-4fc0-89a6-622a690a7a41" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.129.0.158:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 14 13:38:34.166568 master-2 kubenswrapper[4762]: I1014 13:38:34.166479 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b7a04057-57ba-4fce-9a2a-f09e7333ddd9" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.129.0.160:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 14 13:38:34.166815 master-2 kubenswrapper[4762]: I1014 13:38:34.166622 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b7a04057-57ba-4fce-9a2a-f09e7333ddd9" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.129.0.160:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 14 13:38:35.025519 master-2 kubenswrapper[4762]: I1014 13:38:35.025428 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:38:35.026075 master-2 kubenswrapper[4762]: I1014 13:38:35.026016 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="364fb498-e482-4488-b302-8b668bb4fb78" containerName="sg-core" containerID="cri-o://d4017f995a637d5baacc824546ad928b6c6dfa0ef44e3b42f88ca291da34adda" gracePeriod=30 Oct 14 13:38:35.026128 master-2 kubenswrapper[4762]: I1014 13:38:35.025942 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="364fb498-e482-4488-b302-8b668bb4fb78" containerName="ceilometer-central-agent" 
containerID="cri-o://f1c6957e3e062d50a2f42dfc4899afb5efded9dd14f3801d9009d4debc099f75" gracePeriod=30 Oct 14 13:38:35.026188 master-2 kubenswrapper[4762]: I1014 13:38:35.025985 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="364fb498-e482-4488-b302-8b668bb4fb78" containerName="proxy-httpd" containerID="cri-o://0bfd4c7343ab9a855f6f15c30eefa517488377ca4a7350727bb7802ba0641eb0" gracePeriod=30 Oct 14 13:38:35.026188 master-2 kubenswrapper[4762]: I1014 13:38:35.026011 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="364fb498-e482-4488-b302-8b668bb4fb78" containerName="ceilometer-notification-agent" containerID="cri-o://90944bb017ded1a54628a46aefd1f24715e71d45e563d4af427ce21c50d505e5" gracePeriod=30 Oct 14 13:38:35.894672 master-2 kubenswrapper[4762]: I1014 13:38:35.894604 4762 generic.go:334] "Generic (PLEG): container finished" podID="364fb498-e482-4488-b302-8b668bb4fb78" containerID="0bfd4c7343ab9a855f6f15c30eefa517488377ca4a7350727bb7802ba0641eb0" exitCode=0 Oct 14 13:38:35.894672 master-2 kubenswrapper[4762]: I1014 13:38:35.894657 4762 generic.go:334] "Generic (PLEG): container finished" podID="364fb498-e482-4488-b302-8b668bb4fb78" containerID="d4017f995a637d5baacc824546ad928b6c6dfa0ef44e3b42f88ca291da34adda" exitCode=2 Oct 14 13:38:35.894672 master-2 kubenswrapper[4762]: I1014 13:38:35.894678 4762 generic.go:334] "Generic (PLEG): container finished" podID="364fb498-e482-4488-b302-8b668bb4fb78" containerID="f1c6957e3e062d50a2f42dfc4899afb5efded9dd14f3801d9009d4debc099f75" exitCode=0 Oct 14 13:38:35.895665 master-2 kubenswrapper[4762]: I1014 13:38:35.894707 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"364fb498-e482-4488-b302-8b668bb4fb78","Type":"ContainerDied","Data":"0bfd4c7343ab9a855f6f15c30eefa517488377ca4a7350727bb7802ba0641eb0"} Oct 14 13:38:35.895665 master-2 kubenswrapper[4762]: I1014 13:38:35.894744 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"364fb498-e482-4488-b302-8b668bb4fb78","Type":"ContainerDied","Data":"d4017f995a637d5baacc824546ad928b6c6dfa0ef44e3b42f88ca291da34adda"} Oct 14 13:38:35.895665 master-2 kubenswrapper[4762]: I1014 13:38:35.894764 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"364fb498-e482-4488-b302-8b668bb4fb78","Type":"ContainerDied","Data":"f1c6957e3e062d50a2f42dfc4899afb5efded9dd14f3801d9009d4debc099f75"} Oct 14 13:38:36.915072 master-2 kubenswrapper[4762]: I1014 13:38:36.913492 4762 generic.go:334] "Generic (PLEG): container finished" podID="e6bc42a1-2444-47ab-8922-267ae995d2cc" containerID="e9867cecc44d38717730048ad6937f254f9da5ca955e38dc5fa42ae8488d1044" exitCode=0 Oct 14 13:38:36.915072 master-2 kubenswrapper[4762]: I1014 13:38:36.913546 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-x2knq" event={"ID":"e6bc42a1-2444-47ab-8922-267ae995d2cc","Type":"ContainerDied","Data":"e9867cecc44d38717730048ad6937f254f9da5ca955e38dc5fa42ae8488d1044"} Oct 14 13:38:37.473609 master-2 kubenswrapper[4762]: I1014 13:38:37.473006 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:38:37.569705 master-2 kubenswrapper[4762]: I1014 13:38:37.569593 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cj2md\" (UniqueName: \"kubernetes.io/projected/364fb498-e482-4488-b302-8b668bb4fb78-kube-api-access-cj2md\") pod \"364fb498-e482-4488-b302-8b668bb4fb78\" (UID: \"364fb498-e482-4488-b302-8b668bb4fb78\") " Oct 14 13:38:37.569705 master-2 kubenswrapper[4762]: I1014 13:38:37.569641 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/364fb498-e482-4488-b302-8b668bb4fb78-combined-ca-bundle\") pod \"364fb498-e482-4488-b302-8b668bb4fb78\" (UID: \"364fb498-e482-4488-b302-8b668bb4fb78\") " Oct 14 13:38:37.569974 master-2 kubenswrapper[4762]: I1014 13:38:37.569722 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/364fb498-e482-4488-b302-8b668bb4fb78-run-httpd\") pod \"364fb498-e482-4488-b302-8b668bb4fb78\" (UID: \"364fb498-e482-4488-b302-8b668bb4fb78\") " Oct 14 13:38:37.569974 master-2 kubenswrapper[4762]: I1014 13:38:37.569838 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/364fb498-e482-4488-b302-8b668bb4fb78-sg-core-conf-yaml\") pod \"364fb498-e482-4488-b302-8b668bb4fb78\" (UID: \"364fb498-e482-4488-b302-8b668bb4fb78\") " Oct 14 13:38:37.569974 master-2 kubenswrapper[4762]: I1014 13:38:37.569882 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/364fb498-e482-4488-b302-8b668bb4fb78-config-data\") pod \"364fb498-e482-4488-b302-8b668bb4fb78\" (UID: \"364fb498-e482-4488-b302-8b668bb4fb78\") " Oct 14 13:38:37.569974 master-2 kubenswrapper[4762]: I1014 13:38:37.569927 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/364fb498-e482-4488-b302-8b668bb4fb78-log-httpd\") pod \"364fb498-e482-4488-b302-8b668bb4fb78\" (UID: \"364fb498-e482-4488-b302-8b668bb4fb78\") " Oct 14 13:38:37.569974 master-2 kubenswrapper[4762]: I1014 13:38:37.569942 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/364fb498-e482-4488-b302-8b668bb4fb78-scripts\") pod \"364fb498-e482-4488-b302-8b668bb4fb78\" (UID: \"364fb498-e482-4488-b302-8b668bb4fb78\") " Oct 14 13:38:37.570265 master-2 kubenswrapper[4762]: I1014 13:38:37.570172 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/364fb498-e482-4488-b302-8b668bb4fb78-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "364fb498-e482-4488-b302-8b668bb4fb78" (UID: "364fb498-e482-4488-b302-8b668bb4fb78"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:38:37.570728 master-2 kubenswrapper[4762]: I1014 13:38:37.570689 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/364fb498-e482-4488-b302-8b668bb4fb78-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "364fb498-e482-4488-b302-8b668bb4fb78" (UID: "364fb498-e482-4488-b302-8b668bb4fb78"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:38:37.570933 master-2 kubenswrapper[4762]: I1014 13:38:37.570792 4762 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/364fb498-e482-4488-b302-8b668bb4fb78-run-httpd\") on node \"master-2\" DevicePath \"\"" Oct 14 13:38:37.573341 master-2 kubenswrapper[4762]: I1014 13:38:37.573312 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/364fb498-e482-4488-b302-8b668bb4fb78-scripts" (OuterVolumeSpecName: "scripts") pod "364fb498-e482-4488-b302-8b668bb4fb78" (UID: "364fb498-e482-4488-b302-8b668bb4fb78"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:38:37.573535 master-2 kubenswrapper[4762]: I1014 13:38:37.573421 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/364fb498-e482-4488-b302-8b668bb4fb78-kube-api-access-cj2md" (OuterVolumeSpecName: "kube-api-access-cj2md") pod "364fb498-e482-4488-b302-8b668bb4fb78" (UID: "364fb498-e482-4488-b302-8b668bb4fb78"). InnerVolumeSpecName "kube-api-access-cj2md". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:38:37.601792 master-2 kubenswrapper[4762]: I1014 13:38:37.601737 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/364fb498-e482-4488-b302-8b668bb4fb78-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "364fb498-e482-4488-b302-8b668bb4fb78" (UID: "364fb498-e482-4488-b302-8b668bb4fb78"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:38:37.656440 master-2 kubenswrapper[4762]: I1014 13:38:37.656358 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/364fb498-e482-4488-b302-8b668bb4fb78-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "364fb498-e482-4488-b302-8b668bb4fb78" (UID: "364fb498-e482-4488-b302-8b668bb4fb78"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:38:37.663457 master-2 kubenswrapper[4762]: I1014 13:38:37.663379 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/364fb498-e482-4488-b302-8b668bb4fb78-config-data" (OuterVolumeSpecName: "config-data") pod "364fb498-e482-4488-b302-8b668bb4fb78" (UID: "364fb498-e482-4488-b302-8b668bb4fb78"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:38:37.673350 master-2 kubenswrapper[4762]: I1014 13:38:37.673282 4762 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/364fb498-e482-4488-b302-8b668bb4fb78-sg-core-conf-yaml\") on node \"master-2\" DevicePath \"\"" Oct 14 13:38:37.673350 master-2 kubenswrapper[4762]: I1014 13:38:37.673336 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/364fb498-e482-4488-b302-8b668bb4fb78-config-data\") on node \"master-2\" DevicePath \"\"" Oct 14 13:38:37.673350 master-2 kubenswrapper[4762]: I1014 13:38:37.673349 4762 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/364fb498-e482-4488-b302-8b668bb4fb78-log-httpd\") on node \"master-2\" DevicePath \"\"" Oct 14 13:38:37.673506 master-2 kubenswrapper[4762]: I1014 13:38:37.673364 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/364fb498-e482-4488-b302-8b668bb4fb78-scripts\") on node \"master-2\" DevicePath \"\"" Oct 14 13:38:37.673506 master-2 kubenswrapper[4762]: I1014 13:38:37.673380 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cj2md\" (UniqueName: \"kubernetes.io/projected/364fb498-e482-4488-b302-8b668bb4fb78-kube-api-access-cj2md\") on node \"master-2\" DevicePath \"\"" Oct 14 13:38:37.673506 master-2 kubenswrapper[4762]: I1014 13:38:37.673396 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/364fb498-e482-4488-b302-8b668bb4fb78-combined-ca-bundle\") on node \"master-2\" DevicePath \"\"" Oct 14 13:38:37.927578 master-2 kubenswrapper[4762]: I1014 13:38:37.927506 4762 generic.go:334] "Generic (PLEG): container finished" podID="364fb498-e482-4488-b302-8b668bb4fb78" containerID="90944bb017ded1a54628a46aefd1f24715e71d45e563d4af427ce21c50d505e5" exitCode=0 Oct 14 13:38:37.928205 master-2 kubenswrapper[4762]: I1014 13:38:37.927631 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"364fb498-e482-4488-b302-8b668bb4fb78","Type":"ContainerDied","Data":"90944bb017ded1a54628a46aefd1f24715e71d45e563d4af427ce21c50d505e5"} Oct 14 13:38:37.928205 master-2 kubenswrapper[4762]: I1014 13:38:37.927720 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"364fb498-e482-4488-b302-8b668bb4fb78","Type":"ContainerDied","Data":"f967381e3fa964e9c65cc88275d8bf2fbc1dd80bcad5aeaeb93eb4070078215d"} Oct 14 13:38:37.928205 master-2 kubenswrapper[4762]: I1014 13:38:37.927752 4762 scope.go:117] "RemoveContainer" containerID="0bfd4c7343ab9a855f6f15c30eefa517488377ca4a7350727bb7802ba0641eb0" Oct 14 13:38:37.928205 master-2 kubenswrapper[4762]: I1014 13:38:37.928076 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:38:37.958992 master-2 kubenswrapper[4762]: I1014 13:38:37.958822 4762 scope.go:117] "RemoveContainer" containerID="d4017f995a637d5baacc824546ad928b6c6dfa0ef44e3b42f88ca291da34adda" Oct 14 13:38:37.979700 master-2 kubenswrapper[4762]: I1014 13:38:37.979062 4762 scope.go:117] "RemoveContainer" containerID="90944bb017ded1a54628a46aefd1f24715e71d45e563d4af427ce21c50d505e5" Oct 14 13:38:38.005284 master-2 kubenswrapper[4762]: I1014 13:38:38.005172 4762 scope.go:117] "RemoveContainer" containerID="f1c6957e3e062d50a2f42dfc4899afb5efded9dd14f3801d9009d4debc099f75" Oct 14 13:38:38.024084 master-2 kubenswrapper[4762]: I1014 13:38:38.024011 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:38:38.027406 master-2 kubenswrapper[4762]: I1014 13:38:38.027358 4762 scope.go:117] "RemoveContainer" containerID="0bfd4c7343ab9a855f6f15c30eefa517488377ca4a7350727bb7802ba0641eb0" Oct 14 13:38:38.032188 master-2 kubenswrapper[4762]: E1014 13:38:38.030359 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bfd4c7343ab9a855f6f15c30eefa517488377ca4a7350727bb7802ba0641eb0\": container with ID starting with 0bfd4c7343ab9a855f6f15c30eefa517488377ca4a7350727bb7802ba0641eb0 not found: ID does not exist" containerID="0bfd4c7343ab9a855f6f15c30eefa517488377ca4a7350727bb7802ba0641eb0" Oct 14 13:38:38.032188 master-2 kubenswrapper[4762]: I1014 13:38:38.030407 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bfd4c7343ab9a855f6f15c30eefa517488377ca4a7350727bb7802ba0641eb0"} err="failed to get container status \"0bfd4c7343ab9a855f6f15c30eefa517488377ca4a7350727bb7802ba0641eb0\": rpc error: code = NotFound desc = could not find container \"0bfd4c7343ab9a855f6f15c30eefa517488377ca4a7350727bb7802ba0641eb0\": container with ID starting with 0bfd4c7343ab9a855f6f15c30eefa517488377ca4a7350727bb7802ba0641eb0 not found: ID does not exist" Oct 14 13:38:38.032188 master-2 kubenswrapper[4762]: I1014 13:38:38.030436 4762 scope.go:117] "RemoveContainer" containerID="d4017f995a637d5baacc824546ad928b6c6dfa0ef44e3b42f88ca291da34adda" Oct 14 13:38:38.032188 master-2 kubenswrapper[4762]: E1014 13:38:38.031741 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4017f995a637d5baacc824546ad928b6c6dfa0ef44e3b42f88ca291da34adda\": container with ID starting with d4017f995a637d5baacc824546ad928b6c6dfa0ef44e3b42f88ca291da34adda not found: ID does not exist" containerID="d4017f995a637d5baacc824546ad928b6c6dfa0ef44e3b42f88ca291da34adda" Oct 14 13:38:38.032188 master-2 kubenswrapper[4762]: I1014 13:38:38.031802 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4017f995a637d5baacc824546ad928b6c6dfa0ef44e3b42f88ca291da34adda"} err="failed to get container status \"d4017f995a637d5baacc824546ad928b6c6dfa0ef44e3b42f88ca291da34adda\": rpc error: code = NotFound desc = could not find container \"d4017f995a637d5baacc824546ad928b6c6dfa0ef44e3b42f88ca291da34adda\": container with ID starting with d4017f995a637d5baacc824546ad928b6c6dfa0ef44e3b42f88ca291da34adda not found: ID does not exist" Oct 14 13:38:38.032188 master-2 kubenswrapper[4762]: I1014 13:38:38.031840 4762 scope.go:117] "RemoveContainer" containerID="90944bb017ded1a54628a46aefd1f24715e71d45e563d4af427ce21c50d505e5" Oct 14 13:38:38.032648 
master-2 kubenswrapper[4762]: E1014 13:38:38.032624 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90944bb017ded1a54628a46aefd1f24715e71d45e563d4af427ce21c50d505e5\": container with ID starting with 90944bb017ded1a54628a46aefd1f24715e71d45e563d4af427ce21c50d505e5 not found: ID does not exist" containerID="90944bb017ded1a54628a46aefd1f24715e71d45e563d4af427ce21c50d505e5" Oct 14 13:38:38.032695 master-2 kubenswrapper[4762]: I1014 13:38:38.032654 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90944bb017ded1a54628a46aefd1f24715e71d45e563d4af427ce21c50d505e5"} err="failed to get container status \"90944bb017ded1a54628a46aefd1f24715e71d45e563d4af427ce21c50d505e5\": rpc error: code = NotFound desc = could not find container \"90944bb017ded1a54628a46aefd1f24715e71d45e563d4af427ce21c50d505e5\": container with ID starting with 90944bb017ded1a54628a46aefd1f24715e71d45e563d4af427ce21c50d505e5 not found: ID does not exist" Oct 14 13:38:38.032695 master-2 kubenswrapper[4762]: I1014 13:38:38.032674 4762 scope.go:117] "RemoveContainer" containerID="f1c6957e3e062d50a2f42dfc4899afb5efded9dd14f3801d9009d4debc099f75" Oct 14 13:38:38.040000 master-2 kubenswrapper[4762]: E1014 13:38:38.032991 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1c6957e3e062d50a2f42dfc4899afb5efded9dd14f3801d9009d4debc099f75\": container with ID starting with f1c6957e3e062d50a2f42dfc4899afb5efded9dd14f3801d9009d4debc099f75 not found: ID does not exist" containerID="f1c6957e3e062d50a2f42dfc4899afb5efded9dd14f3801d9009d4debc099f75" Oct 14 13:38:38.040000 master-2 kubenswrapper[4762]: I1014 13:38:38.033015 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1c6957e3e062d50a2f42dfc4899afb5efded9dd14f3801d9009d4debc099f75"} err="failed to get container status \"f1c6957e3e062d50a2f42dfc4899afb5efded9dd14f3801d9009d4debc099f75\": rpc error: code = NotFound desc = could not find container \"f1c6957e3e062d50a2f42dfc4899afb5efded9dd14f3801d9009d4debc099f75\": container with ID starting with f1c6957e3e062d50a2f42dfc4899afb5efded9dd14f3801d9009d4debc099f75 not found: ID does not exist" Oct 14 13:38:38.040000 master-2 kubenswrapper[4762]: I1014 13:38:38.038566 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:38:38.079123 master-2 kubenswrapper[4762]: I1014 13:38:38.079014 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:38:38.079577 master-2 kubenswrapper[4762]: E1014 13:38:38.079545 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="364fb498-e482-4488-b302-8b668bb4fb78" containerName="sg-core" Oct 14 13:38:38.079577 master-2 kubenswrapper[4762]: I1014 13:38:38.079571 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="364fb498-e482-4488-b302-8b668bb4fb78" containerName="sg-core" Oct 14 13:38:38.079667 master-2 kubenswrapper[4762]: E1014 13:38:38.079602 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="364fb498-e482-4488-b302-8b668bb4fb78" containerName="proxy-httpd" Oct 14 13:38:38.079667 master-2 kubenswrapper[4762]: I1014 13:38:38.079614 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="364fb498-e482-4488-b302-8b668bb4fb78" containerName="proxy-httpd" Oct 14 13:38:38.079667 master-2 kubenswrapper[4762]: E1014 13:38:38.079629 4762 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="364fb498-e482-4488-b302-8b668bb4fb78" containerName="ceilometer-central-agent" Oct 14 13:38:38.079667 master-2 kubenswrapper[4762]: I1014 13:38:38.079638 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="364fb498-e482-4488-b302-8b668bb4fb78" containerName="ceilometer-central-agent" Oct 14 13:38:38.079667 master-2 kubenswrapper[4762]: E1014 13:38:38.079654 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ea4e416-c6ee-4940-a9f8-2b2265d16336" containerName="nova-manage" Oct 14 13:38:38.079667 master-2 kubenswrapper[4762]: I1014 13:38:38.079663 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ea4e416-c6ee-4940-a9f8-2b2265d16336" containerName="nova-manage" Oct 14 13:38:38.080410 master-2 kubenswrapper[4762]: E1014 13:38:38.079687 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88e8aee3-10b8-4420-bacc-83d4d9e9e205" containerName="aodh-db-sync" Oct 14 13:38:38.080410 master-2 kubenswrapper[4762]: I1014 13:38:38.079696 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="88e8aee3-10b8-4420-bacc-83d4d9e9e205" containerName="aodh-db-sync" Oct 14 13:38:38.080410 master-2 kubenswrapper[4762]: E1014 13:38:38.079721 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="364fb498-e482-4488-b302-8b668bb4fb78" containerName="ceilometer-notification-agent" Oct 14 13:38:38.080410 master-2 kubenswrapper[4762]: I1014 13:38:38.079730 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="364fb498-e482-4488-b302-8b668bb4fb78" containerName="ceilometer-notification-agent" Oct 14 13:38:38.080410 master-2 kubenswrapper[4762]: I1014 13:38:38.079892 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="88e8aee3-10b8-4420-bacc-83d4d9e9e205" containerName="aodh-db-sync" Oct 14 13:38:38.080410 master-2 kubenswrapper[4762]: I1014 13:38:38.079905 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="364fb498-e482-4488-b302-8b668bb4fb78" containerName="ceilometer-central-agent" Oct 14 13:38:38.080410 master-2 kubenswrapper[4762]: I1014 13:38:38.079929 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="364fb498-e482-4488-b302-8b668bb4fb78" containerName="ceilometer-notification-agent" Oct 14 13:38:38.080410 master-2 kubenswrapper[4762]: I1014 13:38:38.079942 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ea4e416-c6ee-4940-a9f8-2b2265d16336" containerName="nova-manage" Oct 14 13:38:38.080410 master-2 kubenswrapper[4762]: I1014 13:38:38.079956 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="364fb498-e482-4488-b302-8b668bb4fb78" containerName="sg-core" Oct 14 13:38:38.080410 master-2 kubenswrapper[4762]: I1014 13:38:38.079970 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="364fb498-e482-4488-b302-8b668bb4fb78" containerName="proxy-httpd" Oct 14 13:38:38.081768 master-2 kubenswrapper[4762]: I1014 13:38:38.081733 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:38:38.084200 master-2 kubenswrapper[4762]: I1014 13:38:38.084106 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 14 13:38:38.084491 master-2 kubenswrapper[4762]: I1014 13:38:38.084284 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 14 13:38:38.133570 master-2 kubenswrapper[4762]: I1014 13:38:38.133506 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:38:38.184197 master-2 kubenswrapper[4762]: I1014 13:38:38.184005 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cca7317-83f8-4c69-b0cb-98d26c22bfab-config-data\") pod \"ceilometer-0\" (UID: \"1cca7317-83f8-4c69-b0cb-98d26c22bfab\") " pod="openstack/ceilometer-0" Oct 14 13:38:38.184197 master-2 kubenswrapper[4762]: I1014 13:38:38.184060 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1cca7317-83f8-4c69-b0cb-98d26c22bfab-run-httpd\") pod \"ceilometer-0\" (UID: \"1cca7317-83f8-4c69-b0cb-98d26c22bfab\") " pod="openstack/ceilometer-0" Oct 14 13:38:38.184197 master-2 kubenswrapper[4762]: I1014 13:38:38.184081 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1cca7317-83f8-4c69-b0cb-98d26c22bfab-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1cca7317-83f8-4c69-b0cb-98d26c22bfab\") " pod="openstack/ceilometer-0" Oct 14 13:38:38.184197 master-2 kubenswrapper[4762]: I1014 13:38:38.184171 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1cca7317-83f8-4c69-b0cb-98d26c22bfab-scripts\") pod \"ceilometer-0\" (UID: \"1cca7317-83f8-4c69-b0cb-98d26c22bfab\") " pod="openstack/ceilometer-0" Oct 14 13:38:38.194602 master-2 kubenswrapper[4762]: I1014 13:38:38.193408 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cca7317-83f8-4c69-b0cb-98d26c22bfab-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1cca7317-83f8-4c69-b0cb-98d26c22bfab\") " pod="openstack/ceilometer-0" Oct 14 13:38:38.200595 master-2 kubenswrapper[4762]: I1014 13:38:38.198486 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf9tw\" (UniqueName: \"kubernetes.io/projected/1cca7317-83f8-4c69-b0cb-98d26c22bfab-kube-api-access-zf9tw\") pod \"ceilometer-0\" (UID: \"1cca7317-83f8-4c69-b0cb-98d26c22bfab\") " pod="openstack/ceilometer-0" Oct 14 13:38:38.200595 master-2 kubenswrapper[4762]: I1014 13:38:38.198675 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1cca7317-83f8-4c69-b0cb-98d26c22bfab-log-httpd\") pod \"ceilometer-0\" (UID: \"1cca7317-83f8-4c69-b0cb-98d26c22bfab\") " pod="openstack/ceilometer-0" Oct 14 13:38:38.300477 master-2 kubenswrapper[4762]: I1014 13:38:38.300069 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cca7317-83f8-4c69-b0cb-98d26c22bfab-combined-ca-bundle\") pod \"ceilometer-0\" 
(UID: \"1cca7317-83f8-4c69-b0cb-98d26c22bfab\") " pod="openstack/ceilometer-0" Oct 14 13:38:38.300477 master-2 kubenswrapper[4762]: I1014 13:38:38.300137 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zf9tw\" (UniqueName: \"kubernetes.io/projected/1cca7317-83f8-4c69-b0cb-98d26c22bfab-kube-api-access-zf9tw\") pod \"ceilometer-0\" (UID: \"1cca7317-83f8-4c69-b0cb-98d26c22bfab\") " pod="openstack/ceilometer-0" Oct 14 13:38:38.300477 master-2 kubenswrapper[4762]: I1014 13:38:38.300201 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1cca7317-83f8-4c69-b0cb-98d26c22bfab-log-httpd\") pod \"ceilometer-0\" (UID: \"1cca7317-83f8-4c69-b0cb-98d26c22bfab\") " pod="openstack/ceilometer-0" Oct 14 13:38:38.300477 master-2 kubenswrapper[4762]: I1014 13:38:38.300259 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cca7317-83f8-4c69-b0cb-98d26c22bfab-config-data\") pod \"ceilometer-0\" (UID: \"1cca7317-83f8-4c69-b0cb-98d26c22bfab\") " pod="openstack/ceilometer-0" Oct 14 13:38:38.300477 master-2 kubenswrapper[4762]: I1014 13:38:38.300280 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1cca7317-83f8-4c69-b0cb-98d26c22bfab-run-httpd\") pod \"ceilometer-0\" (UID: \"1cca7317-83f8-4c69-b0cb-98d26c22bfab\") " pod="openstack/ceilometer-0" Oct 14 13:38:38.300477 master-2 kubenswrapper[4762]: I1014 13:38:38.300296 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1cca7317-83f8-4c69-b0cb-98d26c22bfab-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1cca7317-83f8-4c69-b0cb-98d26c22bfab\") " pod="openstack/ceilometer-0" Oct 14 13:38:38.300477 master-2 kubenswrapper[4762]: I1014 13:38:38.300359 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1cca7317-83f8-4c69-b0cb-98d26c22bfab-scripts\") pod \"ceilometer-0\" (UID: \"1cca7317-83f8-4c69-b0cb-98d26c22bfab\") " pod="openstack/ceilometer-0" Oct 14 13:38:38.301092 master-2 kubenswrapper[4762]: I1014 13:38:38.301048 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1cca7317-83f8-4c69-b0cb-98d26c22bfab-run-httpd\") pod \"ceilometer-0\" (UID: \"1cca7317-83f8-4c69-b0cb-98d26c22bfab\") " pod="openstack/ceilometer-0" Oct 14 13:38:38.301750 master-2 kubenswrapper[4762]: I1014 13:38:38.301685 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1cca7317-83f8-4c69-b0cb-98d26c22bfab-log-httpd\") pod \"ceilometer-0\" (UID: \"1cca7317-83f8-4c69-b0cb-98d26c22bfab\") " pod="openstack/ceilometer-0" Oct 14 13:38:38.304115 master-2 kubenswrapper[4762]: I1014 13:38:38.304045 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1cca7317-83f8-4c69-b0cb-98d26c22bfab-scripts\") pod \"ceilometer-0\" (UID: \"1cca7317-83f8-4c69-b0cb-98d26c22bfab\") " pod="openstack/ceilometer-0" Oct 14 13:38:38.304865 master-2 kubenswrapper[4762]: I1014 13:38:38.304823 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/1cca7317-83f8-4c69-b0cb-98d26c22bfab-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1cca7317-83f8-4c69-b0cb-98d26c22bfab\") " pod="openstack/ceilometer-0" Oct 14 13:38:38.310626 master-2 kubenswrapper[4762]: I1014 13:38:38.310554 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cca7317-83f8-4c69-b0cb-98d26c22bfab-config-data\") pod \"ceilometer-0\" (UID: \"1cca7317-83f8-4c69-b0cb-98d26c22bfab\") " pod="openstack/ceilometer-0" Oct 14 13:38:38.313632 master-2 kubenswrapper[4762]: I1014 13:38:38.313502 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cca7317-83f8-4c69-b0cb-98d26c22bfab-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1cca7317-83f8-4c69-b0cb-98d26c22bfab\") " pod="openstack/ceilometer-0" Oct 14 13:38:38.331596 master-2 kubenswrapper[4762]: I1014 13:38:38.331531 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf9tw\" (UniqueName: \"kubernetes.io/projected/1cca7317-83f8-4c69-b0cb-98d26c22bfab-kube-api-access-zf9tw\") pod \"ceilometer-0\" (UID: \"1cca7317-83f8-4c69-b0cb-98d26c22bfab\") " pod="openstack/ceilometer-0" Oct 14 13:38:38.409808 master-2 kubenswrapper[4762]: I1014 13:38:38.409739 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:38:38.413271 master-2 kubenswrapper[4762]: I1014 13:38:38.413232 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-x2knq" Oct 14 13:38:38.504745 master-2 kubenswrapper[4762]: I1014 13:38:38.504669 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6bc42a1-2444-47ab-8922-267ae995d2cc-scripts\") pod \"e6bc42a1-2444-47ab-8922-267ae995d2cc\" (UID: \"e6bc42a1-2444-47ab-8922-267ae995d2cc\") " Oct 14 13:38:38.504745 master-2 kubenswrapper[4762]: I1014 13:38:38.504751 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xj6d\" (UniqueName: \"kubernetes.io/projected/e6bc42a1-2444-47ab-8922-267ae995d2cc-kube-api-access-8xj6d\") pod \"e6bc42a1-2444-47ab-8922-267ae995d2cc\" (UID: \"e6bc42a1-2444-47ab-8922-267ae995d2cc\") " Oct 14 13:38:38.505223 master-2 kubenswrapper[4762]: I1014 13:38:38.504847 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6bc42a1-2444-47ab-8922-267ae995d2cc-combined-ca-bundle\") pod \"e6bc42a1-2444-47ab-8922-267ae995d2cc\" (UID: \"e6bc42a1-2444-47ab-8922-267ae995d2cc\") " Oct 14 13:38:38.505223 master-2 kubenswrapper[4762]: I1014 13:38:38.504885 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6bc42a1-2444-47ab-8922-267ae995d2cc-config-data\") pod \"e6bc42a1-2444-47ab-8922-267ae995d2cc\" (UID: \"e6bc42a1-2444-47ab-8922-267ae995d2cc\") " Oct 14 13:38:38.508889 master-2 kubenswrapper[4762]: I1014 13:38:38.508828 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6bc42a1-2444-47ab-8922-267ae995d2cc-scripts" (OuterVolumeSpecName: "scripts") pod "e6bc42a1-2444-47ab-8922-267ae995d2cc" (UID: "e6bc42a1-2444-47ab-8922-267ae995d2cc"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:38:38.509647 master-2 kubenswrapper[4762]: I1014 13:38:38.509596 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6bc42a1-2444-47ab-8922-267ae995d2cc-kube-api-access-8xj6d" (OuterVolumeSpecName: "kube-api-access-8xj6d") pod "e6bc42a1-2444-47ab-8922-267ae995d2cc" (UID: "e6bc42a1-2444-47ab-8922-267ae995d2cc"). InnerVolumeSpecName "kube-api-access-8xj6d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:38:38.568323 master-2 kubenswrapper[4762]: I1014 13:38:38.568257 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6bc42a1-2444-47ab-8922-267ae995d2cc-config-data" (OuterVolumeSpecName: "config-data") pod "e6bc42a1-2444-47ab-8922-267ae995d2cc" (UID: "e6bc42a1-2444-47ab-8922-267ae995d2cc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:38:38.572349 master-2 kubenswrapper[4762]: I1014 13:38:38.572295 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6bc42a1-2444-47ab-8922-267ae995d2cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e6bc42a1-2444-47ab-8922-267ae995d2cc" (UID: "e6bc42a1-2444-47ab-8922-267ae995d2cc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:38:38.607046 master-2 kubenswrapper[4762]: I1014 13:38:38.606672 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6bc42a1-2444-47ab-8922-267ae995d2cc-combined-ca-bundle\") on node \"master-2\" DevicePath \"\"" Oct 14 13:38:38.607046 master-2 kubenswrapper[4762]: I1014 13:38:38.606712 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6bc42a1-2444-47ab-8922-267ae995d2cc-config-data\") on node \"master-2\" DevicePath \"\"" Oct 14 13:38:38.607046 master-2 kubenswrapper[4762]: I1014 13:38:38.606723 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6bc42a1-2444-47ab-8922-267ae995d2cc-scripts\") on node \"master-2\" DevicePath \"\"" Oct 14 13:38:38.607046 master-2 kubenswrapper[4762]: I1014 13:38:38.606732 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xj6d\" (UniqueName: \"kubernetes.io/projected/e6bc42a1-2444-47ab-8922-267ae995d2cc-kube-api-access-8xj6d\") on node \"master-2\" DevicePath \"\"" Oct 14 13:38:38.916516 master-2 kubenswrapper[4762]: I1014 13:38:38.916449 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:38:38.923076 master-2 kubenswrapper[4762]: W1014 13:38:38.923022 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1cca7317_83f8_4c69_b0cb_98d26c22bfab.slice/crio-2657d1e20bfa22b168491561bcda6899c2de006af9b05b25b21029b1f9bbbe47 WatchSource:0}: Error finding container 2657d1e20bfa22b168491561bcda6899c2de006af9b05b25b21029b1f9bbbe47: Status 404 returned error can't find the container with id 2657d1e20bfa22b168491561bcda6899c2de006af9b05b25b21029b1f9bbbe47 Oct 14 13:38:38.938287 master-2 kubenswrapper[4762]: I1014 13:38:38.938227 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"1cca7317-83f8-4c69-b0cb-98d26c22bfab","Type":"ContainerStarted","Data":"2657d1e20bfa22b168491561bcda6899c2de006af9b05b25b21029b1f9bbbe47"} Oct 14 13:38:38.940387 master-2 kubenswrapper[4762]: I1014 13:38:38.940336 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-x2knq" event={"ID":"e6bc42a1-2444-47ab-8922-267ae995d2cc","Type":"ContainerDied","Data":"852ae745ecbf2a896257ab82a8dbcbc381e475c351ea3980de4021a7d7496a70"} Oct 14 13:38:38.940471 master-2 kubenswrapper[4762]: I1014 13:38:38.940395 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="852ae745ecbf2a896257ab82a8dbcbc381e475c351ea3980de4021a7d7496a70" Oct 14 13:38:38.940471 master-2 kubenswrapper[4762]: I1014 13:38:38.940422 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-x2knq" Oct 14 13:38:39.134726 master-2 kubenswrapper[4762]: I1014 13:38:39.132783 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 14 13:38:39.134726 master-2 kubenswrapper[4762]: E1014 13:38:39.133267 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6bc42a1-2444-47ab-8922-267ae995d2cc" containerName="nova-cell1-conductor-db-sync" Oct 14 13:38:39.134726 master-2 kubenswrapper[4762]: I1014 13:38:39.133281 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6bc42a1-2444-47ab-8922-267ae995d2cc" containerName="nova-cell1-conductor-db-sync" Oct 14 13:38:39.134726 master-2 kubenswrapper[4762]: I1014 13:38:39.133431 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6bc42a1-2444-47ab-8922-267ae995d2cc" containerName="nova-cell1-conductor-db-sync" Oct 14 13:38:39.134726 master-2 kubenswrapper[4762]: I1014 13:38:39.134255 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 14 13:38:39.138372 master-2 kubenswrapper[4762]: I1014 13:38:39.137759 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 14 13:38:39.152662 master-2 kubenswrapper[4762]: I1014 13:38:39.152613 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 14 13:38:39.222770 master-2 kubenswrapper[4762]: I1014 13:38:39.222636 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/074f1ad1-048a-466b-9b3c-1ae617f5176d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"074f1ad1-048a-466b-9b3c-1ae617f5176d\") " pod="openstack/nova-cell1-conductor-0" Oct 14 13:38:39.223039 master-2 kubenswrapper[4762]: I1014 13:38:39.223018 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/074f1ad1-048a-466b-9b3c-1ae617f5176d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"074f1ad1-048a-466b-9b3c-1ae617f5176d\") " pod="openstack/nova-cell1-conductor-0" Oct 14 13:38:39.223204 master-2 kubenswrapper[4762]: I1014 13:38:39.223187 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwh8j\" (UniqueName: \"kubernetes.io/projected/074f1ad1-048a-466b-9b3c-1ae617f5176d-kube-api-access-cwh8j\") pod \"nova-cell1-conductor-0\" (UID: \"074f1ad1-048a-466b-9b3c-1ae617f5176d\") " pod="openstack/nova-cell1-conductor-0" Oct 14 13:38:39.328277 master-2 kubenswrapper[4762]: I1014 13:38:39.328231 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/074f1ad1-048a-466b-9b3c-1ae617f5176d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"074f1ad1-048a-466b-9b3c-1ae617f5176d\") " pod="openstack/nova-cell1-conductor-0" Oct 14 13:38:39.328576 master-2 kubenswrapper[4762]: I1014 13:38:39.328563 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/074f1ad1-048a-466b-9b3c-1ae617f5176d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"074f1ad1-048a-466b-9b3c-1ae617f5176d\") " pod="openstack/nova-cell1-conductor-0" Oct 14 13:38:39.328694 master-2 kubenswrapper[4762]: I1014 13:38:39.328683 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwh8j\" (UniqueName: \"kubernetes.io/projected/074f1ad1-048a-466b-9b3c-1ae617f5176d-kube-api-access-cwh8j\") pod \"nova-cell1-conductor-0\" (UID: \"074f1ad1-048a-466b-9b3c-1ae617f5176d\") " pod="openstack/nova-cell1-conductor-0" Oct 14 13:38:39.338428 master-2 kubenswrapper[4762]: I1014 13:38:39.338389 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/074f1ad1-048a-466b-9b3c-1ae617f5176d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"074f1ad1-048a-466b-9b3c-1ae617f5176d\") " pod="openstack/nova-cell1-conductor-0" Oct 14 13:38:39.345175 master-2 kubenswrapper[4762]: I1014 13:38:39.340503 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/074f1ad1-048a-466b-9b3c-1ae617f5176d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"074f1ad1-048a-466b-9b3c-1ae617f5176d\") " 
pod="openstack/nova-cell1-conductor-0" Oct 14 13:38:39.353179 master-2 kubenswrapper[4762]: I1014 13:38:39.352422 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwh8j\" (UniqueName: \"kubernetes.io/projected/074f1ad1-048a-466b-9b3c-1ae617f5176d-kube-api-access-cwh8j\") pod \"nova-cell1-conductor-0\" (UID: \"074f1ad1-048a-466b-9b3c-1ae617f5176d\") " pod="openstack/nova-cell1-conductor-0" Oct 14 13:38:39.464324 master-2 kubenswrapper[4762]: I1014 13:38:39.464252 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 14 13:38:39.567940 master-2 kubenswrapper[4762]: I1014 13:38:39.567878 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="364fb498-e482-4488-b302-8b668bb4fb78" path="/var/lib/kubelet/pods/364fb498-e482-4488-b302-8b668bb4fb78/volumes" Oct 14 13:38:39.917700 master-2 kubenswrapper[4762]: I1014 13:38:39.917650 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 14 13:38:39.922618 master-2 kubenswrapper[4762]: W1014 13:38:39.922547 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod074f1ad1_048a_466b_9b3c_1ae617f5176d.slice/crio-57725724bce950720e923ca5845b6777a43f79e204d8b3ef500bf58d158525c7 WatchSource:0}: Error finding container 57725724bce950720e923ca5845b6777a43f79e204d8b3ef500bf58d158525c7: Status 404 returned error can't find the container with id 57725724bce950720e923ca5845b6777a43f79e204d8b3ef500bf58d158525c7 Oct 14 13:38:39.953615 master-2 kubenswrapper[4762]: I1014 13:38:39.953570 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"074f1ad1-048a-466b-9b3c-1ae617f5176d","Type":"ContainerStarted","Data":"57725724bce950720e923ca5845b6777a43f79e204d8b3ef500bf58d158525c7"} Oct 14 13:38:39.956682 master-2 kubenswrapper[4762]: I1014 13:38:39.956647 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1cca7317-83f8-4c69-b0cb-98d26c22bfab","Type":"ContainerStarted","Data":"8c74b9cb4e81e06c7bc3e87e417dff14d054f3fa9608dd3c961dcd984f6606d4"} Oct 14 13:38:40.968421 master-2 kubenswrapper[4762]: I1014 13:38:40.968339 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"074f1ad1-048a-466b-9b3c-1ae617f5176d","Type":"ContainerStarted","Data":"a9d05352c0461ec61252a697e5c0f6c8b122d0c212864ebcab32bc8f380f7e70"} Oct 14 13:38:40.968935 master-2 kubenswrapper[4762]: I1014 13:38:40.968451 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 14 13:38:40.970624 master-2 kubenswrapper[4762]: I1014 13:38:40.970576 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1cca7317-83f8-4c69-b0cb-98d26c22bfab","Type":"ContainerStarted","Data":"40c06e905ae4f03fbd29a5fd8e06fbeab9259713185bad5de2e65956d5103423"} Oct 14 13:38:41.008791 master-2 kubenswrapper[4762]: I1014 13:38:41.008697 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.008681068 podStartE2EDuration="2.008681068s" podCreationTimestamp="2025-10-14 13:38:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:38:41.002509485 +0000 UTC m=+1950.246668644" 
watchObservedRunningTime="2025-10-14 13:38:41.008681068 +0000 UTC m=+1950.252840227" Oct 14 13:38:41.982543 master-2 kubenswrapper[4762]: I1014 13:38:41.982463 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1cca7317-83f8-4c69-b0cb-98d26c22bfab","Type":"ContainerStarted","Data":"dbdb877b21f62762542d2f5e2033d37870ecfba58c2ff369e4a2ffb0757e4cbd"} Oct 14 13:38:42.055529 master-2 kubenswrapper[4762]: I1014 13:38:42.055468 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:38:42.973280 master-2 kubenswrapper[4762]: I1014 13:38:42.973214 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 14 13:38:42.974297 master-2 kubenswrapper[4762]: I1014 13:38:42.974250 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 14 13:38:42.976180 master-2 kubenswrapper[4762]: I1014 13:38:42.976090 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 14 13:38:42.980035 master-2 kubenswrapper[4762]: I1014 13:38:42.979978 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 14 13:38:42.990191 master-2 kubenswrapper[4762]: I1014 13:38:42.990071 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 14 13:38:42.995186 master-2 kubenswrapper[4762]: I1014 13:38:42.995122 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 14 13:38:43.086514 master-2 kubenswrapper[4762]: I1014 13:38:43.086417 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 14 13:38:43.087226 master-2 kubenswrapper[4762]: I1014 13:38:43.087133 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 14 13:38:43.090419 master-2 kubenswrapper[4762]: I1014 13:38:43.090355 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 14 13:38:43.441733 master-2 kubenswrapper[4762]: I1014 13:38:43.441673 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 14 13:38:43.443136 master-2 kubenswrapper[4762]: I1014 13:38:43.443102 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 14 13:38:43.447174 master-2 kubenswrapper[4762]: I1014 13:38:43.447130 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Oct 14 13:38:43.447508 master-2 kubenswrapper[4762]: I1014 13:38:43.447484 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Oct 14 13:38:43.457355 master-2 kubenswrapper[4762]: I1014 13:38:43.457280 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 14 13:38:43.562021 master-2 kubenswrapper[4762]: I1014 13:38:43.561803 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/5702de44-2386-4dad-8365-e2130d21adaf-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"5702de44-2386-4dad-8365-e2130d21adaf\") " pod="openstack/kube-state-metrics-0" Oct 14 13:38:43.562021 master-2 kubenswrapper[4762]: I1014 13:38:43.561975 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5702de44-2386-4dad-8365-e2130d21adaf-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"5702de44-2386-4dad-8365-e2130d21adaf\") " pod="openstack/kube-state-metrics-0" Oct 14 13:38:43.562615 master-2 kubenswrapper[4762]: I1014 13:38:43.562040 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jl9wl\" (UniqueName: \"kubernetes.io/projected/5702de44-2386-4dad-8365-e2130d21adaf-kube-api-access-jl9wl\") pod \"kube-state-metrics-0\" (UID: \"5702de44-2386-4dad-8365-e2130d21adaf\") " pod="openstack/kube-state-metrics-0" Oct 14 13:38:43.562615 master-2 kubenswrapper[4762]: I1014 13:38:43.562089 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/5702de44-2386-4dad-8365-e2130d21adaf-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"5702de44-2386-4dad-8365-e2130d21adaf\") " pod="openstack/kube-state-metrics-0" Oct 14 13:38:43.664018 master-2 kubenswrapper[4762]: I1014 13:38:43.663958 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/5702de44-2386-4dad-8365-e2130d21adaf-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"5702de44-2386-4dad-8365-e2130d21adaf\") " pod="openstack/kube-state-metrics-0" Oct 14 13:38:43.664194 master-2 kubenswrapper[4762]: I1014 13:38:43.664097 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5702de44-2386-4dad-8365-e2130d21adaf-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"5702de44-2386-4dad-8365-e2130d21adaf\") " pod="openstack/kube-state-metrics-0" Oct 14 13:38:43.664194 master-2 kubenswrapper[4762]: I1014 13:38:43.664181 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jl9wl\" (UniqueName: \"kubernetes.io/projected/5702de44-2386-4dad-8365-e2130d21adaf-kube-api-access-jl9wl\") pod \"kube-state-metrics-0\" (UID: \"5702de44-2386-4dad-8365-e2130d21adaf\") " pod="openstack/kube-state-metrics-0" Oct 14 13:38:43.664290 master-2 kubenswrapper[4762]: I1014 
13:38:43.664232 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/5702de44-2386-4dad-8365-e2130d21adaf-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"5702de44-2386-4dad-8365-e2130d21adaf\") " pod="openstack/kube-state-metrics-0" Oct 14 13:38:43.668361 master-2 kubenswrapper[4762]: I1014 13:38:43.668304 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5702de44-2386-4dad-8365-e2130d21adaf-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"5702de44-2386-4dad-8365-e2130d21adaf\") " pod="openstack/kube-state-metrics-0" Oct 14 13:38:43.669809 master-2 kubenswrapper[4762]: I1014 13:38:43.669778 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/5702de44-2386-4dad-8365-e2130d21adaf-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"5702de44-2386-4dad-8365-e2130d21adaf\") " pod="openstack/kube-state-metrics-0" Oct 14 13:38:43.673213 master-2 kubenswrapper[4762]: I1014 13:38:43.673172 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/5702de44-2386-4dad-8365-e2130d21adaf-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"5702de44-2386-4dad-8365-e2130d21adaf\") " pod="openstack/kube-state-metrics-0" Oct 14 13:38:43.700952 master-2 kubenswrapper[4762]: I1014 13:38:43.700890 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jl9wl\" (UniqueName: \"kubernetes.io/projected/5702de44-2386-4dad-8365-e2130d21adaf-kube-api-access-jl9wl\") pod \"kube-state-metrics-0\" (UID: \"5702de44-2386-4dad-8365-e2130d21adaf\") " pod="openstack/kube-state-metrics-0" Oct 14 13:38:43.771592 master-2 kubenswrapper[4762]: I1014 13:38:43.767569 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 14 13:38:44.005191 master-2 kubenswrapper[4762]: I1014 13:38:44.002719 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1cca7317-83f8-4c69-b0cb-98d26c22bfab","Type":"ContainerStarted","Data":"8516f98c675a5c32a65d2bd44f9940e17737ff631c3fde28736f4f99a9b0c785"} Oct 14 13:38:44.005191 master-2 kubenswrapper[4762]: I1014 13:38:44.003226 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1cca7317-83f8-4c69-b0cb-98d26c22bfab" containerName="ceilometer-central-agent" containerID="cri-o://8c74b9cb4e81e06c7bc3e87e417dff14d054f3fa9608dd3c961dcd984f6606d4" gracePeriod=30 Oct 14 13:38:44.005191 master-2 kubenswrapper[4762]: I1014 13:38:44.003347 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1cca7317-83f8-4c69-b0cb-98d26c22bfab" containerName="proxy-httpd" containerID="cri-o://8516f98c675a5c32a65d2bd44f9940e17737ff631c3fde28736f4f99a9b0c785" gracePeriod=30 Oct 14 13:38:44.005191 master-2 kubenswrapper[4762]: I1014 13:38:44.003405 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1cca7317-83f8-4c69-b0cb-98d26c22bfab" containerName="sg-core" containerID="cri-o://dbdb877b21f62762542d2f5e2033d37870ecfba58c2ff369e4a2ffb0757e4cbd" gracePeriod=30 Oct 14 13:38:44.005191 master-2 kubenswrapper[4762]: I1014 13:38:44.003454 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1cca7317-83f8-4c69-b0cb-98d26c22bfab" containerName="ceilometer-notification-agent" containerID="cri-o://40c06e905ae4f03fbd29a5fd8e06fbeab9259713185bad5de2e65956d5103423" gracePeriod=30 Oct 14 13:38:44.006564 master-2 kubenswrapper[4762]: I1014 13:38:44.006515 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 14 13:38:44.053948 master-2 kubenswrapper[4762]: I1014 13:38:44.053870 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.3674155190000001 podStartE2EDuration="6.053848784s" podCreationTimestamp="2025-10-14 13:38:38 +0000 UTC" firstStartedPulling="2025-10-14 13:38:38.925293576 +0000 UTC m=+1948.169452745" lastFinishedPulling="2025-10-14 13:38:43.611726851 +0000 UTC m=+1952.855886010" observedRunningTime="2025-10-14 13:38:44.040387865 +0000 UTC m=+1953.284547034" watchObservedRunningTime="2025-10-14 13:38:44.053848784 +0000 UTC m=+1953.298007943" Oct 14 13:38:44.215131 master-2 kubenswrapper[4762]: I1014 13:38:44.215069 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 14 13:38:44.226041 master-2 kubenswrapper[4762]: W1014 13:38:44.225967 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5702de44_2386_4dad_8365_e2130d21adaf.slice/crio-2f8709475f36aaeb9181eafba30db92a3dc3de807e9177123946eb3254bd64b3 WatchSource:0}: Error finding container 2f8709475f36aaeb9181eafba30db92a3dc3de807e9177123946eb3254bd64b3: Status 404 returned error can't find the container with id 2f8709475f36aaeb9181eafba30db92a3dc3de807e9177123946eb3254bd64b3 Oct 14 13:38:45.018542 master-2 kubenswrapper[4762]: I1014 13:38:45.015195 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"5702de44-2386-4dad-8365-e2130d21adaf","Type":"ContainerStarted","Data":"2f8709475f36aaeb9181eafba30db92a3dc3de807e9177123946eb3254bd64b3"} Oct 14 13:38:45.019453 master-2 kubenswrapper[4762]: I1014 13:38:45.019079 4762 generic.go:334] "Generic (PLEG): container finished" podID="1cca7317-83f8-4c69-b0cb-98d26c22bfab" containerID="8516f98c675a5c32a65d2bd44f9940e17737ff631c3fde28736f4f99a9b0c785" exitCode=0 Oct 14 13:38:45.019453 master-2 kubenswrapper[4762]: I1014 13:38:45.019191 4762 generic.go:334] "Generic (PLEG): container finished" podID="1cca7317-83f8-4c69-b0cb-98d26c22bfab" containerID="dbdb877b21f62762542d2f5e2033d37870ecfba58c2ff369e4a2ffb0757e4cbd" exitCode=2 Oct 14 13:38:45.019453 master-2 kubenswrapper[4762]: I1014 13:38:45.019202 4762 generic.go:334] "Generic (PLEG): container finished" podID="1cca7317-83f8-4c69-b0cb-98d26c22bfab" containerID="40c06e905ae4f03fbd29a5fd8e06fbeab9259713185bad5de2e65956d5103423" exitCode=0 Oct 14 13:38:45.019743 master-2 kubenswrapper[4762]: I1014 13:38:45.019675 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1cca7317-83f8-4c69-b0cb-98d26c22bfab","Type":"ContainerDied","Data":"8516f98c675a5c32a65d2bd44f9940e17737ff631c3fde28736f4f99a9b0c785"} Oct 14 13:38:45.019912 master-2 kubenswrapper[4762]: I1014 13:38:45.019891 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1cca7317-83f8-4c69-b0cb-98d26c22bfab","Type":"ContainerDied","Data":"dbdb877b21f62762542d2f5e2033d37870ecfba58c2ff369e4a2ffb0757e4cbd"} Oct 14 13:38:45.020024 master-2 kubenswrapper[4762]: I1014 13:38:45.020007 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1cca7317-83f8-4c69-b0cb-98d26c22bfab","Type":"ContainerDied","Data":"40c06e905ae4f03fbd29a5fd8e06fbeab9259713185bad5de2e65956d5103423"} Oct 14 13:38:47.054099 master-2 kubenswrapper[4762]: I1014 13:38:47.054041 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5702de44-2386-4dad-8365-e2130d21adaf","Type":"ContainerStarted","Data":"66aab577e28f275461208c6549e9a64afe963d4192a20f67419e8d4929e9644b"} Oct 14 13:38:47.054862 master-2 kubenswrapper[4762]: I1014 13:38:47.054323 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 14 13:38:47.061993 master-2 kubenswrapper[4762]: I1014 13:38:47.061915 4762 generic.go:334] "Generic (PLEG): container finished" podID="1cca7317-83f8-4c69-b0cb-98d26c22bfab" containerID="8c74b9cb4e81e06c7bc3e87e417dff14d054f3fa9608dd3c961dcd984f6606d4" exitCode=0 Oct 14 13:38:47.061993 master-2 kubenswrapper[4762]: I1014 13:38:47.061978 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1cca7317-83f8-4c69-b0cb-98d26c22bfab","Type":"ContainerDied","Data":"8c74b9cb4e81e06c7bc3e87e417dff14d054f3fa9608dd3c961dcd984f6606d4"} Oct 14 13:38:47.062337 master-2 kubenswrapper[4762]: I1014 13:38:47.062019 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1cca7317-83f8-4c69-b0cb-98d26c22bfab","Type":"ContainerDied","Data":"2657d1e20bfa22b168491561bcda6899c2de006af9b05b25b21029b1f9bbbe47"} Oct 14 13:38:47.062337 master-2 kubenswrapper[4762]: I1014 13:38:47.062036 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2657d1e20bfa22b168491561bcda6899c2de006af9b05b25b21029b1f9bbbe47" Oct 14 13:38:47.085510 master-2 kubenswrapper[4762]: I1014 
13:38:47.085458 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:38:47.130587 master-2 kubenswrapper[4762]: I1014 13:38:47.130486 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.599623619 podStartE2EDuration="4.130464551s" podCreationTimestamp="2025-10-14 13:38:43 +0000 UTC" firstStartedPulling="2025-10-14 13:38:44.228329514 +0000 UTC m=+1953.472488683" lastFinishedPulling="2025-10-14 13:38:46.759170456 +0000 UTC m=+1956.003329615" observedRunningTime="2025-10-14 13:38:47.12785127 +0000 UTC m=+1956.372010439" watchObservedRunningTime="2025-10-14 13:38:47.130464551 +0000 UTC m=+1956.374623710" Oct 14 13:38:47.154760 master-2 kubenswrapper[4762]: I1014 13:38:47.154693 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1cca7317-83f8-4c69-b0cb-98d26c22bfab-log-httpd\") pod \"1cca7317-83f8-4c69-b0cb-98d26c22bfab\" (UID: \"1cca7317-83f8-4c69-b0cb-98d26c22bfab\") " Oct 14 13:38:47.155054 master-2 kubenswrapper[4762]: I1014 13:38:47.154830 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zf9tw\" (UniqueName: \"kubernetes.io/projected/1cca7317-83f8-4c69-b0cb-98d26c22bfab-kube-api-access-zf9tw\") pod \"1cca7317-83f8-4c69-b0cb-98d26c22bfab\" (UID: \"1cca7317-83f8-4c69-b0cb-98d26c22bfab\") " Oct 14 13:38:47.155054 master-2 kubenswrapper[4762]: I1014 13:38:47.154878 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1cca7317-83f8-4c69-b0cb-98d26c22bfab-run-httpd\") pod \"1cca7317-83f8-4c69-b0cb-98d26c22bfab\" (UID: \"1cca7317-83f8-4c69-b0cb-98d26c22bfab\") " Oct 14 13:38:47.155054 master-2 kubenswrapper[4762]: I1014 13:38:47.154929 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cca7317-83f8-4c69-b0cb-98d26c22bfab-config-data\") pod \"1cca7317-83f8-4c69-b0cb-98d26c22bfab\" (UID: \"1cca7317-83f8-4c69-b0cb-98d26c22bfab\") " Oct 14 13:38:47.155054 master-2 kubenswrapper[4762]: I1014 13:38:47.155004 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cca7317-83f8-4c69-b0cb-98d26c22bfab-combined-ca-bundle\") pod \"1cca7317-83f8-4c69-b0cb-98d26c22bfab\" (UID: \"1cca7317-83f8-4c69-b0cb-98d26c22bfab\") " Oct 14 13:38:47.155331 master-2 kubenswrapper[4762]: I1014 13:38:47.155070 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1cca7317-83f8-4c69-b0cb-98d26c22bfab-sg-core-conf-yaml\") pod \"1cca7317-83f8-4c69-b0cb-98d26c22bfab\" (UID: \"1cca7317-83f8-4c69-b0cb-98d26c22bfab\") " Oct 14 13:38:47.155331 master-2 kubenswrapper[4762]: I1014 13:38:47.155121 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1cca7317-83f8-4c69-b0cb-98d26c22bfab-scripts\") pod \"1cca7317-83f8-4c69-b0cb-98d26c22bfab\" (UID: \"1cca7317-83f8-4c69-b0cb-98d26c22bfab\") " Oct 14 13:38:47.156122 master-2 kubenswrapper[4762]: I1014 13:38:47.156082 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1cca7317-83f8-4c69-b0cb-98d26c22bfab-run-httpd" (OuterVolumeSpecName: "run-httpd") 
pod "1cca7317-83f8-4c69-b0cb-98d26c22bfab" (UID: "1cca7317-83f8-4c69-b0cb-98d26c22bfab"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:38:47.157514 master-2 kubenswrapper[4762]: I1014 13:38:47.156880 4762 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1cca7317-83f8-4c69-b0cb-98d26c22bfab-run-httpd\") on node \"master-2\" DevicePath \"\"" Oct 14 13:38:47.158334 master-2 kubenswrapper[4762]: I1014 13:38:47.156207 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1cca7317-83f8-4c69-b0cb-98d26c22bfab-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1cca7317-83f8-4c69-b0cb-98d26c22bfab" (UID: "1cca7317-83f8-4c69-b0cb-98d26c22bfab"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:38:47.160538 master-2 kubenswrapper[4762]: I1014 13:38:47.160482 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cca7317-83f8-4c69-b0cb-98d26c22bfab-scripts" (OuterVolumeSpecName: "scripts") pod "1cca7317-83f8-4c69-b0cb-98d26c22bfab" (UID: "1cca7317-83f8-4c69-b0cb-98d26c22bfab"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:38:47.161145 master-2 kubenswrapper[4762]: I1014 13:38:47.161075 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cca7317-83f8-4c69-b0cb-98d26c22bfab-kube-api-access-zf9tw" (OuterVolumeSpecName: "kube-api-access-zf9tw") pod "1cca7317-83f8-4c69-b0cb-98d26c22bfab" (UID: "1cca7317-83f8-4c69-b0cb-98d26c22bfab"). InnerVolumeSpecName "kube-api-access-zf9tw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:38:47.205977 master-2 kubenswrapper[4762]: I1014 13:38:47.205908 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cca7317-83f8-4c69-b0cb-98d26c22bfab-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1cca7317-83f8-4c69-b0cb-98d26c22bfab" (UID: "1cca7317-83f8-4c69-b0cb-98d26c22bfab"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:38:47.242112 master-2 kubenswrapper[4762]: I1014 13:38:47.241989 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cca7317-83f8-4c69-b0cb-98d26c22bfab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1cca7317-83f8-4c69-b0cb-98d26c22bfab" (UID: "1cca7317-83f8-4c69-b0cb-98d26c22bfab"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:38:47.258813 master-2 kubenswrapper[4762]: I1014 13:38:47.258762 4762 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1cca7317-83f8-4c69-b0cb-98d26c22bfab-log-httpd\") on node \"master-2\" DevicePath \"\"" Oct 14 13:38:47.258813 master-2 kubenswrapper[4762]: I1014 13:38:47.258809 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zf9tw\" (UniqueName: \"kubernetes.io/projected/1cca7317-83f8-4c69-b0cb-98d26c22bfab-kube-api-access-zf9tw\") on node \"master-2\" DevicePath \"\"" Oct 14 13:38:47.259000 master-2 kubenswrapper[4762]: I1014 13:38:47.258825 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cca7317-83f8-4c69-b0cb-98d26c22bfab-combined-ca-bundle\") on node \"master-2\" DevicePath \"\"" Oct 14 13:38:47.259000 master-2 kubenswrapper[4762]: I1014 13:38:47.258838 4762 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1cca7317-83f8-4c69-b0cb-98d26c22bfab-sg-core-conf-yaml\") on node \"master-2\" DevicePath \"\"" Oct 14 13:38:47.259000 master-2 kubenswrapper[4762]: I1014 13:38:47.258849 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1cca7317-83f8-4c69-b0cb-98d26c22bfab-scripts\") on node \"master-2\" DevicePath \"\"" Oct 14 13:38:47.260819 master-2 kubenswrapper[4762]: I1014 13:38:47.260753 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cca7317-83f8-4c69-b0cb-98d26c22bfab-config-data" (OuterVolumeSpecName: "config-data") pod "1cca7317-83f8-4c69-b0cb-98d26c22bfab" (UID: "1cca7317-83f8-4c69-b0cb-98d26c22bfab"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:38:47.361175 master-2 kubenswrapper[4762]: I1014 13:38:47.361102 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cca7317-83f8-4c69-b0cb-98d26c22bfab-config-data\") on node \"master-2\" DevicePath \"\"" Oct 14 13:38:48.070063 master-2 kubenswrapper[4762]: I1014 13:38:48.070017 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:38:48.121419 master-2 kubenswrapper[4762]: I1014 13:38:48.121342 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:38:48.128281 master-2 kubenswrapper[4762]: I1014 13:38:48.128239 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:38:48.182980 master-2 kubenswrapper[4762]: I1014 13:38:48.182923 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:38:48.183253 master-2 kubenswrapper[4762]: E1014 13:38:48.183224 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cca7317-83f8-4c69-b0cb-98d26c22bfab" containerName="proxy-httpd" Oct 14 13:38:48.183253 master-2 kubenswrapper[4762]: I1014 13:38:48.183244 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cca7317-83f8-4c69-b0cb-98d26c22bfab" containerName="proxy-httpd" Oct 14 13:38:48.183982 master-2 kubenswrapper[4762]: E1014 13:38:48.183271 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cca7317-83f8-4c69-b0cb-98d26c22bfab" containerName="sg-core" Oct 14 13:38:48.183982 master-2 kubenswrapper[4762]: I1014 13:38:48.183280 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cca7317-83f8-4c69-b0cb-98d26c22bfab" containerName="sg-core" Oct 14 13:38:48.183982 master-2 kubenswrapper[4762]: E1014 13:38:48.183297 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cca7317-83f8-4c69-b0cb-98d26c22bfab" containerName="ceilometer-central-agent" Oct 14 13:38:48.183982 master-2 kubenswrapper[4762]: I1014 13:38:48.183306 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cca7317-83f8-4c69-b0cb-98d26c22bfab" containerName="ceilometer-central-agent" Oct 14 13:38:48.183982 master-2 kubenswrapper[4762]: E1014 13:38:48.183321 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cca7317-83f8-4c69-b0cb-98d26c22bfab" containerName="ceilometer-notification-agent" Oct 14 13:38:48.183982 master-2 kubenswrapper[4762]: I1014 13:38:48.183328 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cca7317-83f8-4c69-b0cb-98d26c22bfab" containerName="ceilometer-notification-agent" Oct 14 13:38:48.183982 master-2 kubenswrapper[4762]: I1014 13:38:48.183459 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cca7317-83f8-4c69-b0cb-98d26c22bfab" containerName="ceilometer-notification-agent" Oct 14 13:38:48.183982 master-2 kubenswrapper[4762]: I1014 13:38:48.183478 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cca7317-83f8-4c69-b0cb-98d26c22bfab" containerName="proxy-httpd" Oct 14 13:38:48.183982 master-2 kubenswrapper[4762]: I1014 13:38:48.183490 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cca7317-83f8-4c69-b0cb-98d26c22bfab" containerName="ceilometer-central-agent" Oct 14 13:38:48.183982 master-2 kubenswrapper[4762]: I1014 13:38:48.183503 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cca7317-83f8-4c69-b0cb-98d26c22bfab" containerName="sg-core" Oct 14 13:38:48.185168 master-2 kubenswrapper[4762]: I1014 13:38:48.185103 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:38:48.188345 master-2 kubenswrapper[4762]: I1014 13:38:48.188301 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 14 13:38:48.189748 master-2 kubenswrapper[4762]: I1014 13:38:48.189693 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 14 13:38:48.189950 master-2 kubenswrapper[4762]: I1014 13:38:48.189908 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 14 13:38:48.208029 master-2 kubenswrapper[4762]: I1014 13:38:48.207935 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:38:48.284312 master-2 kubenswrapper[4762]: I1014 13:38:48.284240 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xb58m\" (UniqueName: \"kubernetes.io/projected/74da3ec8-87c1-43a3-95b2-60577c05e565-kube-api-access-xb58m\") pod \"ceilometer-0\" (UID: \"74da3ec8-87c1-43a3-95b2-60577c05e565\") " pod="openstack/ceilometer-0" Oct 14 13:38:48.284551 master-2 kubenswrapper[4762]: I1014 13:38:48.284464 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/74da3ec8-87c1-43a3-95b2-60577c05e565-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"74da3ec8-87c1-43a3-95b2-60577c05e565\") " pod="openstack/ceilometer-0" Oct 14 13:38:48.284551 master-2 kubenswrapper[4762]: I1014 13:38:48.284506 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74da3ec8-87c1-43a3-95b2-60577c05e565-config-data\") pod \"ceilometer-0\" (UID: \"74da3ec8-87c1-43a3-95b2-60577c05e565\") " pod="openstack/ceilometer-0" Oct 14 13:38:48.284551 master-2 kubenswrapper[4762]: I1014 13:38:48.284533 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/74da3ec8-87c1-43a3-95b2-60577c05e565-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"74da3ec8-87c1-43a3-95b2-60577c05e565\") " pod="openstack/ceilometer-0" Oct 14 13:38:48.284687 master-2 kubenswrapper[4762]: I1014 13:38:48.284572 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74da3ec8-87c1-43a3-95b2-60577c05e565-log-httpd\") pod \"ceilometer-0\" (UID: \"74da3ec8-87c1-43a3-95b2-60577c05e565\") " pod="openstack/ceilometer-0" Oct 14 13:38:48.284878 master-2 kubenswrapper[4762]: I1014 13:38:48.284815 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74da3ec8-87c1-43a3-95b2-60577c05e565-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"74da3ec8-87c1-43a3-95b2-60577c05e565\") " pod="openstack/ceilometer-0" Oct 14 13:38:48.284931 master-2 kubenswrapper[4762]: I1014 13:38:48.284895 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74da3ec8-87c1-43a3-95b2-60577c05e565-scripts\") pod \"ceilometer-0\" (UID: \"74da3ec8-87c1-43a3-95b2-60577c05e565\") " pod="openstack/ceilometer-0" Oct 14 13:38:48.285013 master-2 kubenswrapper[4762]: I1014 13:38:48.284991 4762 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74da3ec8-87c1-43a3-95b2-60577c05e565-run-httpd\") pod \"ceilometer-0\" (UID: \"74da3ec8-87c1-43a3-95b2-60577c05e565\") " pod="openstack/ceilometer-0" Oct 14 13:38:48.387594 master-2 kubenswrapper[4762]: I1014 13:38:48.387470 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74da3ec8-87c1-43a3-95b2-60577c05e565-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"74da3ec8-87c1-43a3-95b2-60577c05e565\") " pod="openstack/ceilometer-0" Oct 14 13:38:48.387594 master-2 kubenswrapper[4762]: I1014 13:38:48.387570 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74da3ec8-87c1-43a3-95b2-60577c05e565-scripts\") pod \"ceilometer-0\" (UID: \"74da3ec8-87c1-43a3-95b2-60577c05e565\") " pod="openstack/ceilometer-0" Oct 14 13:38:48.387844 master-2 kubenswrapper[4762]: I1014 13:38:48.387656 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74da3ec8-87c1-43a3-95b2-60577c05e565-run-httpd\") pod \"ceilometer-0\" (UID: \"74da3ec8-87c1-43a3-95b2-60577c05e565\") " pod="openstack/ceilometer-0" Oct 14 13:38:48.387844 master-2 kubenswrapper[4762]: I1014 13:38:48.387772 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xb58m\" (UniqueName: \"kubernetes.io/projected/74da3ec8-87c1-43a3-95b2-60577c05e565-kube-api-access-xb58m\") pod \"ceilometer-0\" (UID: \"74da3ec8-87c1-43a3-95b2-60577c05e565\") " pod="openstack/ceilometer-0" Oct 14 13:38:48.387926 master-2 kubenswrapper[4762]: I1014 13:38:48.387897 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/74da3ec8-87c1-43a3-95b2-60577c05e565-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"74da3ec8-87c1-43a3-95b2-60577c05e565\") " pod="openstack/ceilometer-0" Oct 14 13:38:48.387972 master-2 kubenswrapper[4762]: I1014 13:38:48.387940 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74da3ec8-87c1-43a3-95b2-60577c05e565-config-data\") pod \"ceilometer-0\" (UID: \"74da3ec8-87c1-43a3-95b2-60577c05e565\") " pod="openstack/ceilometer-0" Oct 14 13:38:48.388007 master-2 kubenswrapper[4762]: I1014 13:38:48.387979 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/74da3ec8-87c1-43a3-95b2-60577c05e565-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"74da3ec8-87c1-43a3-95b2-60577c05e565\") " pod="openstack/ceilometer-0" Oct 14 13:38:48.388084 master-2 kubenswrapper[4762]: I1014 13:38:48.388048 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74da3ec8-87c1-43a3-95b2-60577c05e565-log-httpd\") pod \"ceilometer-0\" (UID: \"74da3ec8-87c1-43a3-95b2-60577c05e565\") " pod="openstack/ceilometer-0" Oct 14 13:38:48.388505 master-2 kubenswrapper[4762]: I1014 13:38:48.388463 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74da3ec8-87c1-43a3-95b2-60577c05e565-run-httpd\") pod \"ceilometer-0\" (UID: \"74da3ec8-87c1-43a3-95b2-60577c05e565\") 
" pod="openstack/ceilometer-0" Oct 14 13:38:48.388713 master-2 kubenswrapper[4762]: I1014 13:38:48.388632 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74da3ec8-87c1-43a3-95b2-60577c05e565-log-httpd\") pod \"ceilometer-0\" (UID: \"74da3ec8-87c1-43a3-95b2-60577c05e565\") " pod="openstack/ceilometer-0" Oct 14 13:38:48.391220 master-2 kubenswrapper[4762]: I1014 13:38:48.391186 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74da3ec8-87c1-43a3-95b2-60577c05e565-config-data\") pod \"ceilometer-0\" (UID: \"74da3ec8-87c1-43a3-95b2-60577c05e565\") " pod="openstack/ceilometer-0" Oct 14 13:38:48.391556 master-2 kubenswrapper[4762]: I1014 13:38:48.391518 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74da3ec8-87c1-43a3-95b2-60577c05e565-scripts\") pod \"ceilometer-0\" (UID: \"74da3ec8-87c1-43a3-95b2-60577c05e565\") " pod="openstack/ceilometer-0" Oct 14 13:38:48.393332 master-2 kubenswrapper[4762]: I1014 13:38:48.392961 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/74da3ec8-87c1-43a3-95b2-60577c05e565-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"74da3ec8-87c1-43a3-95b2-60577c05e565\") " pod="openstack/ceilometer-0" Oct 14 13:38:48.393753 master-2 kubenswrapper[4762]: I1014 13:38:48.393719 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74da3ec8-87c1-43a3-95b2-60577c05e565-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"74da3ec8-87c1-43a3-95b2-60577c05e565\") " pod="openstack/ceilometer-0" Oct 14 13:38:48.393890 master-2 kubenswrapper[4762]: I1014 13:38:48.393837 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/74da3ec8-87c1-43a3-95b2-60577c05e565-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"74da3ec8-87c1-43a3-95b2-60577c05e565\") " pod="openstack/ceilometer-0" Oct 14 13:38:48.407902 master-2 kubenswrapper[4762]: I1014 13:38:48.407828 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xb58m\" (UniqueName: \"kubernetes.io/projected/74da3ec8-87c1-43a3-95b2-60577c05e565-kube-api-access-xb58m\") pod \"ceilometer-0\" (UID: \"74da3ec8-87c1-43a3-95b2-60577c05e565\") " pod="openstack/ceilometer-0" Oct 14 13:38:48.516193 master-2 kubenswrapper[4762]: I1014 13:38:48.516062 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:38:49.042936 master-2 kubenswrapper[4762]: I1014 13:38:49.042866 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:38:49.056207 master-2 kubenswrapper[4762]: W1014 13:38:49.056106 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74da3ec8_87c1_43a3_95b2_60577c05e565.slice/crio-a2ed4313b234cbd6d4870cf72d6a8f1cb1e09879c347b05e772f35a85ee17c69 WatchSource:0}: Error finding container a2ed4313b234cbd6d4870cf72d6a8f1cb1e09879c347b05e772f35a85ee17c69: Status 404 returned error can't find the container with id a2ed4313b234cbd6d4870cf72d6a8f1cb1e09879c347b05e772f35a85ee17c69 Oct 14 13:38:49.087914 master-2 kubenswrapper[4762]: I1014 13:38:49.087837 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"74da3ec8-87c1-43a3-95b2-60577c05e565","Type":"ContainerStarted","Data":"a2ed4313b234cbd6d4870cf72d6a8f1cb1e09879c347b05e772f35a85ee17c69"} Oct 14 13:38:49.494422 master-2 kubenswrapper[4762]: I1014 13:38:49.494372 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 14 13:38:49.567540 master-2 kubenswrapper[4762]: I1014 13:38:49.567485 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cca7317-83f8-4c69-b0cb-98d26c22bfab" path="/var/lib/kubelet/pods/1cca7317-83f8-4c69-b0cb-98d26c22bfab/volumes" Oct 14 13:38:50.099892 master-2 kubenswrapper[4762]: I1014 13:38:50.099820 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"74da3ec8-87c1-43a3-95b2-60577c05e565","Type":"ContainerStarted","Data":"953e64de362c127b9b93a6fe8e995bf42eeee3232945aad34ad19c28a5a7e4ed"} Oct 14 13:38:51.112011 master-2 kubenswrapper[4762]: I1014 13:38:51.111828 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"74da3ec8-87c1-43a3-95b2-60577c05e565","Type":"ContainerStarted","Data":"cd373ad6e18433dbc4eecce30109f39d4cfb9171d5bc77f3472af776ce35a271"} Oct 14 13:38:51.112011 master-2 kubenswrapper[4762]: I1014 13:38:51.111882 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"74da3ec8-87c1-43a3-95b2-60577c05e565","Type":"ContainerStarted","Data":"0317d6ef00d39025096e8667bbfffbcf50450081982dd7552c13ba446e81bc3e"} Oct 14 13:38:53.133460 master-2 kubenswrapper[4762]: I1014 13:38:53.133369 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"74da3ec8-87c1-43a3-95b2-60577c05e565","Type":"ContainerStarted","Data":"4ec727fe579277d79ac2698748057f35c7d9058d3df86ba87d8712135963d1f2"} Oct 14 13:38:53.134268 master-2 kubenswrapper[4762]: I1014 13:38:53.133599 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 14 13:38:53.288802 master-2 kubenswrapper[4762]: I1014 13:38:53.284497 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.069894113 podStartE2EDuration="5.28447673s" podCreationTimestamp="2025-10-14 13:38:48 +0000 UTC" firstStartedPulling="2025-10-14 13:38:49.063859198 +0000 UTC m=+1958.308018397" lastFinishedPulling="2025-10-14 13:38:52.278441855 +0000 UTC m=+1961.522601014" observedRunningTime="2025-10-14 13:38:53.261976288 +0000 UTC m=+1962.506135487" watchObservedRunningTime="2025-10-14 13:38:53.28447673 +0000 UTC 
m=+1962.528635889" Oct 14 13:38:53.788126 master-2 kubenswrapper[4762]: I1014 13:38:53.788063 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 14 13:39:06.724037 master-2 kubenswrapper[4762]: I1014 13:39:06.723913 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 14 13:39:06.724684 master-2 kubenswrapper[4762]: I1014 13:39:06.724415 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="734b4493-b9c4-457d-a3d2-9d751679cd45" containerName="nova-scheduler-scheduler" containerID="cri-o://cfdb64ed1e3c2b65b2aab169475a3176355a9f3b82ef697ed384382bc4f9b8d8" gracePeriod=30 Oct 14 13:39:07.724261 master-2 kubenswrapper[4762]: E1014 13:39:07.724063 4762 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cfdb64ed1e3c2b65b2aab169475a3176355a9f3b82ef697ed384382bc4f9b8d8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 14 13:39:07.725713 master-2 kubenswrapper[4762]: E1014 13:39:07.725612 4762 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cfdb64ed1e3c2b65b2aab169475a3176355a9f3b82ef697ed384382bc4f9b8d8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 14 13:39:07.727207 master-2 kubenswrapper[4762]: E1014 13:39:07.727165 4762 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cfdb64ed1e3c2b65b2aab169475a3176355a9f3b82ef697ed384382bc4f9b8d8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 14 13:39:07.727207 master-2 kubenswrapper[4762]: E1014 13:39:07.727199 4762 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="734b4493-b9c4-457d-a3d2-9d751679cd45" containerName="nova-scheduler-scheduler" Oct 14 13:39:09.136212 master-2 kubenswrapper[4762]: I1014 13:39:09.135453 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:39:09.136212 master-2 kubenswrapper[4762]: I1014 13:39:09.135785 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="74da3ec8-87c1-43a3-95b2-60577c05e565" containerName="ceilometer-central-agent" containerID="cri-o://953e64de362c127b9b93a6fe8e995bf42eeee3232945aad34ad19c28a5a7e4ed" gracePeriod=30 Oct 14 13:39:09.136212 master-2 kubenswrapper[4762]: I1014 13:39:09.135939 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="74da3ec8-87c1-43a3-95b2-60577c05e565" containerName="sg-core" containerID="cri-o://cd373ad6e18433dbc4eecce30109f39d4cfb9171d5bc77f3472af776ce35a271" gracePeriod=30 Oct 14 13:39:09.136212 master-2 kubenswrapper[4762]: I1014 13:39:09.135959 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="74da3ec8-87c1-43a3-95b2-60577c05e565" containerName="ceilometer-notification-agent" 
containerID="cri-o://0317d6ef00d39025096e8667bbfffbcf50450081982dd7552c13ba446e81bc3e" gracePeriod=30 Oct 14 13:39:09.136212 master-2 kubenswrapper[4762]: I1014 13:39:09.135993 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="74da3ec8-87c1-43a3-95b2-60577c05e565" containerName="proxy-httpd" containerID="cri-o://4ec727fe579277d79ac2698748057f35c7d9058d3df86ba87d8712135963d1f2" gracePeriod=30 Oct 14 13:39:09.149492 master-2 kubenswrapper[4762]: I1014 13:39:09.147537 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="74da3ec8-87c1-43a3-95b2-60577c05e565" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.129.0.166:3000/\": EOF" Oct 14 13:39:09.321961 master-2 kubenswrapper[4762]: I1014 13:39:09.321875 4762 generic.go:334] "Generic (PLEG): container finished" podID="74da3ec8-87c1-43a3-95b2-60577c05e565" containerID="cd373ad6e18433dbc4eecce30109f39d4cfb9171d5bc77f3472af776ce35a271" exitCode=2 Oct 14 13:39:09.321961 master-2 kubenswrapper[4762]: I1014 13:39:09.321947 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"74da3ec8-87c1-43a3-95b2-60577c05e565","Type":"ContainerDied","Data":"cd373ad6e18433dbc4eecce30109f39d4cfb9171d5bc77f3472af776ce35a271"} Oct 14 13:39:09.863602 master-2 kubenswrapper[4762]: I1014 13:39:09.863571 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 14 13:39:09.997562 master-2 kubenswrapper[4762]: I1014 13:39:09.997476 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/734b4493-b9c4-457d-a3d2-9d751679cd45-config-data\") pod \"734b4493-b9c4-457d-a3d2-9d751679cd45\" (UID: \"734b4493-b9c4-457d-a3d2-9d751679cd45\") " Oct 14 13:39:09.997878 master-2 kubenswrapper[4762]: I1014 13:39:09.997672 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/734b4493-b9c4-457d-a3d2-9d751679cd45-combined-ca-bundle\") pod \"734b4493-b9c4-457d-a3d2-9d751679cd45\" (UID: \"734b4493-b9c4-457d-a3d2-9d751679cd45\") " Oct 14 13:39:09.997878 master-2 kubenswrapper[4762]: I1014 13:39:09.997819 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zldv9\" (UniqueName: \"kubernetes.io/projected/734b4493-b9c4-457d-a3d2-9d751679cd45-kube-api-access-zldv9\") pod \"734b4493-b9c4-457d-a3d2-9d751679cd45\" (UID: \"734b4493-b9c4-457d-a3d2-9d751679cd45\") " Oct 14 13:39:10.001428 master-2 kubenswrapper[4762]: I1014 13:39:10.001345 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/734b4493-b9c4-457d-a3d2-9d751679cd45-kube-api-access-zldv9" (OuterVolumeSpecName: "kube-api-access-zldv9") pod "734b4493-b9c4-457d-a3d2-9d751679cd45" (UID: "734b4493-b9c4-457d-a3d2-9d751679cd45"). InnerVolumeSpecName "kube-api-access-zldv9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:39:10.041294 master-2 kubenswrapper[4762]: I1014 13:39:10.041141 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/734b4493-b9c4-457d-a3d2-9d751679cd45-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "734b4493-b9c4-457d-a3d2-9d751679cd45" (UID: "734b4493-b9c4-457d-a3d2-9d751679cd45"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:39:10.052530 master-2 kubenswrapper[4762]: I1014 13:39:10.052487 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/734b4493-b9c4-457d-a3d2-9d751679cd45-config-data" (OuterVolumeSpecName: "config-data") pod "734b4493-b9c4-457d-a3d2-9d751679cd45" (UID: "734b4493-b9c4-457d-a3d2-9d751679cd45"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:39:10.101035 master-2 kubenswrapper[4762]: I1014 13:39:10.100909 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/734b4493-b9c4-457d-a3d2-9d751679cd45-config-data\") on node \"master-2\" DevicePath \"\"" Oct 14 13:39:10.101341 master-2 kubenswrapper[4762]: I1014 13:39:10.101320 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/734b4493-b9c4-457d-a3d2-9d751679cd45-combined-ca-bundle\") on node \"master-2\" DevicePath \"\"" Oct 14 13:39:10.101462 master-2 kubenswrapper[4762]: I1014 13:39:10.101443 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zldv9\" (UniqueName: \"kubernetes.io/projected/734b4493-b9c4-457d-a3d2-9d751679cd45-kube-api-access-zldv9\") on node \"master-2\" DevicePath \"\"" Oct 14 13:39:10.334245 master-2 kubenswrapper[4762]: I1014 13:39:10.334126 4762 generic.go:334] "Generic (PLEG): container finished" podID="74da3ec8-87c1-43a3-95b2-60577c05e565" containerID="4ec727fe579277d79ac2698748057f35c7d9058d3df86ba87d8712135963d1f2" exitCode=0 Oct 14 13:39:10.334245 master-2 kubenswrapper[4762]: I1014 13:39:10.334194 4762 generic.go:334] "Generic (PLEG): container finished" podID="74da3ec8-87c1-43a3-95b2-60577c05e565" containerID="953e64de362c127b9b93a6fe8e995bf42eeee3232945aad34ad19c28a5a7e4ed" exitCode=0 Oct 14 13:39:10.334245 master-2 kubenswrapper[4762]: I1014 13:39:10.334250 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"74da3ec8-87c1-43a3-95b2-60577c05e565","Type":"ContainerDied","Data":"4ec727fe579277d79ac2698748057f35c7d9058d3df86ba87d8712135963d1f2"} Oct 14 13:39:10.334923 master-2 kubenswrapper[4762]: I1014 13:39:10.334285 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"74da3ec8-87c1-43a3-95b2-60577c05e565","Type":"ContainerDied","Data":"953e64de362c127b9b93a6fe8e995bf42eeee3232945aad34ad19c28a5a7e4ed"} Oct 14 13:39:10.335815 master-2 kubenswrapper[4762]: I1014 13:39:10.335797 4762 generic.go:334] "Generic (PLEG): container finished" podID="734b4493-b9c4-457d-a3d2-9d751679cd45" containerID="cfdb64ed1e3c2b65b2aab169475a3176355a9f3b82ef697ed384382bc4f9b8d8" exitCode=0 Oct 14 13:39:10.335911 master-2 kubenswrapper[4762]: I1014 13:39:10.335897 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"734b4493-b9c4-457d-a3d2-9d751679cd45","Type":"ContainerDied","Data":"cfdb64ed1e3c2b65b2aab169475a3176355a9f3b82ef697ed384382bc4f9b8d8"} Oct 14 13:39:10.335986 master-2 kubenswrapper[4762]: I1014 13:39:10.335974 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"734b4493-b9c4-457d-a3d2-9d751679cd45","Type":"ContainerDied","Data":"27a8c9c706a539b4ce2e55e7f67be3108d48dbff3e6f0ace74560bb1bcd9f957"} Oct 14 13:39:10.336089 master-2 kubenswrapper[4762]: I1014 13:39:10.336048 4762 scope.go:117] "RemoveContainer" 
containerID="cfdb64ed1e3c2b65b2aab169475a3176355a9f3b82ef697ed384382bc4f9b8d8" Oct 14 13:39:10.336269 master-2 kubenswrapper[4762]: I1014 13:39:10.336250 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 14 13:39:10.359277 master-2 kubenswrapper[4762]: I1014 13:39:10.359207 4762 scope.go:117] "RemoveContainer" containerID="cfdb64ed1e3c2b65b2aab169475a3176355a9f3b82ef697ed384382bc4f9b8d8" Oct 14 13:39:10.359950 master-2 kubenswrapper[4762]: E1014 13:39:10.359882 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfdb64ed1e3c2b65b2aab169475a3176355a9f3b82ef697ed384382bc4f9b8d8\": container with ID starting with cfdb64ed1e3c2b65b2aab169475a3176355a9f3b82ef697ed384382bc4f9b8d8 not found: ID does not exist" containerID="cfdb64ed1e3c2b65b2aab169475a3176355a9f3b82ef697ed384382bc4f9b8d8" Oct 14 13:39:10.360050 master-2 kubenswrapper[4762]: I1014 13:39:10.360005 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfdb64ed1e3c2b65b2aab169475a3176355a9f3b82ef697ed384382bc4f9b8d8"} err="failed to get container status \"cfdb64ed1e3c2b65b2aab169475a3176355a9f3b82ef697ed384382bc4f9b8d8\": rpc error: code = NotFound desc = could not find container \"cfdb64ed1e3c2b65b2aab169475a3176355a9f3b82ef697ed384382bc4f9b8d8\": container with ID starting with cfdb64ed1e3c2b65b2aab169475a3176355a9f3b82ef697ed384382bc4f9b8d8 not found: ID does not exist" Oct 14 13:39:11.384072 master-2 kubenswrapper[4762]: I1014 13:39:11.383658 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 14 13:39:11.422464 master-2 kubenswrapper[4762]: I1014 13:39:11.422359 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 14 13:39:11.514440 master-2 kubenswrapper[4762]: I1014 13:39:11.514350 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 14 13:39:11.515688 master-2 kubenswrapper[4762]: E1014 13:39:11.515667 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="734b4493-b9c4-457d-a3d2-9d751679cd45" containerName="nova-scheduler-scheduler" Oct 14 13:39:11.515800 master-2 kubenswrapper[4762]: I1014 13:39:11.515789 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="734b4493-b9c4-457d-a3d2-9d751679cd45" containerName="nova-scheduler-scheduler" Oct 14 13:39:11.516208 master-2 kubenswrapper[4762]: I1014 13:39:11.516194 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="734b4493-b9c4-457d-a3d2-9d751679cd45" containerName="nova-scheduler-scheduler" Oct 14 13:39:11.517174 master-2 kubenswrapper[4762]: I1014 13:39:11.517138 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 14 13:39:11.519609 master-2 kubenswrapper[4762]: I1014 13:39:11.519574 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 14 13:39:11.532945 master-2 kubenswrapper[4762]: I1014 13:39:11.532907 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f597da64-c1c3-4bf5-88e2-25725c313ea9-config-data\") pod \"nova-scheduler-0\" (UID: \"f597da64-c1c3-4bf5-88e2-25725c313ea9\") " pod="openstack/nova-scheduler-0" Oct 14 13:39:11.533236 master-2 kubenswrapper[4762]: I1014 13:39:11.533217 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vthjz\" (UniqueName: \"kubernetes.io/projected/f597da64-c1c3-4bf5-88e2-25725c313ea9-kube-api-access-vthjz\") pod \"nova-scheduler-0\" (UID: \"f597da64-c1c3-4bf5-88e2-25725c313ea9\") " pod="openstack/nova-scheduler-0" Oct 14 13:39:11.533326 master-2 kubenswrapper[4762]: I1014 13:39:11.533313 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f597da64-c1c3-4bf5-88e2-25725c313ea9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f597da64-c1c3-4bf5-88e2-25725c313ea9\") " pod="openstack/nova-scheduler-0" Oct 14 13:39:11.561072 master-2 kubenswrapper[4762]: I1014 13:39:11.560845 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="734b4493-b9c4-457d-a3d2-9d751679cd45" path="/var/lib/kubelet/pods/734b4493-b9c4-457d-a3d2-9d751679cd45/volumes" Oct 14 13:39:11.562821 master-2 kubenswrapper[4762]: I1014 13:39:11.561925 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 14 13:39:11.635638 master-2 kubenswrapper[4762]: I1014 13:39:11.634764 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f597da64-c1c3-4bf5-88e2-25725c313ea9-config-data\") pod \"nova-scheduler-0\" (UID: \"f597da64-c1c3-4bf5-88e2-25725c313ea9\") " pod="openstack/nova-scheduler-0" Oct 14 13:39:11.635638 master-2 kubenswrapper[4762]: I1014 13:39:11.634839 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vthjz\" (UniqueName: \"kubernetes.io/projected/f597da64-c1c3-4bf5-88e2-25725c313ea9-kube-api-access-vthjz\") pod \"nova-scheduler-0\" (UID: \"f597da64-c1c3-4bf5-88e2-25725c313ea9\") " pod="openstack/nova-scheduler-0" Oct 14 13:39:11.635638 master-2 kubenswrapper[4762]: I1014 13:39:11.634869 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f597da64-c1c3-4bf5-88e2-25725c313ea9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f597da64-c1c3-4bf5-88e2-25725c313ea9\") " pod="openstack/nova-scheduler-0" Oct 14 13:39:11.638784 master-2 kubenswrapper[4762]: I1014 13:39:11.638416 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f597da64-c1c3-4bf5-88e2-25725c313ea9-config-data\") pod \"nova-scheduler-0\" (UID: \"f597da64-c1c3-4bf5-88e2-25725c313ea9\") " pod="openstack/nova-scheduler-0" Oct 14 13:39:11.638784 master-2 kubenswrapper[4762]: I1014 13:39:11.638749 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f597da64-c1c3-4bf5-88e2-25725c313ea9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f597da64-c1c3-4bf5-88e2-25725c313ea9\") " pod="openstack/nova-scheduler-0" Oct 14 13:39:11.667213 master-2 kubenswrapper[4762]: I1014 13:39:11.667144 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vthjz\" (UniqueName: \"kubernetes.io/projected/f597da64-c1c3-4bf5-88e2-25725c313ea9-kube-api-access-vthjz\") pod \"nova-scheduler-0\" (UID: \"f597da64-c1c3-4bf5-88e2-25725c313ea9\") " pod="openstack/nova-scheduler-0" Oct 14 13:39:11.843416 master-2 kubenswrapper[4762]: I1014 13:39:11.843373 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 14 13:39:11.923504 master-2 kubenswrapper[4762]: I1014 13:39:11.923441 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:39:11.945352 master-2 kubenswrapper[4762]: I1014 13:39:11.945118 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xb58m\" (UniqueName: \"kubernetes.io/projected/74da3ec8-87c1-43a3-95b2-60577c05e565-kube-api-access-xb58m\") pod \"74da3ec8-87c1-43a3-95b2-60577c05e565\" (UID: \"74da3ec8-87c1-43a3-95b2-60577c05e565\") " Oct 14 13:39:11.945994 master-2 kubenswrapper[4762]: I1014 13:39:11.945296 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/74da3ec8-87c1-43a3-95b2-60577c05e565-ceilometer-tls-certs\") pod \"74da3ec8-87c1-43a3-95b2-60577c05e565\" (UID: \"74da3ec8-87c1-43a3-95b2-60577c05e565\") " Oct 14 13:39:11.947876 master-2 kubenswrapper[4762]: I1014 13:39:11.946503 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74da3ec8-87c1-43a3-95b2-60577c05e565-scripts\") pod \"74da3ec8-87c1-43a3-95b2-60577c05e565\" (UID: \"74da3ec8-87c1-43a3-95b2-60577c05e565\") " Oct 14 13:39:11.948837 master-2 kubenswrapper[4762]: I1014 13:39:11.948444 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74da3ec8-87c1-43a3-95b2-60577c05e565-combined-ca-bundle\") pod \"74da3ec8-87c1-43a3-95b2-60577c05e565\" (UID: \"74da3ec8-87c1-43a3-95b2-60577c05e565\") " Oct 14 13:39:11.948837 master-2 kubenswrapper[4762]: I1014 13:39:11.948500 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74da3ec8-87c1-43a3-95b2-60577c05e565-config-data\") pod \"74da3ec8-87c1-43a3-95b2-60577c05e565\" (UID: \"74da3ec8-87c1-43a3-95b2-60577c05e565\") " Oct 14 13:39:11.948837 master-2 kubenswrapper[4762]: I1014 13:39:11.948556 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/74da3ec8-87c1-43a3-95b2-60577c05e565-sg-core-conf-yaml\") pod \"74da3ec8-87c1-43a3-95b2-60577c05e565\" (UID: \"74da3ec8-87c1-43a3-95b2-60577c05e565\") " Oct 14 13:39:11.948837 master-2 kubenswrapper[4762]: I1014 13:39:11.948658 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74da3ec8-87c1-43a3-95b2-60577c05e565-run-httpd\") pod \"74da3ec8-87c1-43a3-95b2-60577c05e565\" (UID: \"74da3ec8-87c1-43a3-95b2-60577c05e565\") " Oct 14 13:39:11.948837 master-2 kubenswrapper[4762]: 
I1014 13:39:11.948713 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74da3ec8-87c1-43a3-95b2-60577c05e565-log-httpd\") pod \"74da3ec8-87c1-43a3-95b2-60577c05e565\" (UID: \"74da3ec8-87c1-43a3-95b2-60577c05e565\") " Oct 14 13:39:11.948837 master-2 kubenswrapper[4762]: I1014 13:39:11.948208 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74da3ec8-87c1-43a3-95b2-60577c05e565-kube-api-access-xb58m" (OuterVolumeSpecName: "kube-api-access-xb58m") pod "74da3ec8-87c1-43a3-95b2-60577c05e565" (UID: "74da3ec8-87c1-43a3-95b2-60577c05e565"). InnerVolumeSpecName "kube-api-access-xb58m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:39:11.954482 master-2 kubenswrapper[4762]: I1014 13:39:11.952954 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74da3ec8-87c1-43a3-95b2-60577c05e565-scripts" (OuterVolumeSpecName: "scripts") pod "74da3ec8-87c1-43a3-95b2-60577c05e565" (UID: "74da3ec8-87c1-43a3-95b2-60577c05e565"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:39:11.955375 master-2 kubenswrapper[4762]: I1014 13:39:11.955286 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74da3ec8-87c1-43a3-95b2-60577c05e565-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "74da3ec8-87c1-43a3-95b2-60577c05e565" (UID: "74da3ec8-87c1-43a3-95b2-60577c05e565"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:39:11.956230 master-2 kubenswrapper[4762]: I1014 13:39:11.955705 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74da3ec8-87c1-43a3-95b2-60577c05e565-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "74da3ec8-87c1-43a3-95b2-60577c05e565" (UID: "74da3ec8-87c1-43a3-95b2-60577c05e565"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:39:11.957857 master-2 kubenswrapper[4762]: I1014 13:39:11.957745 4762 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74da3ec8-87c1-43a3-95b2-60577c05e565-run-httpd\") on node \"master-2\" DevicePath \"\"" Oct 14 13:39:11.957857 master-2 kubenswrapper[4762]: I1014 13:39:11.957789 4762 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74da3ec8-87c1-43a3-95b2-60577c05e565-log-httpd\") on node \"master-2\" DevicePath \"\"" Oct 14 13:39:11.957857 master-2 kubenswrapper[4762]: I1014 13:39:11.957810 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xb58m\" (UniqueName: \"kubernetes.io/projected/74da3ec8-87c1-43a3-95b2-60577c05e565-kube-api-access-xb58m\") on node \"master-2\" DevicePath \"\"" Oct 14 13:39:11.957857 master-2 kubenswrapper[4762]: I1014 13:39:11.957828 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74da3ec8-87c1-43a3-95b2-60577c05e565-scripts\") on node \"master-2\" DevicePath \"\"" Oct 14 13:39:11.978111 master-2 kubenswrapper[4762]: I1014 13:39:11.978051 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74da3ec8-87c1-43a3-95b2-60577c05e565-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "74da3ec8-87c1-43a3-95b2-60577c05e565" (UID: "74da3ec8-87c1-43a3-95b2-60577c05e565"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:39:12.017715 master-2 kubenswrapper[4762]: I1014 13:39:12.017615 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74da3ec8-87c1-43a3-95b2-60577c05e565-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "74da3ec8-87c1-43a3-95b2-60577c05e565" (UID: "74da3ec8-87c1-43a3-95b2-60577c05e565"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:39:12.043462 master-2 kubenswrapper[4762]: I1014 13:39:12.043252 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74da3ec8-87c1-43a3-95b2-60577c05e565-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "74da3ec8-87c1-43a3-95b2-60577c05e565" (UID: "74da3ec8-87c1-43a3-95b2-60577c05e565"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:39:12.059878 master-2 kubenswrapper[4762]: I1014 13:39:12.059772 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74da3ec8-87c1-43a3-95b2-60577c05e565-combined-ca-bundle\") on node \"master-2\" DevicePath \"\"" Oct 14 13:39:12.059878 master-2 kubenswrapper[4762]: I1014 13:39:12.059822 4762 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/74da3ec8-87c1-43a3-95b2-60577c05e565-sg-core-conf-yaml\") on node \"master-2\" DevicePath \"\"" Oct 14 13:39:12.060130 master-2 kubenswrapper[4762]: I1014 13:39:12.059974 4762 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/74da3ec8-87c1-43a3-95b2-60577c05e565-ceilometer-tls-certs\") on node \"master-2\" DevicePath \"\"" Oct 14 13:39:12.089057 master-2 kubenswrapper[4762]: I1014 13:39:12.088892 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74da3ec8-87c1-43a3-95b2-60577c05e565-config-data" (OuterVolumeSpecName: "config-data") pod "74da3ec8-87c1-43a3-95b2-60577c05e565" (UID: "74da3ec8-87c1-43a3-95b2-60577c05e565"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:39:12.162422 master-2 kubenswrapper[4762]: I1014 13:39:12.162347 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74da3ec8-87c1-43a3-95b2-60577c05e565-config-data\") on node \"master-2\" DevicePath \"\"" Oct 14 13:39:12.317413 master-2 kubenswrapper[4762]: I1014 13:39:12.317231 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 14 13:39:12.318903 master-2 kubenswrapper[4762]: W1014 13:39:12.318840 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf597da64_c1c3_4bf5_88e2_25725c313ea9.slice/crio-9863b959117c17338c1951708201e287fa004a910d5bc2c209fef32ce40f7740 WatchSource:0}: Error finding container 9863b959117c17338c1951708201e287fa004a910d5bc2c209fef32ce40f7740: Status 404 returned error can't find the container with id 9863b959117c17338c1951708201e287fa004a910d5bc2c209fef32ce40f7740 Oct 14 13:39:12.358845 master-2 kubenswrapper[4762]: I1014 13:39:12.358791 4762 generic.go:334] "Generic (PLEG): container finished" podID="74da3ec8-87c1-43a3-95b2-60577c05e565" containerID="0317d6ef00d39025096e8667bbfffbcf50450081982dd7552c13ba446e81bc3e" exitCode=0 Oct 14 13:39:12.359222 master-2 kubenswrapper[4762]: I1014 13:39:12.358893 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"74da3ec8-87c1-43a3-95b2-60577c05e565","Type":"ContainerDied","Data":"0317d6ef00d39025096e8667bbfffbcf50450081982dd7552c13ba446e81bc3e"} Oct 14 13:39:12.359222 master-2 kubenswrapper[4762]: I1014 13:39:12.358978 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:39:12.359222 master-2 kubenswrapper[4762]: I1014 13:39:12.359188 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"74da3ec8-87c1-43a3-95b2-60577c05e565","Type":"ContainerDied","Data":"a2ed4313b234cbd6d4870cf72d6a8f1cb1e09879c347b05e772f35a85ee17c69"} Oct 14 13:39:12.359436 master-2 kubenswrapper[4762]: I1014 13:39:12.359284 4762 scope.go:117] "RemoveContainer" containerID="4ec727fe579277d79ac2698748057f35c7d9058d3df86ba87d8712135963d1f2" Oct 14 13:39:12.361290 master-2 kubenswrapper[4762]: I1014 13:39:12.361242 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f597da64-c1c3-4bf5-88e2-25725c313ea9","Type":"ContainerStarted","Data":"9863b959117c17338c1951708201e287fa004a910d5bc2c209fef32ce40f7740"} Oct 14 13:39:12.393061 master-2 kubenswrapper[4762]: I1014 13:39:12.392622 4762 scope.go:117] "RemoveContainer" containerID="cd373ad6e18433dbc4eecce30109f39d4cfb9171d5bc77f3472af776ce35a271" Oct 14 13:39:12.408038 master-2 kubenswrapper[4762]: I1014 13:39:12.407331 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:39:12.414675 master-2 kubenswrapper[4762]: I1014 13:39:12.414527 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:39:12.426371 master-2 kubenswrapper[4762]: I1014 13:39:12.426326 4762 scope.go:117] "RemoveContainer" containerID="0317d6ef00d39025096e8667bbfffbcf50450081982dd7552c13ba446e81bc3e" Oct 14 13:39:12.456961 master-2 kubenswrapper[4762]: I1014 13:39:12.446397 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:39:12.456961 master-2 kubenswrapper[4762]: E1014 13:39:12.446727 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74da3ec8-87c1-43a3-95b2-60577c05e565" containerName="ceilometer-notification-agent" Oct 14 13:39:12.456961 master-2 kubenswrapper[4762]: I1014 13:39:12.446742 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="74da3ec8-87c1-43a3-95b2-60577c05e565" containerName="ceilometer-notification-agent" Oct 14 13:39:12.456961 master-2 kubenswrapper[4762]: E1014 13:39:12.446754 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74da3ec8-87c1-43a3-95b2-60577c05e565" containerName="proxy-httpd" Oct 14 13:39:12.456961 master-2 kubenswrapper[4762]: I1014 13:39:12.446763 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="74da3ec8-87c1-43a3-95b2-60577c05e565" containerName="proxy-httpd" Oct 14 13:39:12.456961 master-2 kubenswrapper[4762]: E1014 13:39:12.446795 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74da3ec8-87c1-43a3-95b2-60577c05e565" containerName="ceilometer-central-agent" Oct 14 13:39:12.456961 master-2 kubenswrapper[4762]: I1014 13:39:12.446803 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="74da3ec8-87c1-43a3-95b2-60577c05e565" containerName="ceilometer-central-agent" Oct 14 13:39:12.456961 master-2 kubenswrapper[4762]: E1014 13:39:12.446812 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74da3ec8-87c1-43a3-95b2-60577c05e565" containerName="sg-core" Oct 14 13:39:12.456961 master-2 kubenswrapper[4762]: I1014 13:39:12.446819 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="74da3ec8-87c1-43a3-95b2-60577c05e565" containerName="sg-core" Oct 14 13:39:12.456961 master-2 kubenswrapper[4762]: I1014 13:39:12.446966 4762 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="74da3ec8-87c1-43a3-95b2-60577c05e565" containerName="sg-core" Oct 14 13:39:12.456961 master-2 kubenswrapper[4762]: I1014 13:39:12.446992 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="74da3ec8-87c1-43a3-95b2-60577c05e565" containerName="proxy-httpd" Oct 14 13:39:12.456961 master-2 kubenswrapper[4762]: I1014 13:39:12.447002 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="74da3ec8-87c1-43a3-95b2-60577c05e565" containerName="ceilometer-notification-agent" Oct 14 13:39:12.456961 master-2 kubenswrapper[4762]: I1014 13:39:12.447013 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="74da3ec8-87c1-43a3-95b2-60577c05e565" containerName="ceilometer-central-agent" Oct 14 13:39:12.456961 master-2 kubenswrapper[4762]: I1014 13:39:12.448716 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:39:12.456961 master-2 kubenswrapper[4762]: I1014 13:39:12.454076 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 14 13:39:12.456961 master-2 kubenswrapper[4762]: I1014 13:39:12.454782 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 14 13:39:12.456961 master-2 kubenswrapper[4762]: I1014 13:39:12.455034 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 14 13:39:12.468635 master-2 kubenswrapper[4762]: I1014 13:39:12.468387 4762 scope.go:117] "RemoveContainer" containerID="953e64de362c127b9b93a6fe8e995bf42eeee3232945aad34ad19c28a5a7e4ed" Oct 14 13:39:12.468635 master-2 kubenswrapper[4762]: I1014 13:39:12.468530 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:39:12.538416 master-2 kubenswrapper[4762]: I1014 13:39:12.538368 4762 scope.go:117] "RemoveContainer" containerID="4ec727fe579277d79ac2698748057f35c7d9058d3df86ba87d8712135963d1f2" Oct 14 13:39:12.538975 master-2 kubenswrapper[4762]: E1014 13:39:12.538952 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ec727fe579277d79ac2698748057f35c7d9058d3df86ba87d8712135963d1f2\": container with ID starting with 4ec727fe579277d79ac2698748057f35c7d9058d3df86ba87d8712135963d1f2 not found: ID does not exist" containerID="4ec727fe579277d79ac2698748057f35c7d9058d3df86ba87d8712135963d1f2" Oct 14 13:39:12.539229 master-2 kubenswrapper[4762]: I1014 13:39:12.538985 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ec727fe579277d79ac2698748057f35c7d9058d3df86ba87d8712135963d1f2"} err="failed to get container status \"4ec727fe579277d79ac2698748057f35c7d9058d3df86ba87d8712135963d1f2\": rpc error: code = NotFound desc = could not find container \"4ec727fe579277d79ac2698748057f35c7d9058d3df86ba87d8712135963d1f2\": container with ID starting with 4ec727fe579277d79ac2698748057f35c7d9058d3df86ba87d8712135963d1f2 not found: ID does not exist" Oct 14 13:39:12.539229 master-2 kubenswrapper[4762]: I1014 13:39:12.539008 4762 scope.go:117] "RemoveContainer" containerID="cd373ad6e18433dbc4eecce30109f39d4cfb9171d5bc77f3472af776ce35a271" Oct 14 13:39:12.539610 master-2 kubenswrapper[4762]: E1014 13:39:12.539573 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd373ad6e18433dbc4eecce30109f39d4cfb9171d5bc77f3472af776ce35a271\": 
container with ID starting with cd373ad6e18433dbc4eecce30109f39d4cfb9171d5bc77f3472af776ce35a271 not found: ID does not exist" containerID="cd373ad6e18433dbc4eecce30109f39d4cfb9171d5bc77f3472af776ce35a271" Oct 14 13:39:12.539662 master-2 kubenswrapper[4762]: I1014 13:39:12.539615 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd373ad6e18433dbc4eecce30109f39d4cfb9171d5bc77f3472af776ce35a271"} err="failed to get container status \"cd373ad6e18433dbc4eecce30109f39d4cfb9171d5bc77f3472af776ce35a271\": rpc error: code = NotFound desc = could not find container \"cd373ad6e18433dbc4eecce30109f39d4cfb9171d5bc77f3472af776ce35a271\": container with ID starting with cd373ad6e18433dbc4eecce30109f39d4cfb9171d5bc77f3472af776ce35a271 not found: ID does not exist" Oct 14 13:39:12.539662 master-2 kubenswrapper[4762]: I1014 13:39:12.539642 4762 scope.go:117] "RemoveContainer" containerID="0317d6ef00d39025096e8667bbfffbcf50450081982dd7552c13ba446e81bc3e" Oct 14 13:39:12.541537 master-2 kubenswrapper[4762]: E1014 13:39:12.541502 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0317d6ef00d39025096e8667bbfffbcf50450081982dd7552c13ba446e81bc3e\": container with ID starting with 0317d6ef00d39025096e8667bbfffbcf50450081982dd7552c13ba446e81bc3e not found: ID does not exist" containerID="0317d6ef00d39025096e8667bbfffbcf50450081982dd7552c13ba446e81bc3e" Oct 14 13:39:12.541977 master-2 kubenswrapper[4762]: I1014 13:39:12.541541 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0317d6ef00d39025096e8667bbfffbcf50450081982dd7552c13ba446e81bc3e"} err="failed to get container status \"0317d6ef00d39025096e8667bbfffbcf50450081982dd7552c13ba446e81bc3e\": rpc error: code = NotFound desc = could not find container \"0317d6ef00d39025096e8667bbfffbcf50450081982dd7552c13ba446e81bc3e\": container with ID starting with 0317d6ef00d39025096e8667bbfffbcf50450081982dd7552c13ba446e81bc3e not found: ID does not exist" Oct 14 13:39:12.541977 master-2 kubenswrapper[4762]: I1014 13:39:12.541557 4762 scope.go:117] "RemoveContainer" containerID="953e64de362c127b9b93a6fe8e995bf42eeee3232945aad34ad19c28a5a7e4ed" Oct 14 13:39:12.542058 master-2 kubenswrapper[4762]: E1014 13:39:12.542024 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"953e64de362c127b9b93a6fe8e995bf42eeee3232945aad34ad19c28a5a7e4ed\": container with ID starting with 953e64de362c127b9b93a6fe8e995bf42eeee3232945aad34ad19c28a5a7e4ed not found: ID does not exist" containerID="953e64de362c127b9b93a6fe8e995bf42eeee3232945aad34ad19c28a5a7e4ed" Oct 14 13:39:12.542095 master-2 kubenswrapper[4762]: I1014 13:39:12.542054 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"953e64de362c127b9b93a6fe8e995bf42eeee3232945aad34ad19c28a5a7e4ed"} err="failed to get container status \"953e64de362c127b9b93a6fe8e995bf42eeee3232945aad34ad19c28a5a7e4ed\": rpc error: code = NotFound desc = could not find container \"953e64de362c127b9b93a6fe8e995bf42eeee3232945aad34ad19c28a5a7e4ed\": container with ID starting with 953e64de362c127b9b93a6fe8e995bf42eeee3232945aad34ad19c28a5a7e4ed not found: ID does not exist" Oct 14 13:39:12.573145 master-2 kubenswrapper[4762]: I1014 13:39:12.571772 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/6880510a-48a6-48f8-b644-4fd24cff01a0-run-httpd\") pod \"ceilometer-0\" (UID: \"6880510a-48a6-48f8-b644-4fd24cff01a0\") " pod="openstack/ceilometer-0" Oct 14 13:39:12.573145 master-2 kubenswrapper[4762]: I1014 13:39:12.571832 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tr8b\" (UniqueName: \"kubernetes.io/projected/6880510a-48a6-48f8-b644-4fd24cff01a0-kube-api-access-2tr8b\") pod \"ceilometer-0\" (UID: \"6880510a-48a6-48f8-b644-4fd24cff01a0\") " pod="openstack/ceilometer-0" Oct 14 13:39:12.573145 master-2 kubenswrapper[4762]: I1014 13:39:12.571906 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6880510a-48a6-48f8-b644-4fd24cff01a0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6880510a-48a6-48f8-b644-4fd24cff01a0\") " pod="openstack/ceilometer-0" Oct 14 13:39:12.573145 master-2 kubenswrapper[4762]: I1014 13:39:12.571953 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6880510a-48a6-48f8-b644-4fd24cff01a0-config-data\") pod \"ceilometer-0\" (UID: \"6880510a-48a6-48f8-b644-4fd24cff01a0\") " pod="openstack/ceilometer-0" Oct 14 13:39:12.573145 master-2 kubenswrapper[4762]: I1014 13:39:12.571975 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6880510a-48a6-48f8-b644-4fd24cff01a0-log-httpd\") pod \"ceilometer-0\" (UID: \"6880510a-48a6-48f8-b644-4fd24cff01a0\") " pod="openstack/ceilometer-0" Oct 14 13:39:12.573145 master-2 kubenswrapper[4762]: I1014 13:39:12.572010 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6880510a-48a6-48f8-b644-4fd24cff01a0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6880510a-48a6-48f8-b644-4fd24cff01a0\") " pod="openstack/ceilometer-0" Oct 14 13:39:12.573145 master-2 kubenswrapper[4762]: I1014 13:39:12.572061 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6880510a-48a6-48f8-b644-4fd24cff01a0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6880510a-48a6-48f8-b644-4fd24cff01a0\") " pod="openstack/ceilometer-0" Oct 14 13:39:12.573145 master-2 kubenswrapper[4762]: I1014 13:39:12.572090 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6880510a-48a6-48f8-b644-4fd24cff01a0-scripts\") pod \"ceilometer-0\" (UID: \"6880510a-48a6-48f8-b644-4fd24cff01a0\") " pod="openstack/ceilometer-0" Oct 14 13:39:12.674494 master-2 kubenswrapper[4762]: I1014 13:39:12.674316 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6880510a-48a6-48f8-b644-4fd24cff01a0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6880510a-48a6-48f8-b644-4fd24cff01a0\") " pod="openstack/ceilometer-0" Oct 14 13:39:12.674494 master-2 kubenswrapper[4762]: I1014 13:39:12.674396 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6880510a-48a6-48f8-b644-4fd24cff01a0-config-data\") pod \"ceilometer-0\" 
(UID: \"6880510a-48a6-48f8-b644-4fd24cff01a0\") " pod="openstack/ceilometer-0" Oct 14 13:39:12.674494 master-2 kubenswrapper[4762]: I1014 13:39:12.674413 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6880510a-48a6-48f8-b644-4fd24cff01a0-log-httpd\") pod \"ceilometer-0\" (UID: \"6880510a-48a6-48f8-b644-4fd24cff01a0\") " pod="openstack/ceilometer-0" Oct 14 13:39:12.674494 master-2 kubenswrapper[4762]: I1014 13:39:12.674444 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6880510a-48a6-48f8-b644-4fd24cff01a0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6880510a-48a6-48f8-b644-4fd24cff01a0\") " pod="openstack/ceilometer-0" Oct 14 13:39:12.674494 master-2 kubenswrapper[4762]: I1014 13:39:12.674481 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6880510a-48a6-48f8-b644-4fd24cff01a0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6880510a-48a6-48f8-b644-4fd24cff01a0\") " pod="openstack/ceilometer-0" Oct 14 13:39:12.674494 master-2 kubenswrapper[4762]: I1014 13:39:12.674507 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6880510a-48a6-48f8-b644-4fd24cff01a0-scripts\") pod \"ceilometer-0\" (UID: \"6880510a-48a6-48f8-b644-4fd24cff01a0\") " pod="openstack/ceilometer-0" Oct 14 13:39:12.674966 master-2 kubenswrapper[4762]: I1014 13:39:12.674570 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tr8b\" (UniqueName: \"kubernetes.io/projected/6880510a-48a6-48f8-b644-4fd24cff01a0-kube-api-access-2tr8b\") pod \"ceilometer-0\" (UID: \"6880510a-48a6-48f8-b644-4fd24cff01a0\") " pod="openstack/ceilometer-0" Oct 14 13:39:12.674966 master-2 kubenswrapper[4762]: I1014 13:39:12.674591 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6880510a-48a6-48f8-b644-4fd24cff01a0-run-httpd\") pod \"ceilometer-0\" (UID: \"6880510a-48a6-48f8-b644-4fd24cff01a0\") " pod="openstack/ceilometer-0" Oct 14 13:39:12.675119 master-2 kubenswrapper[4762]: I1014 13:39:12.675082 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6880510a-48a6-48f8-b644-4fd24cff01a0-run-httpd\") pod \"ceilometer-0\" (UID: \"6880510a-48a6-48f8-b644-4fd24cff01a0\") " pod="openstack/ceilometer-0" Oct 14 13:39:12.675805 master-2 kubenswrapper[4762]: I1014 13:39:12.675773 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6880510a-48a6-48f8-b644-4fd24cff01a0-log-httpd\") pod \"ceilometer-0\" (UID: \"6880510a-48a6-48f8-b644-4fd24cff01a0\") " pod="openstack/ceilometer-0" Oct 14 13:39:12.679410 master-2 kubenswrapper[4762]: I1014 13:39:12.679366 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6880510a-48a6-48f8-b644-4fd24cff01a0-scripts\") pod \"ceilometer-0\" (UID: \"6880510a-48a6-48f8-b644-4fd24cff01a0\") " pod="openstack/ceilometer-0" Oct 14 13:39:12.681021 master-2 kubenswrapper[4762]: I1014 13:39:12.680521 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6880510a-48a6-48f8-b644-4fd24cff01a0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6880510a-48a6-48f8-b644-4fd24cff01a0\") " pod="openstack/ceilometer-0" Oct 14 13:39:12.681021 master-2 kubenswrapper[4762]: I1014 13:39:12.680578 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6880510a-48a6-48f8-b644-4fd24cff01a0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6880510a-48a6-48f8-b644-4fd24cff01a0\") " pod="openstack/ceilometer-0" Oct 14 13:39:12.681704 master-2 kubenswrapper[4762]: I1014 13:39:12.681673 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6880510a-48a6-48f8-b644-4fd24cff01a0-config-data\") pod \"ceilometer-0\" (UID: \"6880510a-48a6-48f8-b644-4fd24cff01a0\") " pod="openstack/ceilometer-0" Oct 14 13:39:12.681875 master-2 kubenswrapper[4762]: I1014 13:39:12.681805 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6880510a-48a6-48f8-b644-4fd24cff01a0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6880510a-48a6-48f8-b644-4fd24cff01a0\") " pod="openstack/ceilometer-0" Oct 14 13:39:12.702963 master-2 kubenswrapper[4762]: I1014 13:39:12.702902 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tr8b\" (UniqueName: \"kubernetes.io/projected/6880510a-48a6-48f8-b644-4fd24cff01a0-kube-api-access-2tr8b\") pod \"ceilometer-0\" (UID: \"6880510a-48a6-48f8-b644-4fd24cff01a0\") " pod="openstack/ceilometer-0" Oct 14 13:39:12.845719 master-2 kubenswrapper[4762]: I1014 13:39:12.845653 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:39:13.338045 master-2 kubenswrapper[4762]: I1014 13:39:13.337982 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:39:13.338586 master-2 kubenswrapper[4762]: W1014 13:39:13.338529 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6880510a_48a6_48f8_b644_4fd24cff01a0.slice/crio-aa7c5cb00bffa5a1bb9817845ed3788312d478f767b1a42e192bf7b416738303 WatchSource:0}: Error finding container aa7c5cb00bffa5a1bb9817845ed3788312d478f767b1a42e192bf7b416738303: Status 404 returned error can't find the container with id aa7c5cb00bffa5a1bb9817845ed3788312d478f767b1a42e192bf7b416738303 Oct 14 13:39:13.341582 master-2 kubenswrapper[4762]: I1014 13:39:13.341547 4762 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 14 13:39:13.371804 master-2 kubenswrapper[4762]: I1014 13:39:13.371714 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6880510a-48a6-48f8-b644-4fd24cff01a0","Type":"ContainerStarted","Data":"aa7c5cb00bffa5a1bb9817845ed3788312d478f767b1a42e192bf7b416738303"} Oct 14 13:39:13.374727 master-2 kubenswrapper[4762]: I1014 13:39:13.374662 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f597da64-c1c3-4bf5-88e2-25725c313ea9","Type":"ContainerStarted","Data":"44e27efc802a186eebf5da6ad5d041db267c8206586b72ca7d331d29b166dac0"} Oct 14 13:39:13.413437 master-2 kubenswrapper[4762]: I1014 13:39:13.413343 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.41331935 
podStartE2EDuration="2.41331935s" podCreationTimestamp="2025-10-14 13:39:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:39:13.405107194 +0000 UTC m=+1982.649266363" watchObservedRunningTime="2025-10-14 13:39:13.41331935 +0000 UTC m=+1982.657478519" Oct 14 13:39:13.563016 master-2 kubenswrapper[4762]: I1014 13:39:13.562938 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74da3ec8-87c1-43a3-95b2-60577c05e565" path="/var/lib/kubelet/pods/74da3ec8-87c1-43a3-95b2-60577c05e565/volumes" Oct 14 13:39:14.387418 master-2 kubenswrapper[4762]: I1014 13:39:14.387230 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6880510a-48a6-48f8-b644-4fd24cff01a0","Type":"ContainerStarted","Data":"b8ed7429b07aa22e78df15760985b2cdce61bba49580b6d98c5fd63278867083"} Oct 14 13:39:15.402326 master-2 kubenswrapper[4762]: I1014 13:39:15.402226 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6880510a-48a6-48f8-b644-4fd24cff01a0","Type":"ContainerStarted","Data":"a9e8d6edd96be388c07a9243a3bce483419cf5055c80a43b1b903268c6551d0c"} Oct 14 13:39:16.844071 master-2 kubenswrapper[4762]: I1014 13:39:16.843997 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 14 13:39:17.431432 master-2 kubenswrapper[4762]: I1014 13:39:17.431341 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6880510a-48a6-48f8-b644-4fd24cff01a0","Type":"ContainerStarted","Data":"c6171a284684ba32a220ddade705f6e0c440c7b6ece081b2329d94a38bc6ee15"} Oct 14 13:39:18.444737 master-2 kubenswrapper[4762]: I1014 13:39:18.444665 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6880510a-48a6-48f8-b644-4fd24cff01a0","Type":"ContainerStarted","Data":"1b5c20d17fda4ad3770b4104fae746eb568bd2a6efe718d901b0d3d3809fa651"} Oct 14 13:39:18.446114 master-2 kubenswrapper[4762]: I1014 13:39:18.444865 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 14 13:39:18.485439 master-2 kubenswrapper[4762]: I1014 13:39:18.484963 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.69301882 podStartE2EDuration="6.484938013s" podCreationTimestamp="2025-10-14 13:39:12 +0000 UTC" firstStartedPulling="2025-10-14 13:39:13.341406018 +0000 UTC m=+1982.585565197" lastFinishedPulling="2025-10-14 13:39:18.133325231 +0000 UTC m=+1987.377484390" observedRunningTime="2025-10-14 13:39:18.47744799 +0000 UTC m=+1987.721607169" watchObservedRunningTime="2025-10-14 13:39:18.484938013 +0000 UTC m=+1987.729097172" Oct 14 13:39:21.844334 master-2 kubenswrapper[4762]: I1014 13:39:21.844263 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 14 13:39:21.882037 master-2 kubenswrapper[4762]: I1014 13:39:21.881959 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 14 13:39:22.577690 master-2 kubenswrapper[4762]: I1014 13:39:22.575630 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 14 13:39:25.006273 master-2 kubenswrapper[4762]: I1014 13:39:25.006107 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 14 
13:39:25.006906 master-2 kubenswrapper[4762]: I1014 13:39:25.006604 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b7a04057-57ba-4fce-9a2a-f09e7333ddd9" containerName="nova-metadata-log" containerID="cri-o://dc764dd0b9815f8afba97ac6c1fa5f82a8f91bdd751d70c869bf014f9115ce83" gracePeriod=30 Oct 14 13:39:25.007392 master-2 kubenswrapper[4762]: I1014 13:39:25.007342 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b7a04057-57ba-4fce-9a2a-f09e7333ddd9" containerName="nova-metadata-metadata" containerID="cri-o://37f58c4430902f7cc815a2535a71deb9ecb0506e716fc8179301f3bac2def25f" gracePeriod=30 Oct 14 13:39:25.576078 master-2 kubenswrapper[4762]: I1014 13:39:25.575992 4762 generic.go:334] "Generic (PLEG): container finished" podID="b7a04057-57ba-4fce-9a2a-f09e7333ddd9" containerID="dc764dd0b9815f8afba97ac6c1fa5f82a8f91bdd751d70c869bf014f9115ce83" exitCode=143 Oct 14 13:39:25.576078 master-2 kubenswrapper[4762]: I1014 13:39:25.576045 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b7a04057-57ba-4fce-9a2a-f09e7333ddd9","Type":"ContainerDied","Data":"dc764dd0b9815f8afba97ac6c1fa5f82a8f91bdd751d70c869bf014f9115ce83"} Oct 14 13:39:26.247328 master-2 kubenswrapper[4762]: I1014 13:39:26.247251 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-ffpkk"] Oct 14 13:39:26.249452 master-2 kubenswrapper[4762]: I1014 13:39:26.249397 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-ffpkk" Oct 14 13:39:26.253106 master-2 kubenswrapper[4762]: I1014 13:39:26.253050 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 14 13:39:26.253332 master-2 kubenswrapper[4762]: I1014 13:39:26.253204 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 14 13:39:26.265886 master-2 kubenswrapper[4762]: I1014 13:39:26.265806 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-host-discover-v24fq"] Oct 14 13:39:26.267177 master-2 kubenswrapper[4762]: I1014 13:39:26.267141 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-host-discover-v24fq" Oct 14 13:39:26.274223 master-2 kubenswrapper[4762]: I1014 13:39:26.274137 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-ffpkk"] Oct 14 13:39:26.288065 master-2 kubenswrapper[4762]: I1014 13:39:26.287996 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-host-discover-v24fq"] Oct 14 13:39:26.426416 master-2 kubenswrapper[4762]: I1014 13:39:26.426359 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4plx7\" (UniqueName: \"kubernetes.io/projected/25d692e8-e19a-475b-bc4e-f22508073ffa-kube-api-access-4plx7\") pod \"nova-cell1-host-discover-v24fq\" (UID: \"25d692e8-e19a-475b-bc4e-f22508073ffa\") " pod="openstack/nova-cell1-host-discover-v24fq" Oct 14 13:39:26.426416 master-2 kubenswrapper[4762]: I1014 13:39:26.426418 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb2b1f8b-432d-4e7d-a538-28068e9e0bb6-config-data\") pod \"nova-cell1-cell-mapping-ffpkk\" (UID: \"bb2b1f8b-432d-4e7d-a538-28068e9e0bb6\") " pod="openstack/nova-cell1-cell-mapping-ffpkk" Oct 14 13:39:26.426764 master-2 kubenswrapper[4762]: I1014 13:39:26.426446 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25d692e8-e19a-475b-bc4e-f22508073ffa-config-data\") pod \"nova-cell1-host-discover-v24fq\" (UID: \"25d692e8-e19a-475b-bc4e-f22508073ffa\") " pod="openstack/nova-cell1-host-discover-v24fq" Oct 14 13:39:26.426764 master-2 kubenswrapper[4762]: I1014 13:39:26.426486 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb2b1f8b-432d-4e7d-a538-28068e9e0bb6-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-ffpkk\" (UID: \"bb2b1f8b-432d-4e7d-a538-28068e9e0bb6\") " pod="openstack/nova-cell1-cell-mapping-ffpkk" Oct 14 13:39:26.426764 master-2 kubenswrapper[4762]: I1014 13:39:26.426504 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb2b1f8b-432d-4e7d-a538-28068e9e0bb6-scripts\") pod \"nova-cell1-cell-mapping-ffpkk\" (UID: \"bb2b1f8b-432d-4e7d-a538-28068e9e0bb6\") " pod="openstack/nova-cell1-cell-mapping-ffpkk" Oct 14 13:39:26.426764 master-2 kubenswrapper[4762]: I1014 13:39:26.426557 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25d692e8-e19a-475b-bc4e-f22508073ffa-combined-ca-bundle\") pod \"nova-cell1-host-discover-v24fq\" (UID: \"25d692e8-e19a-475b-bc4e-f22508073ffa\") " pod="openstack/nova-cell1-host-discover-v24fq" Oct 14 13:39:26.426764 master-2 kubenswrapper[4762]: I1014 13:39:26.426604 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25d692e8-e19a-475b-bc4e-f22508073ffa-scripts\") pod \"nova-cell1-host-discover-v24fq\" (UID: \"25d692e8-e19a-475b-bc4e-f22508073ffa\") " pod="openstack/nova-cell1-host-discover-v24fq" Oct 14 13:39:26.426764 master-2 kubenswrapper[4762]: I1014 13:39:26.426623 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-4nh6f\" (UniqueName: \"kubernetes.io/projected/bb2b1f8b-432d-4e7d-a538-28068e9e0bb6-kube-api-access-4nh6f\") pod \"nova-cell1-cell-mapping-ffpkk\" (UID: \"bb2b1f8b-432d-4e7d-a538-28068e9e0bb6\") " pod="openstack/nova-cell1-cell-mapping-ffpkk" Oct 14 13:39:26.529357 master-2 kubenswrapper[4762]: I1014 13:39:26.529152 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb2b1f8b-432d-4e7d-a538-28068e9e0bb6-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-ffpkk\" (UID: \"bb2b1f8b-432d-4e7d-a538-28068e9e0bb6\") " pod="openstack/nova-cell1-cell-mapping-ffpkk" Oct 14 13:39:26.529357 master-2 kubenswrapper[4762]: I1014 13:39:26.529226 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb2b1f8b-432d-4e7d-a538-28068e9e0bb6-scripts\") pod \"nova-cell1-cell-mapping-ffpkk\" (UID: \"bb2b1f8b-432d-4e7d-a538-28068e9e0bb6\") " pod="openstack/nova-cell1-cell-mapping-ffpkk" Oct 14 13:39:26.529357 master-2 kubenswrapper[4762]: I1014 13:39:26.529312 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25d692e8-e19a-475b-bc4e-f22508073ffa-combined-ca-bundle\") pod \"nova-cell1-host-discover-v24fq\" (UID: \"25d692e8-e19a-475b-bc4e-f22508073ffa\") " pod="openstack/nova-cell1-host-discover-v24fq" Oct 14 13:39:26.529685 master-2 kubenswrapper[4762]: I1014 13:39:26.529380 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25d692e8-e19a-475b-bc4e-f22508073ffa-scripts\") pod \"nova-cell1-host-discover-v24fq\" (UID: \"25d692e8-e19a-475b-bc4e-f22508073ffa\") " pod="openstack/nova-cell1-host-discover-v24fq" Oct 14 13:39:26.529685 master-2 kubenswrapper[4762]: I1014 13:39:26.529412 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nh6f\" (UniqueName: \"kubernetes.io/projected/bb2b1f8b-432d-4e7d-a538-28068e9e0bb6-kube-api-access-4nh6f\") pod \"nova-cell1-cell-mapping-ffpkk\" (UID: \"bb2b1f8b-432d-4e7d-a538-28068e9e0bb6\") " pod="openstack/nova-cell1-cell-mapping-ffpkk" Oct 14 13:39:26.529685 master-2 kubenswrapper[4762]: I1014 13:39:26.529561 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4plx7\" (UniqueName: \"kubernetes.io/projected/25d692e8-e19a-475b-bc4e-f22508073ffa-kube-api-access-4plx7\") pod \"nova-cell1-host-discover-v24fq\" (UID: \"25d692e8-e19a-475b-bc4e-f22508073ffa\") " pod="openstack/nova-cell1-host-discover-v24fq" Oct 14 13:39:26.529685 master-2 kubenswrapper[4762]: I1014 13:39:26.529613 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb2b1f8b-432d-4e7d-a538-28068e9e0bb6-config-data\") pod \"nova-cell1-cell-mapping-ffpkk\" (UID: \"bb2b1f8b-432d-4e7d-a538-28068e9e0bb6\") " pod="openstack/nova-cell1-cell-mapping-ffpkk" Oct 14 13:39:26.529685 master-2 kubenswrapper[4762]: I1014 13:39:26.529642 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25d692e8-e19a-475b-bc4e-f22508073ffa-config-data\") pod \"nova-cell1-host-discover-v24fq\" (UID: \"25d692e8-e19a-475b-bc4e-f22508073ffa\") " pod="openstack/nova-cell1-host-discover-v24fq" Oct 14 13:39:26.534417 master-2 kubenswrapper[4762]: I1014 13:39:26.534375 
4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25d692e8-e19a-475b-bc4e-f22508073ffa-config-data\") pod \"nova-cell1-host-discover-v24fq\" (UID: \"25d692e8-e19a-475b-bc4e-f22508073ffa\") " pod="openstack/nova-cell1-host-discover-v24fq" Oct 14 13:39:26.534817 master-2 kubenswrapper[4762]: I1014 13:39:26.534767 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb2b1f8b-432d-4e7d-a538-28068e9e0bb6-scripts\") pod \"nova-cell1-cell-mapping-ffpkk\" (UID: \"bb2b1f8b-432d-4e7d-a538-28068e9e0bb6\") " pod="openstack/nova-cell1-cell-mapping-ffpkk" Oct 14 13:39:26.535032 master-2 kubenswrapper[4762]: I1014 13:39:26.534998 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25d692e8-e19a-475b-bc4e-f22508073ffa-scripts\") pod \"nova-cell1-host-discover-v24fq\" (UID: \"25d692e8-e19a-475b-bc4e-f22508073ffa\") " pod="openstack/nova-cell1-host-discover-v24fq" Oct 14 13:39:26.536002 master-2 kubenswrapper[4762]: I1014 13:39:26.535866 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb2b1f8b-432d-4e7d-a538-28068e9e0bb6-config-data\") pod \"nova-cell1-cell-mapping-ffpkk\" (UID: \"bb2b1f8b-432d-4e7d-a538-28068e9e0bb6\") " pod="openstack/nova-cell1-cell-mapping-ffpkk" Oct 14 13:39:26.536356 master-2 kubenswrapper[4762]: I1014 13:39:26.536318 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25d692e8-e19a-475b-bc4e-f22508073ffa-combined-ca-bundle\") pod \"nova-cell1-host-discover-v24fq\" (UID: \"25d692e8-e19a-475b-bc4e-f22508073ffa\") " pod="openstack/nova-cell1-host-discover-v24fq" Oct 14 13:39:26.538130 master-2 kubenswrapper[4762]: I1014 13:39:26.538065 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb2b1f8b-432d-4e7d-a538-28068e9e0bb6-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-ffpkk\" (UID: \"bb2b1f8b-432d-4e7d-a538-28068e9e0bb6\") " pod="openstack/nova-cell1-cell-mapping-ffpkk" Oct 14 13:39:26.554374 master-2 kubenswrapper[4762]: I1014 13:39:26.554304 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nh6f\" (UniqueName: \"kubernetes.io/projected/bb2b1f8b-432d-4e7d-a538-28068e9e0bb6-kube-api-access-4nh6f\") pod \"nova-cell1-cell-mapping-ffpkk\" (UID: \"bb2b1f8b-432d-4e7d-a538-28068e9e0bb6\") " pod="openstack/nova-cell1-cell-mapping-ffpkk" Oct 14 13:39:26.555055 master-2 kubenswrapper[4762]: I1014 13:39:26.555006 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4plx7\" (UniqueName: \"kubernetes.io/projected/25d692e8-e19a-475b-bc4e-f22508073ffa-kube-api-access-4plx7\") pod \"nova-cell1-host-discover-v24fq\" (UID: \"25d692e8-e19a-475b-bc4e-f22508073ffa\") " pod="openstack/nova-cell1-host-discover-v24fq" Oct 14 13:39:26.572071 master-2 kubenswrapper[4762]: I1014 13:39:26.572004 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-ffpkk" Oct 14 13:39:26.608908 master-2 kubenswrapper[4762]: I1014 13:39:26.608839 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-host-discover-v24fq" Oct 14 13:39:27.098556 master-2 kubenswrapper[4762]: I1014 13:39:27.098423 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-ffpkk"] Oct 14 13:39:27.103760 master-2 kubenswrapper[4762]: W1014 13:39:27.103652 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb2b1f8b_432d_4e7d_a538_28068e9e0bb6.slice/crio-1b3f73184f035661c5da59f00ddfdc34f81a403803892c0a59386ce6469638d5 WatchSource:0}: Error finding container 1b3f73184f035661c5da59f00ddfdc34f81a403803892c0a59386ce6469638d5: Status 404 returned error can't find the container with id 1b3f73184f035661c5da59f00ddfdc34f81a403803892c0a59386ce6469638d5 Oct 14 13:39:27.170040 master-2 kubenswrapper[4762]: W1014 13:39:27.169985 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25d692e8_e19a_475b_bc4e_f22508073ffa.slice/crio-cb9ab7a33bdf260310f32bf412895e5436c0a5ef12f622983c297577d2954804 WatchSource:0}: Error finding container cb9ab7a33bdf260310f32bf412895e5436c0a5ef12f622983c297577d2954804: Status 404 returned error can't find the container with id cb9ab7a33bdf260310f32bf412895e5436c0a5ef12f622983c297577d2954804 Oct 14 13:39:27.189630 master-2 kubenswrapper[4762]: I1014 13:39:27.189190 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-host-discover-v24fq"] Oct 14 13:39:27.628186 master-2 kubenswrapper[4762]: I1014 13:39:27.624085 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-ffpkk" event={"ID":"bb2b1f8b-432d-4e7d-a538-28068e9e0bb6","Type":"ContainerStarted","Data":"827df6fbff6c4750d82e51a1950468ed649de141480999ca39cea24a5908f5a0"} Oct 14 13:39:27.628186 master-2 kubenswrapper[4762]: I1014 13:39:27.624169 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-ffpkk" event={"ID":"bb2b1f8b-432d-4e7d-a538-28068e9e0bb6","Type":"ContainerStarted","Data":"1b3f73184f035661c5da59f00ddfdc34f81a403803892c0a59386ce6469638d5"} Oct 14 13:39:27.628806 master-2 kubenswrapper[4762]: I1014 13:39:27.628359 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-host-discover-v24fq" event={"ID":"25d692e8-e19a-475b-bc4e-f22508073ffa","Type":"ContainerStarted","Data":"cf540aae01ce2e4d6adf94d12e78961660e6e278cfd2d2b091b4dd99a247a131"} Oct 14 13:39:27.628806 master-2 kubenswrapper[4762]: I1014 13:39:27.628407 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-host-discover-v24fq" event={"ID":"25d692e8-e19a-475b-bc4e-f22508073ffa","Type":"ContainerStarted","Data":"cb9ab7a33bdf260310f32bf412895e5436c0a5ef12f622983c297577d2954804"} Oct 14 13:39:27.655889 master-2 kubenswrapper[4762]: I1014 13:39:27.655632 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-ffpkk" podStartSLOduration=1.6556067890000001 podStartE2EDuration="1.655606789s" podCreationTimestamp="2025-10-14 13:39:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:39:27.646485775 +0000 UTC m=+1996.890644944" watchObservedRunningTime="2025-10-14 13:39:27.655606789 +0000 UTC m=+1996.899765968" Oct 14 13:39:27.675112 master-2 kubenswrapper[4762]: I1014 13:39:27.674832 4762 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/nova-cell1-host-discover-v24fq" podStartSLOduration=1.674815428 podStartE2EDuration="1.674815428s" podCreationTimestamp="2025-10-14 13:39:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:39:27.669513133 +0000 UTC m=+1996.913672302" watchObservedRunningTime="2025-10-14 13:39:27.674815428 +0000 UTC m=+1996.918974587" Oct 14 13:39:27.942383 master-2 kubenswrapper[4762]: I1014 13:39:27.942053 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cd59f759-z4zsj"] Oct 14 13:39:27.942677 master-2 kubenswrapper[4762]: I1014 13:39:27.942389 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6cd59f759-z4zsj" podUID="7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a" containerName="dnsmasq-dns" containerID="cri-o://e649a3fec1dc85953f46e25f000d9eab0e7648dd98e9d3e9e8accbeaefab5ce9" gracePeriod=10 Oct 14 13:39:28.151555 master-2 kubenswrapper[4762]: I1014 13:39:28.151490 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="b7a04057-57ba-4fce-9a2a-f09e7333ddd9" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.129.0.160:8775/\": read tcp 10.129.0.2:53650->10.129.0.160:8775: read: connection reset by peer" Oct 14 13:39:28.151740 master-2 kubenswrapper[4762]: I1014 13:39:28.151558 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="b7a04057-57ba-4fce-9a2a-f09e7333ddd9" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.129.0.160:8775/\": read tcp 10.129.0.2:53640->10.129.0.160:8775: read: connection reset by peer" Oct 14 13:39:28.641242 master-2 kubenswrapper[4762]: I1014 13:39:28.641174 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cd59f759-z4zsj" Oct 14 13:39:28.641877 master-2 kubenswrapper[4762]: I1014 13:39:28.641593 4762 generic.go:334] "Generic (PLEG): container finished" podID="b7a04057-57ba-4fce-9a2a-f09e7333ddd9" containerID="37f58c4430902f7cc815a2535a71deb9ecb0506e716fc8179301f3bac2def25f" exitCode=0 Oct 14 13:39:28.641877 master-2 kubenswrapper[4762]: I1014 13:39:28.641646 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b7a04057-57ba-4fce-9a2a-f09e7333ddd9","Type":"ContainerDied","Data":"37f58c4430902f7cc815a2535a71deb9ecb0506e716fc8179301f3bac2def25f"} Oct 14 13:39:28.643689 master-2 kubenswrapper[4762]: I1014 13:39:28.643652 4762 generic.go:334] "Generic (PLEG): container finished" podID="7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a" containerID="e649a3fec1dc85953f46e25f000d9eab0e7648dd98e9d3e9e8accbeaefab5ce9" exitCode=0 Oct 14 13:39:28.644556 master-2 kubenswrapper[4762]: I1014 13:39:28.644520 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cd59f759-z4zsj" Oct 14 13:39:28.644756 master-2 kubenswrapper[4762]: I1014 13:39:28.644721 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cd59f759-z4zsj" event={"ID":"7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a","Type":"ContainerDied","Data":"e649a3fec1dc85953f46e25f000d9eab0e7648dd98e9d3e9e8accbeaefab5ce9"} Oct 14 13:39:28.644756 master-2 kubenswrapper[4762]: I1014 13:39:28.644752 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cd59f759-z4zsj" event={"ID":"7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a","Type":"ContainerDied","Data":"3a822dde7cff55ffa8d510ac7b5544a360075a584e1aebb4e5037c12501f063b"} Oct 14 13:39:28.644885 master-2 kubenswrapper[4762]: I1014 13:39:28.644772 4762 scope.go:117] "RemoveContainer" containerID="e649a3fec1dc85953f46e25f000d9eab0e7648dd98e9d3e9e8accbeaefab5ce9" Oct 14 13:39:28.665977 master-2 kubenswrapper[4762]: I1014 13:39:28.665932 4762 scope.go:117] "RemoveContainer" containerID="66ffe6d4008981f5a6d3d682e9122f4a407083db4cf80d404e86dedf6b914ad1" Oct 14 13:39:28.775804 master-2 kubenswrapper[4762]: I1014 13:39:28.775742 4762 scope.go:117] "RemoveContainer" containerID="e649a3fec1dc85953f46e25f000d9eab0e7648dd98e9d3e9e8accbeaefab5ce9" Oct 14 13:39:28.776441 master-2 kubenswrapper[4762]: E1014 13:39:28.776377 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e649a3fec1dc85953f46e25f000d9eab0e7648dd98e9d3e9e8accbeaefab5ce9\": container with ID starting with e649a3fec1dc85953f46e25f000d9eab0e7648dd98e9d3e9e8accbeaefab5ce9 not found: ID does not exist" containerID="e649a3fec1dc85953f46e25f000d9eab0e7648dd98e9d3e9e8accbeaefab5ce9" Oct 14 13:39:28.776509 master-2 kubenswrapper[4762]: I1014 13:39:28.776448 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e649a3fec1dc85953f46e25f000d9eab0e7648dd98e9d3e9e8accbeaefab5ce9"} err="failed to get container status \"e649a3fec1dc85953f46e25f000d9eab0e7648dd98e9d3e9e8accbeaefab5ce9\": rpc error: code = NotFound desc = could not find container \"e649a3fec1dc85953f46e25f000d9eab0e7648dd98e9d3e9e8accbeaefab5ce9\": container with ID starting with e649a3fec1dc85953f46e25f000d9eab0e7648dd98e9d3e9e8accbeaefab5ce9 not found: ID does not exist" Oct 14 13:39:28.776509 master-2 kubenswrapper[4762]: I1014 13:39:28.776485 4762 scope.go:117] "RemoveContainer" containerID="66ffe6d4008981f5a6d3d682e9122f4a407083db4cf80d404e86dedf6b914ad1" Oct 14 13:39:28.776939 master-2 kubenswrapper[4762]: E1014 13:39:28.776902 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66ffe6d4008981f5a6d3d682e9122f4a407083db4cf80d404e86dedf6b914ad1\": container with ID starting with 66ffe6d4008981f5a6d3d682e9122f4a407083db4cf80d404e86dedf6b914ad1 not found: ID does not exist" containerID="66ffe6d4008981f5a6d3d682e9122f4a407083db4cf80d404e86dedf6b914ad1" Oct 14 13:39:28.776995 master-2 kubenswrapper[4762]: I1014 13:39:28.776934 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66ffe6d4008981f5a6d3d682e9122f4a407083db4cf80d404e86dedf6b914ad1"} err="failed to get container status \"66ffe6d4008981f5a6d3d682e9122f4a407083db4cf80d404e86dedf6b914ad1\": rpc error: code = NotFound desc = could not find container \"66ffe6d4008981f5a6d3d682e9122f4a407083db4cf80d404e86dedf6b914ad1\": container with ID starting with 
66ffe6d4008981f5a6d3d682e9122f4a407083db4cf80d404e86dedf6b914ad1 not found: ID does not exist" Oct 14 13:39:28.800648 master-2 kubenswrapper[4762]: I1014 13:39:28.800586 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a-config\") pod \"7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a\" (UID: \"7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a\") " Oct 14 13:39:28.800746 master-2 kubenswrapper[4762]: I1014 13:39:28.800675 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a-dns-svc\") pod \"7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a\" (UID: \"7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a\") " Oct 14 13:39:28.800746 master-2 kubenswrapper[4762]: I1014 13:39:28.800715 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h78r5\" (UniqueName: \"kubernetes.io/projected/7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a-kube-api-access-h78r5\") pod \"7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a\" (UID: \"7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a\") " Oct 14 13:39:28.800893 master-2 kubenswrapper[4762]: I1014 13:39:28.800852 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a-dns-swift-storage-0\") pod \"7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a\" (UID: \"7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a\") " Oct 14 13:39:28.800954 master-2 kubenswrapper[4762]: I1014 13:39:28.800898 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a-ovsdbserver-nb\") pod \"7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a\" (UID: \"7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a\") " Oct 14 13:39:28.800954 master-2 kubenswrapper[4762]: I1014 13:39:28.800922 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a-ovsdbserver-sb\") pod \"7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a\" (UID: \"7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a\") " Oct 14 13:39:28.805135 master-2 kubenswrapper[4762]: I1014 13:39:28.805035 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a-kube-api-access-h78r5" (OuterVolumeSpecName: "kube-api-access-h78r5") pod "7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a" (UID: "7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a"). InnerVolumeSpecName "kube-api-access-h78r5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:39:28.809116 master-2 kubenswrapper[4762]: I1014 13:39:28.809064 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 14 13:39:28.850911 master-2 kubenswrapper[4762]: I1014 13:39:28.850830 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a" (UID: "7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:39:28.853334 master-2 kubenswrapper[4762]: I1014 13:39:28.853017 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a" (UID: "7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:39:28.859599 master-2 kubenswrapper[4762]: I1014 13:39:28.859494 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a" (UID: "7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:39:28.859811 master-2 kubenswrapper[4762]: I1014 13:39:28.859501 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a" (UID: "7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:39:28.875824 master-2 kubenswrapper[4762]: I1014 13:39:28.875543 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a-config" (OuterVolumeSpecName: "config") pod "7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a" (UID: "7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 14 13:39:28.906036 master-2 kubenswrapper[4762]: I1014 13:39:28.903021 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7a04057-57ba-4fce-9a2a-f09e7333ddd9-config-data\") pod \"b7a04057-57ba-4fce-9a2a-f09e7333ddd9\" (UID: \"b7a04057-57ba-4fce-9a2a-f09e7333ddd9\") " Oct 14 13:39:28.906036 master-2 kubenswrapper[4762]: I1014 13:39:28.903191 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7a04057-57ba-4fce-9a2a-f09e7333ddd9-combined-ca-bundle\") pod \"b7a04057-57ba-4fce-9a2a-f09e7333ddd9\" (UID: \"b7a04057-57ba-4fce-9a2a-f09e7333ddd9\") " Oct 14 13:39:28.906036 master-2 kubenswrapper[4762]: I1014 13:39:28.903261 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lb65w\" (UniqueName: \"kubernetes.io/projected/b7a04057-57ba-4fce-9a2a-f09e7333ddd9-kube-api-access-lb65w\") pod \"b7a04057-57ba-4fce-9a2a-f09e7333ddd9\" (UID: \"b7a04057-57ba-4fce-9a2a-f09e7333ddd9\") " Oct 14 13:39:28.906036 master-2 kubenswrapper[4762]: I1014 13:39:28.903319 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7a04057-57ba-4fce-9a2a-f09e7333ddd9-logs\") pod \"b7a04057-57ba-4fce-9a2a-f09e7333ddd9\" (UID: \"b7a04057-57ba-4fce-9a2a-f09e7333ddd9\") " Oct 14 13:39:28.906036 master-2 kubenswrapper[4762]: I1014 13:39:28.903656 4762 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a-dns-swift-storage-0\") on node 
\"master-2\" DevicePath \"\"" Oct 14 13:39:28.906036 master-2 kubenswrapper[4762]: I1014 13:39:28.903668 4762 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a-ovsdbserver-nb\") on node \"master-2\" DevicePath \"\"" Oct 14 13:39:28.906036 master-2 kubenswrapper[4762]: I1014 13:39:28.903679 4762 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a-ovsdbserver-sb\") on node \"master-2\" DevicePath \"\"" Oct 14 13:39:28.906036 master-2 kubenswrapper[4762]: I1014 13:39:28.903688 4762 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a-config\") on node \"master-2\" DevicePath \"\"" Oct 14 13:39:28.906036 master-2 kubenswrapper[4762]: I1014 13:39:28.903696 4762 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a-dns-svc\") on node \"master-2\" DevicePath \"\"" Oct 14 13:39:28.906036 master-2 kubenswrapper[4762]: I1014 13:39:28.903705 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h78r5\" (UniqueName: \"kubernetes.io/projected/7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a-kube-api-access-h78r5\") on node \"master-2\" DevicePath \"\"" Oct 14 13:39:28.906036 master-2 kubenswrapper[4762]: I1014 13:39:28.904084 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7a04057-57ba-4fce-9a2a-f09e7333ddd9-logs" (OuterVolumeSpecName: "logs") pod "b7a04057-57ba-4fce-9a2a-f09e7333ddd9" (UID: "b7a04057-57ba-4fce-9a2a-f09e7333ddd9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:39:28.907194 master-2 kubenswrapper[4762]: I1014 13:39:28.907116 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7a04057-57ba-4fce-9a2a-f09e7333ddd9-kube-api-access-lb65w" (OuterVolumeSpecName: "kube-api-access-lb65w") pod "b7a04057-57ba-4fce-9a2a-f09e7333ddd9" (UID: "b7a04057-57ba-4fce-9a2a-f09e7333ddd9"). InnerVolumeSpecName "kube-api-access-lb65w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:39:28.924459 master-2 kubenswrapper[4762]: I1014 13:39:28.924382 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7a04057-57ba-4fce-9a2a-f09e7333ddd9-config-data" (OuterVolumeSpecName: "config-data") pod "b7a04057-57ba-4fce-9a2a-f09e7333ddd9" (UID: "b7a04057-57ba-4fce-9a2a-f09e7333ddd9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:39:28.924810 master-2 kubenswrapper[4762]: I1014 13:39:28.924727 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7a04057-57ba-4fce-9a2a-f09e7333ddd9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b7a04057-57ba-4fce-9a2a-f09e7333ddd9" (UID: "b7a04057-57ba-4fce-9a2a-f09e7333ddd9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:39:29.014561 master-2 kubenswrapper[4762]: I1014 13:39:29.014480 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7a04057-57ba-4fce-9a2a-f09e7333ddd9-combined-ca-bundle\") on node \"master-2\" DevicePath \"\"" Oct 14 13:39:29.014561 master-2 kubenswrapper[4762]: I1014 13:39:29.014533 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lb65w\" (UniqueName: \"kubernetes.io/projected/b7a04057-57ba-4fce-9a2a-f09e7333ddd9-kube-api-access-lb65w\") on node \"master-2\" DevicePath \"\"" Oct 14 13:39:29.014561 master-2 kubenswrapper[4762]: I1014 13:39:29.014548 4762 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7a04057-57ba-4fce-9a2a-f09e7333ddd9-logs\") on node \"master-2\" DevicePath \"\"" Oct 14 13:39:29.014561 master-2 kubenswrapper[4762]: I1014 13:39:29.014564 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7a04057-57ba-4fce-9a2a-f09e7333ddd9-config-data\") on node \"master-2\" DevicePath \"\"" Oct 14 13:39:29.020679 master-2 kubenswrapper[4762]: I1014 13:39:29.020615 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cd59f759-z4zsj"] Oct 14 13:39:29.034772 master-2 kubenswrapper[4762]: I1014 13:39:29.033246 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6cd59f759-z4zsj"] Oct 14 13:39:29.574719 master-2 kubenswrapper[4762]: I1014 13:39:29.574557 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a" path="/var/lib/kubelet/pods/7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a/volumes" Oct 14 13:39:29.656098 master-2 kubenswrapper[4762]: I1014 13:39:29.656044 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b7a04057-57ba-4fce-9a2a-f09e7333ddd9","Type":"ContainerDied","Data":"75d8f3ded3fc825a61198aaa67a506bfde480ac8f25f1d79f648b0577ccfe985"} Oct 14 13:39:29.656098 master-2 kubenswrapper[4762]: I1014 13:39:29.656100 4762 scope.go:117] "RemoveContainer" containerID="37f58c4430902f7cc815a2535a71deb9ecb0506e716fc8179301f3bac2def25f" Oct 14 13:39:29.656721 master-2 kubenswrapper[4762]: I1014 13:39:29.656172 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 14 13:39:29.681524 master-2 kubenswrapper[4762]: I1014 13:39:29.681471 4762 scope.go:117] "RemoveContainer" containerID="dc764dd0b9815f8afba97ac6c1fa5f82a8f91bdd751d70c869bf014f9115ce83" Oct 14 13:39:29.705180 master-2 kubenswrapper[4762]: I1014 13:39:29.705091 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 14 13:39:29.718017 master-2 kubenswrapper[4762]: I1014 13:39:29.717932 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 14 13:39:29.755176 master-2 kubenswrapper[4762]: I1014 13:39:29.755092 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 14 13:39:29.755440 master-2 kubenswrapper[4762]: E1014 13:39:29.755423 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a" containerName="init" Oct 14 13:39:29.755440 master-2 kubenswrapper[4762]: I1014 13:39:29.755438 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a" containerName="init" Oct 14 13:39:29.755597 master-2 kubenswrapper[4762]: E1014 13:39:29.755466 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7a04057-57ba-4fce-9a2a-f09e7333ddd9" containerName="nova-metadata-metadata" Oct 14 13:39:29.755597 master-2 kubenswrapper[4762]: I1014 13:39:29.755473 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7a04057-57ba-4fce-9a2a-f09e7333ddd9" containerName="nova-metadata-metadata" Oct 14 13:39:29.755597 master-2 kubenswrapper[4762]: E1014 13:39:29.755484 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7a04057-57ba-4fce-9a2a-f09e7333ddd9" containerName="nova-metadata-log" Oct 14 13:39:29.755597 master-2 kubenswrapper[4762]: I1014 13:39:29.755492 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7a04057-57ba-4fce-9a2a-f09e7333ddd9" containerName="nova-metadata-log" Oct 14 13:39:29.755597 master-2 kubenswrapper[4762]: E1014 13:39:29.755514 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a" containerName="dnsmasq-dns" Oct 14 13:39:29.755597 master-2 kubenswrapper[4762]: I1014 13:39:29.755522 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a" containerName="dnsmasq-dns" Oct 14 13:39:29.755847 master-2 kubenswrapper[4762]: I1014 13:39:29.755662 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7a04057-57ba-4fce-9a2a-f09e7333ddd9" containerName="nova-metadata-metadata" Oct 14 13:39:29.755847 master-2 kubenswrapper[4762]: I1014 13:39:29.755675 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7a04057-57ba-4fce-9a2a-f09e7333ddd9" containerName="nova-metadata-log" Oct 14 13:39:29.755847 master-2 kubenswrapper[4762]: I1014 13:39:29.755699 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a" containerName="dnsmasq-dns" Oct 14 13:39:29.757108 master-2 kubenswrapper[4762]: I1014 13:39:29.757085 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 14 13:39:29.759876 master-2 kubenswrapper[4762]: I1014 13:39:29.759817 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 14 13:39:29.760355 master-2 kubenswrapper[4762]: I1014 13:39:29.760321 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 14 13:39:29.781707 master-2 kubenswrapper[4762]: I1014 13:39:29.781398 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 14 13:39:29.943620 master-2 kubenswrapper[4762]: I1014 13:39:29.943530 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0add25e0-b2b4-43c3-8c55-723601ab9432-config-data\") pod \"nova-metadata-0\" (UID: \"0add25e0-b2b4-43c3-8c55-723601ab9432\") " pod="openstack/nova-metadata-0" Oct 14 13:39:29.943620 master-2 kubenswrapper[4762]: I1014 13:39:29.943616 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlv5z\" (UniqueName: \"kubernetes.io/projected/0add25e0-b2b4-43c3-8c55-723601ab9432-kube-api-access-hlv5z\") pod \"nova-metadata-0\" (UID: \"0add25e0-b2b4-43c3-8c55-723601ab9432\") " pod="openstack/nova-metadata-0" Oct 14 13:39:29.944051 master-2 kubenswrapper[4762]: I1014 13:39:29.943709 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0add25e0-b2b4-43c3-8c55-723601ab9432-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0add25e0-b2b4-43c3-8c55-723601ab9432\") " pod="openstack/nova-metadata-0" Oct 14 13:39:29.944051 master-2 kubenswrapper[4762]: I1014 13:39:29.943776 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0add25e0-b2b4-43c3-8c55-723601ab9432-logs\") pod \"nova-metadata-0\" (UID: \"0add25e0-b2b4-43c3-8c55-723601ab9432\") " pod="openstack/nova-metadata-0" Oct 14 13:39:29.944051 master-2 kubenswrapper[4762]: I1014 13:39:29.943841 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0add25e0-b2b4-43c3-8c55-723601ab9432-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0add25e0-b2b4-43c3-8c55-723601ab9432\") " pod="openstack/nova-metadata-0" Oct 14 13:39:30.045960 master-2 kubenswrapper[4762]: I1014 13:39:30.045877 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0add25e0-b2b4-43c3-8c55-723601ab9432-config-data\") pod \"nova-metadata-0\" (UID: \"0add25e0-b2b4-43c3-8c55-723601ab9432\") " pod="openstack/nova-metadata-0" Oct 14 13:39:30.045960 master-2 kubenswrapper[4762]: I1014 13:39:30.045948 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlv5z\" (UniqueName: \"kubernetes.io/projected/0add25e0-b2b4-43c3-8c55-723601ab9432-kube-api-access-hlv5z\") pod \"nova-metadata-0\" (UID: \"0add25e0-b2b4-43c3-8c55-723601ab9432\") " pod="openstack/nova-metadata-0" Oct 14 13:39:30.045960 master-2 kubenswrapper[4762]: I1014 13:39:30.045984 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/0add25e0-b2b4-43c3-8c55-723601ab9432-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0add25e0-b2b4-43c3-8c55-723601ab9432\") " pod="openstack/nova-metadata-0" Oct 14 13:39:30.046508 master-2 kubenswrapper[4762]: I1014 13:39:30.046029 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0add25e0-b2b4-43c3-8c55-723601ab9432-logs\") pod \"nova-metadata-0\" (UID: \"0add25e0-b2b4-43c3-8c55-723601ab9432\") " pod="openstack/nova-metadata-0" Oct 14 13:39:30.046508 master-2 kubenswrapper[4762]: I1014 13:39:30.046077 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0add25e0-b2b4-43c3-8c55-723601ab9432-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0add25e0-b2b4-43c3-8c55-723601ab9432\") " pod="openstack/nova-metadata-0" Oct 14 13:39:30.046842 master-2 kubenswrapper[4762]: I1014 13:39:30.046781 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0add25e0-b2b4-43c3-8c55-723601ab9432-logs\") pod \"nova-metadata-0\" (UID: \"0add25e0-b2b4-43c3-8c55-723601ab9432\") " pod="openstack/nova-metadata-0" Oct 14 13:39:30.051181 master-2 kubenswrapper[4762]: I1014 13:39:30.050132 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0add25e0-b2b4-43c3-8c55-723601ab9432-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0add25e0-b2b4-43c3-8c55-723601ab9432\") " pod="openstack/nova-metadata-0" Oct 14 13:39:30.051181 master-2 kubenswrapper[4762]: I1014 13:39:30.050696 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0add25e0-b2b4-43c3-8c55-723601ab9432-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0add25e0-b2b4-43c3-8c55-723601ab9432\") " pod="openstack/nova-metadata-0" Oct 14 13:39:30.067077 master-2 kubenswrapper[4762]: I1014 13:39:30.066995 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0add25e0-b2b4-43c3-8c55-723601ab9432-config-data\") pod \"nova-metadata-0\" (UID: \"0add25e0-b2b4-43c3-8c55-723601ab9432\") " pod="openstack/nova-metadata-0" Oct 14 13:39:30.073049 master-2 kubenswrapper[4762]: I1014 13:39:30.072979 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlv5z\" (UniqueName: \"kubernetes.io/projected/0add25e0-b2b4-43c3-8c55-723601ab9432-kube-api-access-hlv5z\") pod \"nova-metadata-0\" (UID: \"0add25e0-b2b4-43c3-8c55-723601ab9432\") " pod="openstack/nova-metadata-0" Oct 14 13:39:30.077979 master-2 kubenswrapper[4762]: I1014 13:39:30.077686 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 14 13:39:30.542953 master-2 kubenswrapper[4762]: I1014 13:39:30.542873 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 14 13:39:30.679087 master-2 kubenswrapper[4762]: I1014 13:39:30.674696 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0add25e0-b2b4-43c3-8c55-723601ab9432","Type":"ContainerStarted","Data":"c22a4b42454ca03bb37802d6072a818bf06fad5410ccbf2a9b52d0e156af41c4"} Oct 14 13:39:30.679087 master-2 kubenswrapper[4762]: I1014 13:39:30.677418 4762 generic.go:334] "Generic (PLEG): container finished" podID="25d692e8-e19a-475b-bc4e-f22508073ffa" containerID="cf540aae01ce2e4d6adf94d12e78961660e6e278cfd2d2b091b4dd99a247a131" exitCode=0 Oct 14 13:39:30.679087 master-2 kubenswrapper[4762]: I1014 13:39:30.677489 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-host-discover-v24fq" event={"ID":"25d692e8-e19a-475b-bc4e-f22508073ffa","Type":"ContainerDied","Data":"cf540aae01ce2e4d6adf94d12e78961660e6e278cfd2d2b091b4dd99a247a131"} Oct 14 13:39:31.562553 master-2 kubenswrapper[4762]: I1014 13:39:31.562498 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7a04057-57ba-4fce-9a2a-f09e7333ddd9" path="/var/lib/kubelet/pods/b7a04057-57ba-4fce-9a2a-f09e7333ddd9/volumes" Oct 14 13:39:31.689812 master-2 kubenswrapper[4762]: I1014 13:39:31.689737 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0add25e0-b2b4-43c3-8c55-723601ab9432","Type":"ContainerStarted","Data":"10213f95de1e9c66e9f277911356ed1f4f72178c0f58354fce1f6bd9357573e2"} Oct 14 13:39:31.690892 master-2 kubenswrapper[4762]: I1014 13:39:31.690843 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0add25e0-b2b4-43c3-8c55-723601ab9432","Type":"ContainerStarted","Data":"fb20d2acb4e9717ddb43b5dd802458df143a653b646009ed3e2a84064763486f"} Oct 14 13:39:31.726764 master-2 kubenswrapper[4762]: I1014 13:39:31.726647 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.726618118 podStartE2EDuration="2.726618118s" podCreationTimestamp="2025-10-14 13:39:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:39:31.718627198 +0000 UTC m=+2000.962786357" watchObservedRunningTime="2025-10-14 13:39:31.726618118 +0000 UTC m=+2000.970777277" Oct 14 13:39:32.130569 master-2 kubenswrapper[4762]: I1014 13:39:32.130504 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-host-discover-v24fq" Oct 14 13:39:32.298000 master-2 kubenswrapper[4762]: I1014 13:39:32.297803 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25d692e8-e19a-475b-bc4e-f22508073ffa-config-data\") pod \"25d692e8-e19a-475b-bc4e-f22508073ffa\" (UID: \"25d692e8-e19a-475b-bc4e-f22508073ffa\") " Oct 14 13:39:32.298323 master-2 kubenswrapper[4762]: I1014 13:39:32.298191 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25d692e8-e19a-475b-bc4e-f22508073ffa-scripts\") pod \"25d692e8-e19a-475b-bc4e-f22508073ffa\" (UID: \"25d692e8-e19a-475b-bc4e-f22508073ffa\") " Oct 14 13:39:32.298418 master-2 kubenswrapper[4762]: I1014 13:39:32.298372 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4plx7\" (UniqueName: \"kubernetes.io/projected/25d692e8-e19a-475b-bc4e-f22508073ffa-kube-api-access-4plx7\") pod \"25d692e8-e19a-475b-bc4e-f22508073ffa\" (UID: \"25d692e8-e19a-475b-bc4e-f22508073ffa\") " Oct 14 13:39:32.298553 master-2 kubenswrapper[4762]: I1014 13:39:32.298513 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25d692e8-e19a-475b-bc4e-f22508073ffa-combined-ca-bundle\") pod \"25d692e8-e19a-475b-bc4e-f22508073ffa\" (UID: \"25d692e8-e19a-475b-bc4e-f22508073ffa\") " Oct 14 13:39:32.304061 master-2 kubenswrapper[4762]: I1014 13:39:32.304002 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25d692e8-e19a-475b-bc4e-f22508073ffa-scripts" (OuterVolumeSpecName: "scripts") pod "25d692e8-e19a-475b-bc4e-f22508073ffa" (UID: "25d692e8-e19a-475b-bc4e-f22508073ffa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:39:32.305041 master-2 kubenswrapper[4762]: I1014 13:39:32.304968 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25d692e8-e19a-475b-bc4e-f22508073ffa-kube-api-access-4plx7" (OuterVolumeSpecName: "kube-api-access-4plx7") pod "25d692e8-e19a-475b-bc4e-f22508073ffa" (UID: "25d692e8-e19a-475b-bc4e-f22508073ffa"). InnerVolumeSpecName "kube-api-access-4plx7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:39:32.320830 master-2 kubenswrapper[4762]: I1014 13:39:32.320757 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25d692e8-e19a-475b-bc4e-f22508073ffa-config-data" (OuterVolumeSpecName: "config-data") pod "25d692e8-e19a-475b-bc4e-f22508073ffa" (UID: "25d692e8-e19a-475b-bc4e-f22508073ffa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:39:32.330397 master-2 kubenswrapper[4762]: I1014 13:39:32.330288 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25d692e8-e19a-475b-bc4e-f22508073ffa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "25d692e8-e19a-475b-bc4e-f22508073ffa" (UID: "25d692e8-e19a-475b-bc4e-f22508073ffa"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:39:32.401454 master-2 kubenswrapper[4762]: I1014 13:39:32.401378 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25d692e8-e19a-475b-bc4e-f22508073ffa-config-data\") on node \"master-2\" DevicePath \"\"" Oct 14 13:39:32.401454 master-2 kubenswrapper[4762]: I1014 13:39:32.401435 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25d692e8-e19a-475b-bc4e-f22508073ffa-scripts\") on node \"master-2\" DevicePath \"\"" Oct 14 13:39:32.401454 master-2 kubenswrapper[4762]: I1014 13:39:32.401448 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4plx7\" (UniqueName: \"kubernetes.io/projected/25d692e8-e19a-475b-bc4e-f22508073ffa-kube-api-access-4plx7\") on node \"master-2\" DevicePath \"\"" Oct 14 13:39:32.401454 master-2 kubenswrapper[4762]: I1014 13:39:32.401465 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25d692e8-e19a-475b-bc4e-f22508073ffa-combined-ca-bundle\") on node \"master-2\" DevicePath \"\"" Oct 14 13:39:32.703641 master-2 kubenswrapper[4762]: I1014 13:39:32.703538 4762 generic.go:334] "Generic (PLEG): container finished" podID="bb2b1f8b-432d-4e7d-a538-28068e9e0bb6" containerID="827df6fbff6c4750d82e51a1950468ed649de141480999ca39cea24a5908f5a0" exitCode=0 Oct 14 13:39:32.704187 master-2 kubenswrapper[4762]: I1014 13:39:32.703652 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-ffpkk" event={"ID":"bb2b1f8b-432d-4e7d-a538-28068e9e0bb6","Type":"ContainerDied","Data":"827df6fbff6c4750d82e51a1950468ed649de141480999ca39cea24a5908f5a0"} Oct 14 13:39:32.706318 master-2 kubenswrapper[4762]: I1014 13:39:32.706284 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-host-discover-v24fq" Oct 14 13:39:32.706391 master-2 kubenswrapper[4762]: I1014 13:39:32.706258 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-host-discover-v24fq" event={"ID":"25d692e8-e19a-475b-bc4e-f22508073ffa","Type":"ContainerDied","Data":"cb9ab7a33bdf260310f32bf412895e5436c0a5ef12f622983c297577d2954804"} Oct 14 13:39:32.706391 master-2 kubenswrapper[4762]: I1014 13:39:32.706365 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb9ab7a33bdf260310f32bf412895e5436c0a5ef12f622983c297577d2954804" Oct 14 13:39:33.256480 master-2 kubenswrapper[4762]: I1014 13:39:33.256398 4762 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6cd59f759-z4zsj" podUID="7dcbcdc7-d75f-499f-b97b-06e93e5c9c1a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.129.0.161:5353: i/o timeout" Oct 14 13:39:34.111787 master-2 kubenswrapper[4762]: I1014 13:39:34.111723 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-ffpkk" Oct 14 13:39:34.252417 master-2 kubenswrapper[4762]: I1014 13:39:34.252361 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb2b1f8b-432d-4e7d-a538-28068e9e0bb6-combined-ca-bundle\") pod \"bb2b1f8b-432d-4e7d-a538-28068e9e0bb6\" (UID: \"bb2b1f8b-432d-4e7d-a538-28068e9e0bb6\") " Oct 14 13:39:34.252743 master-2 kubenswrapper[4762]: I1014 13:39:34.252543 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb2b1f8b-432d-4e7d-a538-28068e9e0bb6-config-data\") pod \"bb2b1f8b-432d-4e7d-a538-28068e9e0bb6\" (UID: \"bb2b1f8b-432d-4e7d-a538-28068e9e0bb6\") " Oct 14 13:39:34.252743 master-2 kubenswrapper[4762]: I1014 13:39:34.252585 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nh6f\" (UniqueName: \"kubernetes.io/projected/bb2b1f8b-432d-4e7d-a538-28068e9e0bb6-kube-api-access-4nh6f\") pod \"bb2b1f8b-432d-4e7d-a538-28068e9e0bb6\" (UID: \"bb2b1f8b-432d-4e7d-a538-28068e9e0bb6\") " Oct 14 13:39:34.252743 master-2 kubenswrapper[4762]: I1014 13:39:34.252729 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb2b1f8b-432d-4e7d-a538-28068e9e0bb6-scripts\") pod \"bb2b1f8b-432d-4e7d-a538-28068e9e0bb6\" (UID: \"bb2b1f8b-432d-4e7d-a538-28068e9e0bb6\") " Oct 14 13:39:34.255750 master-2 kubenswrapper[4762]: I1014 13:39:34.255707 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb2b1f8b-432d-4e7d-a538-28068e9e0bb6-scripts" (OuterVolumeSpecName: "scripts") pod "bb2b1f8b-432d-4e7d-a538-28068e9e0bb6" (UID: "bb2b1f8b-432d-4e7d-a538-28068e9e0bb6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:39:34.258362 master-2 kubenswrapper[4762]: I1014 13:39:34.258295 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb2b1f8b-432d-4e7d-a538-28068e9e0bb6-kube-api-access-4nh6f" (OuterVolumeSpecName: "kube-api-access-4nh6f") pod "bb2b1f8b-432d-4e7d-a538-28068e9e0bb6" (UID: "bb2b1f8b-432d-4e7d-a538-28068e9e0bb6"). InnerVolumeSpecName "kube-api-access-4nh6f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:39:34.272568 master-2 kubenswrapper[4762]: I1014 13:39:34.272510 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb2b1f8b-432d-4e7d-a538-28068e9e0bb6-config-data" (OuterVolumeSpecName: "config-data") pod "bb2b1f8b-432d-4e7d-a538-28068e9e0bb6" (UID: "bb2b1f8b-432d-4e7d-a538-28068e9e0bb6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:39:34.272786 master-2 kubenswrapper[4762]: I1014 13:39:34.272750 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb2b1f8b-432d-4e7d-a538-28068e9e0bb6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bb2b1f8b-432d-4e7d-a538-28068e9e0bb6" (UID: "bb2b1f8b-432d-4e7d-a538-28068e9e0bb6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:39:34.355809 master-2 kubenswrapper[4762]: I1014 13:39:34.355719 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb2b1f8b-432d-4e7d-a538-28068e9e0bb6-config-data\") on node \"master-2\" DevicePath \"\"" Oct 14 13:39:34.355809 master-2 kubenswrapper[4762]: I1014 13:39:34.355794 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nh6f\" (UniqueName: \"kubernetes.io/projected/bb2b1f8b-432d-4e7d-a538-28068e9e0bb6-kube-api-access-4nh6f\") on node \"master-2\" DevicePath \"\"" Oct 14 13:39:34.355809 master-2 kubenswrapper[4762]: I1014 13:39:34.355815 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb2b1f8b-432d-4e7d-a538-28068e9e0bb6-scripts\") on node \"master-2\" DevicePath \"\"" Oct 14 13:39:34.355809 master-2 kubenswrapper[4762]: I1014 13:39:34.355834 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb2b1f8b-432d-4e7d-a538-28068e9e0bb6-combined-ca-bundle\") on node \"master-2\" DevicePath \"\"" Oct 14 13:39:34.726708 master-2 kubenswrapper[4762]: I1014 13:39:34.726580 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-ffpkk" event={"ID":"bb2b1f8b-432d-4e7d-a538-28068e9e0bb6","Type":"ContainerDied","Data":"1b3f73184f035661c5da59f00ddfdc34f81a403803892c0a59386ce6469638d5"} Oct 14 13:39:34.726708 master-2 kubenswrapper[4762]: I1014 13:39:34.726643 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b3f73184f035661c5da59f00ddfdc34f81a403803892c0a59386ce6469638d5" Oct 14 13:39:34.726708 master-2 kubenswrapper[4762]: I1014 13:39:34.726676 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-ffpkk" Oct 14 13:39:35.078022 master-2 kubenswrapper[4762]: I1014 13:39:35.077848 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 14 13:39:35.078022 master-2 kubenswrapper[4762]: I1014 13:39:35.077942 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 14 13:39:40.078765 master-2 kubenswrapper[4762]: I1014 13:39:40.078692 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 14 13:39:40.079410 master-2 kubenswrapper[4762]: I1014 13:39:40.078782 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 14 13:39:41.096604 master-2 kubenswrapper[4762]: I1014 13:39:41.096503 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="0add25e0-b2b4-43c3-8c55-723601ab9432" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.129.0.171:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 14 13:39:41.097340 master-2 kubenswrapper[4762]: I1014 13:39:41.097141 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="0add25e0-b2b4-43c3-8c55-723601ab9432" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.129.0.171:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 14 13:39:42.855822 master-2 kubenswrapper[4762]: I1014 13:39:42.855729 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 14 13:39:50.088540 master-2 kubenswrapper[4762]: I1014 13:39:50.088452 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 14 13:39:50.090084 master-2 kubenswrapper[4762]: I1014 13:39:50.089995 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 14 13:39:50.102506 master-2 kubenswrapper[4762]: I1014 13:39:50.102436 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 14 13:39:50.920972 master-2 kubenswrapper[4762]: I1014 13:39:50.920875 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 14 13:40:02.564644 master-2 kubenswrapper[4762]: I1014 13:40:02.564358 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:40:02.566028 master-2 kubenswrapper[4762]: I1014 13:40:02.564719 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6880510a-48a6-48f8-b644-4fd24cff01a0" containerName="ceilometer-central-agent" containerID="cri-o://b8ed7429b07aa22e78df15760985b2cdce61bba49580b6d98c5fd63278867083" gracePeriod=30 Oct 14 13:40:02.566028 master-2 kubenswrapper[4762]: I1014 13:40:02.564870 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6880510a-48a6-48f8-b644-4fd24cff01a0" containerName="sg-core" containerID="cri-o://c6171a284684ba32a220ddade705f6e0c440c7b6ece081b2329d94a38bc6ee15" gracePeriod=30 Oct 14 13:40:02.566028 master-2 kubenswrapper[4762]: I1014 13:40:02.564863 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="6880510a-48a6-48f8-b644-4fd24cff01a0" containerName="proxy-httpd" containerID="cri-o://1b5c20d17fda4ad3770b4104fae746eb568bd2a6efe718d901b0d3d3809fa651" gracePeriod=30 Oct 14 13:40:02.566028 master-2 kubenswrapper[4762]: I1014 13:40:02.564921 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6880510a-48a6-48f8-b644-4fd24cff01a0" containerName="ceilometer-notification-agent" containerID="cri-o://a9e8d6edd96be388c07a9243a3bce483419cf5055c80a43b1b903268c6551d0c" gracePeriod=30 Oct 14 13:40:03.037260 master-2 kubenswrapper[4762]: I1014 13:40:03.030351 4762 generic.go:334] "Generic (PLEG): container finished" podID="6880510a-48a6-48f8-b644-4fd24cff01a0" containerID="1b5c20d17fda4ad3770b4104fae746eb568bd2a6efe718d901b0d3d3809fa651" exitCode=0 Oct 14 13:40:03.037260 master-2 kubenswrapper[4762]: I1014 13:40:03.030399 4762 generic.go:334] "Generic (PLEG): container finished" podID="6880510a-48a6-48f8-b644-4fd24cff01a0" containerID="c6171a284684ba32a220ddade705f6e0c440c7b6ece081b2329d94a38bc6ee15" exitCode=2 Oct 14 13:40:03.037260 master-2 kubenswrapper[4762]: I1014 13:40:03.030409 4762 generic.go:334] "Generic (PLEG): container finished" podID="6880510a-48a6-48f8-b644-4fd24cff01a0" containerID="a9e8d6edd96be388c07a9243a3bce483419cf5055c80a43b1b903268c6551d0c" exitCode=0 Oct 14 13:40:03.037260 master-2 kubenswrapper[4762]: I1014 13:40:03.030419 4762 generic.go:334] "Generic (PLEG): container finished" podID="6880510a-48a6-48f8-b644-4fd24cff01a0" containerID="b8ed7429b07aa22e78df15760985b2cdce61bba49580b6d98c5fd63278867083" exitCode=0 Oct 14 13:40:03.037260 master-2 kubenswrapper[4762]: I1014 13:40:03.030430 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6880510a-48a6-48f8-b644-4fd24cff01a0","Type":"ContainerDied","Data":"1b5c20d17fda4ad3770b4104fae746eb568bd2a6efe718d901b0d3d3809fa651"} Oct 14 13:40:03.037260 master-2 kubenswrapper[4762]: I1014 13:40:03.030495 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6880510a-48a6-48f8-b644-4fd24cff01a0","Type":"ContainerDied","Data":"c6171a284684ba32a220ddade705f6e0c440c7b6ece081b2329d94a38bc6ee15"} Oct 14 13:40:03.037260 master-2 kubenswrapper[4762]: I1014 13:40:03.030510 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6880510a-48a6-48f8-b644-4fd24cff01a0","Type":"ContainerDied","Data":"a9e8d6edd96be388c07a9243a3bce483419cf5055c80a43b1b903268c6551d0c"} Oct 14 13:40:03.037260 master-2 kubenswrapper[4762]: I1014 13:40:03.030522 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6880510a-48a6-48f8-b644-4fd24cff01a0","Type":"ContainerDied","Data":"b8ed7429b07aa22e78df15760985b2cdce61bba49580b6d98c5fd63278867083"} Oct 14 13:40:03.563780 master-2 kubenswrapper[4762]: I1014 13:40:03.563707 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:40:03.624233 master-2 kubenswrapper[4762]: I1014 13:40:03.622618 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6880510a-48a6-48f8-b644-4fd24cff01a0-run-httpd\") pod \"6880510a-48a6-48f8-b644-4fd24cff01a0\" (UID: \"6880510a-48a6-48f8-b644-4fd24cff01a0\") " Oct 14 13:40:03.624233 master-2 kubenswrapper[4762]: I1014 13:40:03.622687 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6880510a-48a6-48f8-b644-4fd24cff01a0-combined-ca-bundle\") pod \"6880510a-48a6-48f8-b644-4fd24cff01a0\" (UID: \"6880510a-48a6-48f8-b644-4fd24cff01a0\") " Oct 14 13:40:03.624233 master-2 kubenswrapper[4762]: I1014 13:40:03.622717 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6880510a-48a6-48f8-b644-4fd24cff01a0-log-httpd\") pod \"6880510a-48a6-48f8-b644-4fd24cff01a0\" (UID: \"6880510a-48a6-48f8-b644-4fd24cff01a0\") " Oct 14 13:40:03.624233 master-2 kubenswrapper[4762]: I1014 13:40:03.622822 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6880510a-48a6-48f8-b644-4fd24cff01a0-ceilometer-tls-certs\") pod \"6880510a-48a6-48f8-b644-4fd24cff01a0\" (UID: \"6880510a-48a6-48f8-b644-4fd24cff01a0\") " Oct 14 13:40:03.624233 master-2 kubenswrapper[4762]: I1014 13:40:03.622906 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6880510a-48a6-48f8-b644-4fd24cff01a0-scripts\") pod \"6880510a-48a6-48f8-b644-4fd24cff01a0\" (UID: \"6880510a-48a6-48f8-b644-4fd24cff01a0\") " Oct 14 13:40:03.624233 master-2 kubenswrapper[4762]: I1014 13:40:03.622948 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6880510a-48a6-48f8-b644-4fd24cff01a0-config-data\") pod \"6880510a-48a6-48f8-b644-4fd24cff01a0\" (UID: \"6880510a-48a6-48f8-b644-4fd24cff01a0\") " Oct 14 13:40:03.624233 master-2 kubenswrapper[4762]: I1014 13:40:03.622974 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tr8b\" (UniqueName: \"kubernetes.io/projected/6880510a-48a6-48f8-b644-4fd24cff01a0-kube-api-access-2tr8b\") pod \"6880510a-48a6-48f8-b644-4fd24cff01a0\" (UID: \"6880510a-48a6-48f8-b644-4fd24cff01a0\") " Oct 14 13:40:03.624233 master-2 kubenswrapper[4762]: I1014 13:40:03.623041 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6880510a-48a6-48f8-b644-4fd24cff01a0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6880510a-48a6-48f8-b644-4fd24cff01a0" (UID: "6880510a-48a6-48f8-b644-4fd24cff01a0"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:40:03.624233 master-2 kubenswrapper[4762]: I1014 13:40:03.623091 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6880510a-48a6-48f8-b644-4fd24cff01a0-sg-core-conf-yaml\") pod \"6880510a-48a6-48f8-b644-4fd24cff01a0\" (UID: \"6880510a-48a6-48f8-b644-4fd24cff01a0\") " Oct 14 13:40:03.624233 master-2 kubenswrapper[4762]: I1014 13:40:03.624094 4762 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6880510a-48a6-48f8-b644-4fd24cff01a0-run-httpd\") on node \"master-2\" DevicePath \"\"" Oct 14 13:40:03.625842 master-2 kubenswrapper[4762]: I1014 13:40:03.625791 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6880510a-48a6-48f8-b644-4fd24cff01a0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6880510a-48a6-48f8-b644-4fd24cff01a0" (UID: "6880510a-48a6-48f8-b644-4fd24cff01a0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:40:03.629507 master-2 kubenswrapper[4762]: I1014 13:40:03.627847 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6880510a-48a6-48f8-b644-4fd24cff01a0-scripts" (OuterVolumeSpecName: "scripts") pod "6880510a-48a6-48f8-b644-4fd24cff01a0" (UID: "6880510a-48a6-48f8-b644-4fd24cff01a0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:40:03.629507 master-2 kubenswrapper[4762]: I1014 13:40:03.628977 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6880510a-48a6-48f8-b644-4fd24cff01a0-kube-api-access-2tr8b" (OuterVolumeSpecName: "kube-api-access-2tr8b") pod "6880510a-48a6-48f8-b644-4fd24cff01a0" (UID: "6880510a-48a6-48f8-b644-4fd24cff01a0"). InnerVolumeSpecName "kube-api-access-2tr8b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:40:03.656504 master-2 kubenswrapper[4762]: I1014 13:40:03.655645 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6880510a-48a6-48f8-b644-4fd24cff01a0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6880510a-48a6-48f8-b644-4fd24cff01a0" (UID: "6880510a-48a6-48f8-b644-4fd24cff01a0"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:40:03.681123 master-2 kubenswrapper[4762]: I1014 13:40:03.678827 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6880510a-48a6-48f8-b644-4fd24cff01a0-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "6880510a-48a6-48f8-b644-4fd24cff01a0" (UID: "6880510a-48a6-48f8-b644-4fd24cff01a0"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:40:03.696007 master-2 kubenswrapper[4762]: I1014 13:40:03.695923 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6880510a-48a6-48f8-b644-4fd24cff01a0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6880510a-48a6-48f8-b644-4fd24cff01a0" (UID: "6880510a-48a6-48f8-b644-4fd24cff01a0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:40:03.726859 master-2 kubenswrapper[4762]: I1014 13:40:03.726801 4762 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6880510a-48a6-48f8-b644-4fd24cff01a0-sg-core-conf-yaml\") on node \"master-2\" DevicePath \"\"" Oct 14 13:40:03.726859 master-2 kubenswrapper[4762]: I1014 13:40:03.726843 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6880510a-48a6-48f8-b644-4fd24cff01a0-combined-ca-bundle\") on node \"master-2\" DevicePath \"\"" Oct 14 13:40:03.726859 master-2 kubenswrapper[4762]: I1014 13:40:03.726857 4762 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6880510a-48a6-48f8-b644-4fd24cff01a0-log-httpd\") on node \"master-2\" DevicePath \"\"" Oct 14 13:40:03.726859 master-2 kubenswrapper[4762]: I1014 13:40:03.726868 4762 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6880510a-48a6-48f8-b644-4fd24cff01a0-ceilometer-tls-certs\") on node \"master-2\" DevicePath \"\"" Oct 14 13:40:03.726859 master-2 kubenswrapper[4762]: I1014 13:40:03.726880 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6880510a-48a6-48f8-b644-4fd24cff01a0-scripts\") on node \"master-2\" DevicePath \"\"" Oct 14 13:40:03.727352 master-2 kubenswrapper[4762]: I1014 13:40:03.726895 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tr8b\" (UniqueName: \"kubernetes.io/projected/6880510a-48a6-48f8-b644-4fd24cff01a0-kube-api-access-2tr8b\") on node \"master-2\" DevicePath \"\"" Oct 14 13:40:03.742778 master-2 kubenswrapper[4762]: I1014 13:40:03.742718 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6880510a-48a6-48f8-b644-4fd24cff01a0-config-data" (OuterVolumeSpecName: "config-data") pod "6880510a-48a6-48f8-b644-4fd24cff01a0" (UID: "6880510a-48a6-48f8-b644-4fd24cff01a0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:40:03.829519 master-2 kubenswrapper[4762]: I1014 13:40:03.829443 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6880510a-48a6-48f8-b644-4fd24cff01a0-config-data\") on node \"master-2\" DevicePath \"\"" Oct 14 13:40:04.048097 master-2 kubenswrapper[4762]: I1014 13:40:04.047935 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6880510a-48a6-48f8-b644-4fd24cff01a0","Type":"ContainerDied","Data":"aa7c5cb00bffa5a1bb9817845ed3788312d478f767b1a42e192bf7b416738303"} Oct 14 13:40:04.048097 master-2 kubenswrapper[4762]: I1014 13:40:04.048046 4762 scope.go:117] "RemoveContainer" containerID="1b5c20d17fda4ad3770b4104fae746eb568bd2a6efe718d901b0d3d3809fa651" Oct 14 13:40:04.048346 master-2 kubenswrapper[4762]: I1014 13:40:04.047966 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:40:04.082715 master-2 kubenswrapper[4762]: I1014 13:40:04.081651 4762 scope.go:117] "RemoveContainer" containerID="c6171a284684ba32a220ddade705f6e0c440c7b6ece081b2329d94a38bc6ee15" Oct 14 13:40:04.119088 master-2 kubenswrapper[4762]: I1014 13:40:04.118897 4762 scope.go:117] "RemoveContainer" containerID="a9e8d6edd96be388c07a9243a3bce483419cf5055c80a43b1b903268c6551d0c" Oct 14 13:40:04.140203 master-2 kubenswrapper[4762]: I1014 13:40:04.140149 4762 scope.go:117] "RemoveContainer" containerID="b8ed7429b07aa22e78df15760985b2cdce61bba49580b6d98c5fd63278867083" Oct 14 13:40:04.164969 master-2 kubenswrapper[4762]: I1014 13:40:04.164902 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:40:04.174459 master-2 kubenswrapper[4762]: I1014 13:40:04.173573 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:40:04.210657 master-2 kubenswrapper[4762]: I1014 13:40:04.210598 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:40:04.211007 master-2 kubenswrapper[4762]: E1014 13:40:04.210984 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6880510a-48a6-48f8-b644-4fd24cff01a0" containerName="sg-core" Oct 14 13:40:04.211007 master-2 kubenswrapper[4762]: I1014 13:40:04.211004 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="6880510a-48a6-48f8-b644-4fd24cff01a0" containerName="sg-core" Oct 14 13:40:04.211007 master-2 kubenswrapper[4762]: E1014 13:40:04.211021 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6880510a-48a6-48f8-b644-4fd24cff01a0" containerName="proxy-httpd" Oct 14 13:40:04.211200 master-2 kubenswrapper[4762]: I1014 13:40:04.211028 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="6880510a-48a6-48f8-b644-4fd24cff01a0" containerName="proxy-httpd" Oct 14 13:40:04.211200 master-2 kubenswrapper[4762]: E1014 13:40:04.211040 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25d692e8-e19a-475b-bc4e-f22508073ffa" containerName="nova-manage" Oct 14 13:40:04.211200 master-2 kubenswrapper[4762]: I1014 13:40:04.211046 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="25d692e8-e19a-475b-bc4e-f22508073ffa" containerName="nova-manage" Oct 14 13:40:04.211200 master-2 kubenswrapper[4762]: E1014 13:40:04.211057 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb2b1f8b-432d-4e7d-a538-28068e9e0bb6" containerName="nova-manage" Oct 14 13:40:04.211200 master-2 kubenswrapper[4762]: I1014 13:40:04.211062 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb2b1f8b-432d-4e7d-a538-28068e9e0bb6" containerName="nova-manage" Oct 14 13:40:04.211200 master-2 kubenswrapper[4762]: E1014 13:40:04.211094 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6880510a-48a6-48f8-b644-4fd24cff01a0" containerName="ceilometer-central-agent" Oct 14 13:40:04.211200 master-2 kubenswrapper[4762]: I1014 13:40:04.211100 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="6880510a-48a6-48f8-b644-4fd24cff01a0" containerName="ceilometer-central-agent" Oct 14 13:40:04.211200 master-2 kubenswrapper[4762]: E1014 13:40:04.211116 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6880510a-48a6-48f8-b644-4fd24cff01a0" containerName="ceilometer-notification-agent" Oct 14 13:40:04.211200 master-2 kubenswrapper[4762]: I1014 13:40:04.211122 4762 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6880510a-48a6-48f8-b644-4fd24cff01a0" containerName="ceilometer-notification-agent" Oct 14 13:40:04.211813 master-2 kubenswrapper[4762]: I1014 13:40:04.211250 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb2b1f8b-432d-4e7d-a538-28068e9e0bb6" containerName="nova-manage" Oct 14 13:40:04.211813 master-2 kubenswrapper[4762]: I1014 13:40:04.211269 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="6880510a-48a6-48f8-b644-4fd24cff01a0" containerName="proxy-httpd" Oct 14 13:40:04.211813 master-2 kubenswrapper[4762]: I1014 13:40:04.211280 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="6880510a-48a6-48f8-b644-4fd24cff01a0" containerName="ceilometer-central-agent" Oct 14 13:40:04.211813 master-2 kubenswrapper[4762]: I1014 13:40:04.211291 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="6880510a-48a6-48f8-b644-4fd24cff01a0" containerName="ceilometer-notification-agent" Oct 14 13:40:04.211813 master-2 kubenswrapper[4762]: I1014 13:40:04.211298 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="25d692e8-e19a-475b-bc4e-f22508073ffa" containerName="nova-manage" Oct 14 13:40:04.211813 master-2 kubenswrapper[4762]: I1014 13:40:04.211306 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="6880510a-48a6-48f8-b644-4fd24cff01a0" containerName="sg-core" Oct 14 13:40:04.213692 master-2 kubenswrapper[4762]: I1014 13:40:04.213668 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:40:04.217204 master-2 kubenswrapper[4762]: I1014 13:40:04.217124 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 14 13:40:04.217691 master-2 kubenswrapper[4762]: I1014 13:40:04.217660 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 14 13:40:04.218885 master-2 kubenswrapper[4762]: I1014 13:40:04.218709 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 14 13:40:04.234300 master-2 kubenswrapper[4762]: I1014 13:40:04.232233 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:40:04.239646 master-2 kubenswrapper[4762]: I1014 13:40:04.239561 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0-run-httpd\") pod \"ceilometer-0\" (UID: \"ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0\") " pod="openstack/ceilometer-0" Oct 14 13:40:04.239787 master-2 kubenswrapper[4762]: I1014 13:40:04.239673 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0\") " pod="openstack/ceilometer-0" Oct 14 13:40:04.239787 master-2 kubenswrapper[4762]: I1014 13:40:04.239746 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0-log-httpd\") pod \"ceilometer-0\" (UID: \"ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0\") " pod="openstack/ceilometer-0" Oct 14 13:40:04.239787 master-2 kubenswrapper[4762]: I1014 13:40:04.239778 4762 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0\") " pod="openstack/ceilometer-0" Oct 14 13:40:04.239947 master-2 kubenswrapper[4762]: I1014 13:40:04.239814 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkvlk\" (UniqueName: \"kubernetes.io/projected/ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0-kube-api-access-mkvlk\") pod \"ceilometer-0\" (UID: \"ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0\") " pod="openstack/ceilometer-0" Oct 14 13:40:04.239947 master-2 kubenswrapper[4762]: I1014 13:40:04.239877 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0-scripts\") pod \"ceilometer-0\" (UID: \"ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0\") " pod="openstack/ceilometer-0" Oct 14 13:40:04.240009 master-2 kubenswrapper[4762]: I1014 13:40:04.239983 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0-config-data\") pod \"ceilometer-0\" (UID: \"ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0\") " pod="openstack/ceilometer-0" Oct 14 13:40:04.240096 master-2 kubenswrapper[4762]: I1014 13:40:04.240067 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0\") " pod="openstack/ceilometer-0" Oct 14 13:40:04.342452 master-2 kubenswrapper[4762]: I1014 13:40:04.341891 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0-log-httpd\") pod \"ceilometer-0\" (UID: \"ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0\") " pod="openstack/ceilometer-0" Oct 14 13:40:04.342452 master-2 kubenswrapper[4762]: I1014 13:40:04.341953 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0\") " pod="openstack/ceilometer-0" Oct 14 13:40:04.342452 master-2 kubenswrapper[4762]: I1014 13:40:04.341982 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkvlk\" (UniqueName: \"kubernetes.io/projected/ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0-kube-api-access-mkvlk\") pod \"ceilometer-0\" (UID: \"ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0\") " pod="openstack/ceilometer-0" Oct 14 13:40:04.342452 master-2 kubenswrapper[4762]: I1014 13:40:04.342041 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0-scripts\") pod \"ceilometer-0\" (UID: \"ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0\") " pod="openstack/ceilometer-0" Oct 14 13:40:04.342452 master-2 kubenswrapper[4762]: I1014 13:40:04.342103 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0-config-data\") pod \"ceilometer-0\" (UID: \"ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0\") " pod="openstack/ceilometer-0" Oct 14 13:40:04.342452 master-2 kubenswrapper[4762]: I1014 13:40:04.342175 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0\") " pod="openstack/ceilometer-0" Oct 14 13:40:04.342452 master-2 kubenswrapper[4762]: I1014 13:40:04.342205 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0-run-httpd\") pod \"ceilometer-0\" (UID: \"ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0\") " pod="openstack/ceilometer-0" Oct 14 13:40:04.342452 master-2 kubenswrapper[4762]: I1014 13:40:04.342236 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0\") " pod="openstack/ceilometer-0" Oct 14 13:40:04.343569 master-2 kubenswrapper[4762]: I1014 13:40:04.342772 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0-run-httpd\") pod \"ceilometer-0\" (UID: \"ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0\") " pod="openstack/ceilometer-0" Oct 14 13:40:04.343569 master-2 kubenswrapper[4762]: I1014 13:40:04.343042 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0-log-httpd\") pod \"ceilometer-0\" (UID: \"ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0\") " pod="openstack/ceilometer-0" Oct 14 13:40:04.345510 master-2 kubenswrapper[4762]: I1014 13:40:04.345468 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0-scripts\") pod \"ceilometer-0\" (UID: \"ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0\") " pod="openstack/ceilometer-0" Oct 14 13:40:04.346026 master-2 kubenswrapper[4762]: I1014 13:40:04.345986 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0\") " pod="openstack/ceilometer-0" Oct 14 13:40:04.347080 master-2 kubenswrapper[4762]: I1014 13:40:04.347029 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0\") " pod="openstack/ceilometer-0" Oct 14 13:40:04.349234 master-2 kubenswrapper[4762]: I1014 13:40:04.349178 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0\") " pod="openstack/ceilometer-0" Oct 14 13:40:04.349642 master-2 kubenswrapper[4762]: I1014 13:40:04.349592 4762 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0-config-data\") pod \"ceilometer-0\" (UID: \"ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0\") " pod="openstack/ceilometer-0" Oct 14 13:40:04.363492 master-2 kubenswrapper[4762]: I1014 13:40:04.363420 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkvlk\" (UniqueName: \"kubernetes.io/projected/ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0-kube-api-access-mkvlk\") pod \"ceilometer-0\" (UID: \"ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0\") " pod="openstack/ceilometer-0" Oct 14 13:40:04.551187 master-2 kubenswrapper[4762]: I1014 13:40:04.551075 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:40:05.027678 master-2 kubenswrapper[4762]: W1014 13:40:05.027616 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae5b4b0d_66c2_4e55_b23e_0f29af1a80b0.slice/crio-a600589d404ff53bf340616ae1bab4f94d4f8b89429306510293a23eaa2305ab WatchSource:0}: Error finding container a600589d404ff53bf340616ae1bab4f94d4f8b89429306510293a23eaa2305ab: Status 404 returned error can't find the container with id a600589d404ff53bf340616ae1bab4f94d4f8b89429306510293a23eaa2305ab Oct 14 13:40:05.038033 master-2 kubenswrapper[4762]: I1014 13:40:05.037985 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:40:05.067741 master-2 kubenswrapper[4762]: I1014 13:40:05.067675 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0","Type":"ContainerStarted","Data":"a600589d404ff53bf340616ae1bab4f94d4f8b89429306510293a23eaa2305ab"} Oct 14 13:40:05.562809 master-2 kubenswrapper[4762]: I1014 13:40:05.562733 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6880510a-48a6-48f8-b644-4fd24cff01a0" path="/var/lib/kubelet/pods/6880510a-48a6-48f8-b644-4fd24cff01a0/volumes" Oct 14 13:40:07.109243 master-2 kubenswrapper[4762]: I1014 13:40:07.107082 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0","Type":"ContainerStarted","Data":"acda5e89273912eabb9c1a909e49f0b8b3c272ecfc944377d5ff0358e68140e4"} Oct 14 13:40:07.109243 master-2 kubenswrapper[4762]: I1014 13:40:07.107171 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0","Type":"ContainerStarted","Data":"6e3d974e860c109c425547ff4853218cb06d7569f5a6c0da651dde5092900269"} Oct 14 13:40:07.613071 master-2 kubenswrapper[4762]: I1014 13:40:07.612922 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 14 13:40:07.613405 master-2 kubenswrapper[4762]: I1014 13:40:07.613351 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="f597da64-c1c3-4bf5-88e2-25725c313ea9" containerName="nova-scheduler-scheduler" containerID="cri-o://44e27efc802a186eebf5da6ad5d041db267c8206586b72ca7d331d29b166dac0" gracePeriod=30 Oct 14 13:40:08.121114 master-2 kubenswrapper[4762]: I1014 13:40:08.121048 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0","Type":"ContainerStarted","Data":"1c9a03b3d5eb6bb246dd2ebba4e1791b0d240244273ae772094859e45767a627"} 
Oct 14 13:40:10.145431 master-2 kubenswrapper[4762]: I1014 13:40:10.145357 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0","Type":"ContainerStarted","Data":"ead9443357c1544acf6c79eb7feb6b791687ab5e9969c2ed98b431ec88f2792e"} Oct 14 13:40:10.146863 master-2 kubenswrapper[4762]: I1014 13:40:10.146831 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 14 13:40:10.190387 master-2 kubenswrapper[4762]: I1014 13:40:10.190275 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.830623107 podStartE2EDuration="6.190252494s" podCreationTimestamp="2025-10-14 13:40:04 +0000 UTC" firstStartedPulling="2025-10-14 13:40:05.031652519 +0000 UTC m=+2034.275811678" lastFinishedPulling="2025-10-14 13:40:09.391281876 +0000 UTC m=+2038.635441065" observedRunningTime="2025-10-14 13:40:10.186800817 +0000 UTC m=+2039.430959976" watchObservedRunningTime="2025-10-14 13:40:10.190252494 +0000 UTC m=+2039.434411653" Oct 14 13:40:10.815608 master-2 kubenswrapper[4762]: I1014 13:40:10.815563 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 14 13:40:10.891586 master-2 kubenswrapper[4762]: I1014 13:40:10.891489 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vthjz\" (UniqueName: \"kubernetes.io/projected/f597da64-c1c3-4bf5-88e2-25725c313ea9-kube-api-access-vthjz\") pod \"f597da64-c1c3-4bf5-88e2-25725c313ea9\" (UID: \"f597da64-c1c3-4bf5-88e2-25725c313ea9\") " Oct 14 13:40:10.891815 master-2 kubenswrapper[4762]: I1014 13:40:10.891705 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f597da64-c1c3-4bf5-88e2-25725c313ea9-config-data\") pod \"f597da64-c1c3-4bf5-88e2-25725c313ea9\" (UID: \"f597da64-c1c3-4bf5-88e2-25725c313ea9\") " Oct 14 13:40:10.891815 master-2 kubenswrapper[4762]: I1014 13:40:10.891793 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f597da64-c1c3-4bf5-88e2-25725c313ea9-combined-ca-bundle\") pod \"f597da64-c1c3-4bf5-88e2-25725c313ea9\" (UID: \"f597da64-c1c3-4bf5-88e2-25725c313ea9\") " Oct 14 13:40:10.895678 master-2 kubenswrapper[4762]: I1014 13:40:10.895620 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f597da64-c1c3-4bf5-88e2-25725c313ea9-kube-api-access-vthjz" (OuterVolumeSpecName: "kube-api-access-vthjz") pod "f597da64-c1c3-4bf5-88e2-25725c313ea9" (UID: "f597da64-c1c3-4bf5-88e2-25725c313ea9"). InnerVolumeSpecName "kube-api-access-vthjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:40:10.919049 master-2 kubenswrapper[4762]: I1014 13:40:10.918975 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f597da64-c1c3-4bf5-88e2-25725c313ea9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f597da64-c1c3-4bf5-88e2-25725c313ea9" (UID: "f597da64-c1c3-4bf5-88e2-25725c313ea9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:40:10.930536 master-2 kubenswrapper[4762]: I1014 13:40:10.930447 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f597da64-c1c3-4bf5-88e2-25725c313ea9-config-data" (OuterVolumeSpecName: "config-data") pod "f597da64-c1c3-4bf5-88e2-25725c313ea9" (UID: "f597da64-c1c3-4bf5-88e2-25725c313ea9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:40:10.995534 master-2 kubenswrapper[4762]: I1014 13:40:10.995275 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f597da64-c1c3-4bf5-88e2-25725c313ea9-config-data\") on node \"master-2\" DevicePath \"\"" Oct 14 13:40:10.995534 master-2 kubenswrapper[4762]: I1014 13:40:10.995337 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f597da64-c1c3-4bf5-88e2-25725c313ea9-combined-ca-bundle\") on node \"master-2\" DevicePath \"\"" Oct 14 13:40:10.995534 master-2 kubenswrapper[4762]: I1014 13:40:10.995361 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vthjz\" (UniqueName: \"kubernetes.io/projected/f597da64-c1c3-4bf5-88e2-25725c313ea9-kube-api-access-vthjz\") on node \"master-2\" DevicePath \"\"" Oct 14 13:40:11.157812 master-2 kubenswrapper[4762]: I1014 13:40:11.157732 4762 generic.go:334] "Generic (PLEG): container finished" podID="f597da64-c1c3-4bf5-88e2-25725c313ea9" containerID="44e27efc802a186eebf5da6ad5d041db267c8206586b72ca7d331d29b166dac0" exitCode=0 Oct 14 13:40:11.157812 master-2 kubenswrapper[4762]: I1014 13:40:11.157796 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 14 13:40:11.158618 master-2 kubenswrapper[4762]: I1014 13:40:11.157837 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f597da64-c1c3-4bf5-88e2-25725c313ea9","Type":"ContainerDied","Data":"44e27efc802a186eebf5da6ad5d041db267c8206586b72ca7d331d29b166dac0"} Oct 14 13:40:11.158618 master-2 kubenswrapper[4762]: I1014 13:40:11.157906 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f597da64-c1c3-4bf5-88e2-25725c313ea9","Type":"ContainerDied","Data":"9863b959117c17338c1951708201e287fa004a910d5bc2c209fef32ce40f7740"} Oct 14 13:40:11.158618 master-2 kubenswrapper[4762]: I1014 13:40:11.157929 4762 scope.go:117] "RemoveContainer" containerID="44e27efc802a186eebf5da6ad5d041db267c8206586b72ca7d331d29b166dac0" Oct 14 13:40:11.200731 master-2 kubenswrapper[4762]: I1014 13:40:11.199683 4762 scope.go:117] "RemoveContainer" containerID="44e27efc802a186eebf5da6ad5d041db267c8206586b72ca7d331d29b166dac0" Oct 14 13:40:11.200731 master-2 kubenswrapper[4762]: E1014 13:40:11.200288 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44e27efc802a186eebf5da6ad5d041db267c8206586b72ca7d331d29b166dac0\": container with ID starting with 44e27efc802a186eebf5da6ad5d041db267c8206586b72ca7d331d29b166dac0 not found: ID does not exist" containerID="44e27efc802a186eebf5da6ad5d041db267c8206586b72ca7d331d29b166dac0" Oct 14 13:40:11.200731 master-2 kubenswrapper[4762]: I1014 13:40:11.200382 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44e27efc802a186eebf5da6ad5d041db267c8206586b72ca7d331d29b166dac0"} err="failed 
to get container status \"44e27efc802a186eebf5da6ad5d041db267c8206586b72ca7d331d29b166dac0\": rpc error: code = NotFound desc = could not find container \"44e27efc802a186eebf5da6ad5d041db267c8206586b72ca7d331d29b166dac0\": container with ID starting with 44e27efc802a186eebf5da6ad5d041db267c8206586b72ca7d331d29b166dac0 not found: ID does not exist" Oct 14 13:40:11.229286 master-2 kubenswrapper[4762]: I1014 13:40:11.229186 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 14 13:40:11.236450 master-2 kubenswrapper[4762]: I1014 13:40:11.236377 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 14 13:40:11.260632 master-2 kubenswrapper[4762]: I1014 13:40:11.260451 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 14 13:40:11.260956 master-2 kubenswrapper[4762]: E1014 13:40:11.260932 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f597da64-c1c3-4bf5-88e2-25725c313ea9" containerName="nova-scheduler-scheduler" Oct 14 13:40:11.260956 master-2 kubenswrapper[4762]: I1014 13:40:11.260951 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="f597da64-c1c3-4bf5-88e2-25725c313ea9" containerName="nova-scheduler-scheduler" Oct 14 13:40:11.261409 master-2 kubenswrapper[4762]: I1014 13:40:11.261379 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="f597da64-c1c3-4bf5-88e2-25725c313ea9" containerName="nova-scheduler-scheduler" Oct 14 13:40:11.262363 master-2 kubenswrapper[4762]: I1014 13:40:11.262308 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 14 13:40:11.265766 master-2 kubenswrapper[4762]: I1014 13:40:11.265734 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 14 13:40:11.278562 master-2 kubenswrapper[4762]: I1014 13:40:11.278500 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 14 13:40:11.403496 master-2 kubenswrapper[4762]: I1014 13:40:11.403361 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/546d9f4c-5266-474b-abb2-dfcd7a8853ac-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"546d9f4c-5266-474b-abb2-dfcd7a8853ac\") " pod="openstack/nova-scheduler-0" Oct 14 13:40:11.403743 master-2 kubenswrapper[4762]: I1014 13:40:11.403655 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tgjx\" (UniqueName: \"kubernetes.io/projected/546d9f4c-5266-474b-abb2-dfcd7a8853ac-kube-api-access-2tgjx\") pod \"nova-scheduler-0\" (UID: \"546d9f4c-5266-474b-abb2-dfcd7a8853ac\") " pod="openstack/nova-scheduler-0" Oct 14 13:40:11.403743 master-2 kubenswrapper[4762]: I1014 13:40:11.403724 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/546d9f4c-5266-474b-abb2-dfcd7a8853ac-config-data\") pod \"nova-scheduler-0\" (UID: \"546d9f4c-5266-474b-abb2-dfcd7a8853ac\") " pod="openstack/nova-scheduler-0" Oct 14 13:40:11.505348 master-2 kubenswrapper[4762]: I1014 13:40:11.505277 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/546d9f4c-5266-474b-abb2-dfcd7a8853ac-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: 
\"546d9f4c-5266-474b-abb2-dfcd7a8853ac\") " pod="openstack/nova-scheduler-0" Oct 14 13:40:11.505563 master-2 kubenswrapper[4762]: I1014 13:40:11.505428 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tgjx\" (UniqueName: \"kubernetes.io/projected/546d9f4c-5266-474b-abb2-dfcd7a8853ac-kube-api-access-2tgjx\") pod \"nova-scheduler-0\" (UID: \"546d9f4c-5266-474b-abb2-dfcd7a8853ac\") " pod="openstack/nova-scheduler-0" Oct 14 13:40:11.505563 master-2 kubenswrapper[4762]: I1014 13:40:11.505465 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/546d9f4c-5266-474b-abb2-dfcd7a8853ac-config-data\") pod \"nova-scheduler-0\" (UID: \"546d9f4c-5266-474b-abb2-dfcd7a8853ac\") " pod="openstack/nova-scheduler-0" Oct 14 13:40:11.516032 master-2 kubenswrapper[4762]: I1014 13:40:11.510250 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/546d9f4c-5266-474b-abb2-dfcd7a8853ac-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"546d9f4c-5266-474b-abb2-dfcd7a8853ac\") " pod="openstack/nova-scheduler-0" Oct 14 13:40:11.516032 master-2 kubenswrapper[4762]: I1014 13:40:11.510414 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 14 13:40:11.519902 master-2 kubenswrapper[4762]: I1014 13:40:11.519836 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/546d9f4c-5266-474b-abb2-dfcd7a8853ac-config-data\") pod \"nova-scheduler-0\" (UID: \"546d9f4c-5266-474b-abb2-dfcd7a8853ac\") " pod="openstack/nova-scheduler-0" Oct 14 13:40:11.538258 master-2 kubenswrapper[4762]: I1014 13:40:11.538149 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tgjx\" (UniqueName: \"kubernetes.io/projected/546d9f4c-5266-474b-abb2-dfcd7a8853ac-kube-api-access-2tgjx\") pod \"nova-scheduler-0\" (UID: \"546d9f4c-5266-474b-abb2-dfcd7a8853ac\") " pod="openstack/nova-scheduler-0" Oct 14 13:40:11.562379 master-2 kubenswrapper[4762]: I1014 13:40:11.562324 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f597da64-c1c3-4bf5-88e2-25725c313ea9" path="/var/lib/kubelet/pods/f597da64-c1c3-4bf5-88e2-25725c313ea9/volumes" Oct 14 13:40:11.584135 master-2 kubenswrapper[4762]: I1014 13:40:11.584066 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 14 13:40:12.079959 master-2 kubenswrapper[4762]: I1014 13:40:12.079879 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 14 13:40:12.169649 master-2 kubenswrapper[4762]: I1014 13:40:12.169561 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"546d9f4c-5266-474b-abb2-dfcd7a8853ac","Type":"ContainerStarted","Data":"3959c24d689493fa68cd3bae9f242aa78b9e0261f260131a014d2b3e523b0f05"} Oct 14 13:40:13.182926 master-2 kubenswrapper[4762]: I1014 13:40:13.182829 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"546d9f4c-5266-474b-abb2-dfcd7a8853ac","Type":"ContainerStarted","Data":"9e391cc2448fe482b1ad7527163ed52b2d3c79677fd2502805bb05c4c1114597"} Oct 14 13:40:13.237148 master-2 kubenswrapper[4762]: I1014 13:40:13.237015 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.23698897 podStartE2EDuration="2.23698897s" podCreationTimestamp="2025-10-14 13:40:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:40:13.216672816 +0000 UTC m=+2042.460832045" watchObservedRunningTime="2025-10-14 13:40:13.23698897 +0000 UTC m=+2042.481148139" Oct 14 13:40:16.584575 master-2 kubenswrapper[4762]: I1014 13:40:16.584495 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 14 13:40:21.585299 master-2 kubenswrapper[4762]: I1014 13:40:21.585104 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 14 13:40:21.616080 master-2 kubenswrapper[4762]: I1014 13:40:21.615429 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 14 13:40:22.325397 master-2 kubenswrapper[4762]: I1014 13:40:22.325337 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 14 13:40:25.777299 master-2 kubenswrapper[4762]: I1014 13:40:25.777205 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 14 13:40:25.778103 master-2 kubenswrapper[4762]: I1014 13:40:25.777459 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0add25e0-b2b4-43c3-8c55-723601ab9432" containerName="nova-metadata-log" containerID="cri-o://fb20d2acb4e9717ddb43b5dd802458df143a653b646009ed3e2a84064763486f" gracePeriod=30 Oct 14 13:40:25.778103 master-2 kubenswrapper[4762]: I1014 13:40:25.777875 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0add25e0-b2b4-43c3-8c55-723601ab9432" containerName="nova-metadata-metadata" containerID="cri-o://10213f95de1e9c66e9f277911356ed1f4f72178c0f58354fce1f6bd9357573e2" gracePeriod=30 Oct 14 13:40:26.336888 master-2 kubenswrapper[4762]: I1014 13:40:26.336799 4762 generic.go:334] "Generic (PLEG): container finished" podID="0add25e0-b2b4-43c3-8c55-723601ab9432" containerID="fb20d2acb4e9717ddb43b5dd802458df143a653b646009ed3e2a84064763486f" exitCode=143 Oct 14 13:40:26.336888 master-2 kubenswrapper[4762]: I1014 13:40:26.336886 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"0add25e0-b2b4-43c3-8c55-723601ab9432","Type":"ContainerDied","Data":"fb20d2acb4e9717ddb43b5dd802458df143a653b646009ed3e2a84064763486f"} Oct 14 13:40:26.374120 master-2 kubenswrapper[4762]: I1014 13:40:26.374050 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-create-nr9l7"] Oct 14 13:40:26.375929 master-2 kubenswrapper[4762]: I1014 13:40:26.375907 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-nr9l7" Oct 14 13:40:26.392999 master-2 kubenswrapper[4762]: I1014 13:40:26.392910 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-nr9l7"] Oct 14 13:40:26.461040 master-2 kubenswrapper[4762]: I1014 13:40:26.460958 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7txc\" (UniqueName: \"kubernetes.io/projected/eed550b4-d873-4c4d-888b-393c8198e192-kube-api-access-w7txc\") pod \"octavia-db-create-nr9l7\" (UID: \"eed550b4-d873-4c4d-888b-393c8198e192\") " pod="openstack/octavia-db-create-nr9l7" Oct 14 13:40:26.475717 master-2 kubenswrapper[4762]: I1014 13:40:26.475621 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 14 13:40:26.476035 master-2 kubenswrapper[4762]: I1014 13:40:26.475990 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7a37f47c-bdf2-4fc0-89a6-622a690a7a41" containerName="nova-api-log" containerID="cri-o://962acd3ff0b6ad48e4b9d2b9df75fe1ea84024e2516befa75202ec1546a88f25" gracePeriod=30 Oct 14 13:40:26.476249 master-2 kubenswrapper[4762]: I1014 13:40:26.476107 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7a37f47c-bdf2-4fc0-89a6-622a690a7a41" containerName="nova-api-api" containerID="cri-o://fd38518a8d913bd9973e5fad2f3a437b01d5f23e0b0f1fffbdd8b8b2c14b7876" gracePeriod=30 Oct 14 13:40:26.562117 master-2 kubenswrapper[4762]: I1014 13:40:26.562034 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7txc\" (UniqueName: \"kubernetes.io/projected/eed550b4-d873-4c4d-888b-393c8198e192-kube-api-access-w7txc\") pod \"octavia-db-create-nr9l7\" (UID: \"eed550b4-d873-4c4d-888b-393c8198e192\") " pod="openstack/octavia-db-create-nr9l7" Oct 14 13:40:26.584571 master-2 kubenswrapper[4762]: I1014 13:40:26.584494 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7txc\" (UniqueName: \"kubernetes.io/projected/eed550b4-d873-4c4d-888b-393c8198e192-kube-api-access-w7txc\") pod \"octavia-db-create-nr9l7\" (UID: \"eed550b4-d873-4c4d-888b-393c8198e192\") " pod="openstack/octavia-db-create-nr9l7" Oct 14 13:40:26.696271 master-2 kubenswrapper[4762]: I1014 13:40:26.696201 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-create-nr9l7" Oct 14 13:40:27.180524 master-2 kubenswrapper[4762]: I1014 13:40:27.180424 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-nr9l7"] Oct 14 13:40:27.348047 master-2 kubenswrapper[4762]: I1014 13:40:27.348003 4762 generic.go:334] "Generic (PLEG): container finished" podID="7a37f47c-bdf2-4fc0-89a6-622a690a7a41" containerID="962acd3ff0b6ad48e4b9d2b9df75fe1ea84024e2516befa75202ec1546a88f25" exitCode=143 Oct 14 13:40:27.348252 master-2 kubenswrapper[4762]: I1014 13:40:27.348087 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7a37f47c-bdf2-4fc0-89a6-622a690a7a41","Type":"ContainerDied","Data":"962acd3ff0b6ad48e4b9d2b9df75fe1ea84024e2516befa75202ec1546a88f25"} Oct 14 13:40:27.349598 master-2 kubenswrapper[4762]: I1014 13:40:27.349528 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-nr9l7" event={"ID":"eed550b4-d873-4c4d-888b-393c8198e192","Type":"ContainerStarted","Data":"f11ba692eac9f97162483e6c3eedb5ee57ceec47960a7086767539e8c83743a7"} Oct 14 13:40:28.363582 master-2 kubenswrapper[4762]: I1014 13:40:28.363519 4762 generic.go:334] "Generic (PLEG): container finished" podID="eed550b4-d873-4c4d-888b-393c8198e192" containerID="d766502aef6374b3a21849b1f44b7342550719c76e537ea977011993eb5d3379" exitCode=0 Oct 14 13:40:28.364133 master-2 kubenswrapper[4762]: I1014 13:40:28.363661 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-nr9l7" event={"ID":"eed550b4-d873-4c4d-888b-393c8198e192","Type":"ContainerDied","Data":"d766502aef6374b3a21849b1f44b7342550719c76e537ea977011993eb5d3379"} Oct 14 13:40:29.375413 master-2 kubenswrapper[4762]: I1014 13:40:29.375338 4762 generic.go:334] "Generic (PLEG): container finished" podID="0add25e0-b2b4-43c3-8c55-723601ab9432" containerID="10213f95de1e9c66e9f277911356ed1f4f72178c0f58354fce1f6bd9357573e2" exitCode=0 Oct 14 13:40:29.376939 master-2 kubenswrapper[4762]: I1014 13:40:29.375444 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0add25e0-b2b4-43c3-8c55-723601ab9432","Type":"ContainerDied","Data":"10213f95de1e9c66e9f277911356ed1f4f72178c0f58354fce1f6bd9357573e2"} Oct 14 13:40:29.376939 master-2 kubenswrapper[4762]: I1014 13:40:29.375516 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0add25e0-b2b4-43c3-8c55-723601ab9432","Type":"ContainerDied","Data":"c22a4b42454ca03bb37802d6072a818bf06fad5410ccbf2a9b52d0e156af41c4"} Oct 14 13:40:29.376939 master-2 kubenswrapper[4762]: I1014 13:40:29.375531 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c22a4b42454ca03bb37802d6072a818bf06fad5410ccbf2a9b52d0e156af41c4" Oct 14 13:40:29.419865 master-2 kubenswrapper[4762]: I1014 13:40:29.419795 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 14 13:40:29.436677 master-2 kubenswrapper[4762]: I1014 13:40:29.436386 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0add25e0-b2b4-43c3-8c55-723601ab9432-nova-metadata-tls-certs\") pod \"0add25e0-b2b4-43c3-8c55-723601ab9432\" (UID: \"0add25e0-b2b4-43c3-8c55-723601ab9432\") " Oct 14 13:40:29.436677 master-2 kubenswrapper[4762]: I1014 13:40:29.436495 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0add25e0-b2b4-43c3-8c55-723601ab9432-logs\") pod \"0add25e0-b2b4-43c3-8c55-723601ab9432\" (UID: \"0add25e0-b2b4-43c3-8c55-723601ab9432\") " Oct 14 13:40:29.436677 master-2 kubenswrapper[4762]: I1014 13:40:29.436612 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0add25e0-b2b4-43c3-8c55-723601ab9432-config-data\") pod \"0add25e0-b2b4-43c3-8c55-723601ab9432\" (UID: \"0add25e0-b2b4-43c3-8c55-723601ab9432\") " Oct 14 13:40:29.436677 master-2 kubenswrapper[4762]: I1014 13:40:29.436674 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0add25e0-b2b4-43c3-8c55-723601ab9432-combined-ca-bundle\") pod \"0add25e0-b2b4-43c3-8c55-723601ab9432\" (UID: \"0add25e0-b2b4-43c3-8c55-723601ab9432\") " Oct 14 13:40:29.437074 master-2 kubenswrapper[4762]: I1014 13:40:29.436824 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlv5z\" (UniqueName: \"kubernetes.io/projected/0add25e0-b2b4-43c3-8c55-723601ab9432-kube-api-access-hlv5z\") pod \"0add25e0-b2b4-43c3-8c55-723601ab9432\" (UID: \"0add25e0-b2b4-43c3-8c55-723601ab9432\") " Oct 14 13:40:29.437259 master-2 kubenswrapper[4762]: I1014 13:40:29.437214 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0add25e0-b2b4-43c3-8c55-723601ab9432-logs" (OuterVolumeSpecName: "logs") pod "0add25e0-b2b4-43c3-8c55-723601ab9432" (UID: "0add25e0-b2b4-43c3-8c55-723601ab9432"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:40:29.441250 master-2 kubenswrapper[4762]: I1014 13:40:29.441090 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0add25e0-b2b4-43c3-8c55-723601ab9432-kube-api-access-hlv5z" (OuterVolumeSpecName: "kube-api-access-hlv5z") pod "0add25e0-b2b4-43c3-8c55-723601ab9432" (UID: "0add25e0-b2b4-43c3-8c55-723601ab9432"). InnerVolumeSpecName "kube-api-access-hlv5z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:40:29.474283 master-2 kubenswrapper[4762]: I1014 13:40:29.469745 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0add25e0-b2b4-43c3-8c55-723601ab9432-config-data" (OuterVolumeSpecName: "config-data") pod "0add25e0-b2b4-43c3-8c55-723601ab9432" (UID: "0add25e0-b2b4-43c3-8c55-723601ab9432"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:40:29.499987 master-2 kubenswrapper[4762]: I1014 13:40:29.494869 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0add25e0-b2b4-43c3-8c55-723601ab9432-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0add25e0-b2b4-43c3-8c55-723601ab9432" (UID: "0add25e0-b2b4-43c3-8c55-723601ab9432"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:40:29.499987 master-2 kubenswrapper[4762]: I1014 13:40:29.499335 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0add25e0-b2b4-43c3-8c55-723601ab9432-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "0add25e0-b2b4-43c3-8c55-723601ab9432" (UID: "0add25e0-b2b4-43c3-8c55-723601ab9432"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:40:29.539328 master-2 kubenswrapper[4762]: I1014 13:40:29.539261 4762 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0add25e0-b2b4-43c3-8c55-723601ab9432-nova-metadata-tls-certs\") on node \"master-2\" DevicePath \"\"" Oct 14 13:40:29.539328 master-2 kubenswrapper[4762]: I1014 13:40:29.539309 4762 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0add25e0-b2b4-43c3-8c55-723601ab9432-logs\") on node \"master-2\" DevicePath \"\"" Oct 14 13:40:29.539328 master-2 kubenswrapper[4762]: I1014 13:40:29.539323 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0add25e0-b2b4-43c3-8c55-723601ab9432-config-data\") on node \"master-2\" DevicePath \"\"" Oct 14 13:40:29.539328 master-2 kubenswrapper[4762]: I1014 13:40:29.539337 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0add25e0-b2b4-43c3-8c55-723601ab9432-combined-ca-bundle\") on node \"master-2\" DevicePath \"\"" Oct 14 13:40:29.539328 master-2 kubenswrapper[4762]: I1014 13:40:29.539347 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlv5z\" (UniqueName: \"kubernetes.io/projected/0add25e0-b2b4-43c3-8c55-723601ab9432-kube-api-access-hlv5z\") on node \"master-2\" DevicePath \"\"" Oct 14 13:40:29.724718 master-2 kubenswrapper[4762]: I1014 13:40:29.724682 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-nr9l7" Oct 14 13:40:29.743547 master-2 kubenswrapper[4762]: I1014 13:40:29.743460 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7txc\" (UniqueName: \"kubernetes.io/projected/eed550b4-d873-4c4d-888b-393c8198e192-kube-api-access-w7txc\") pod \"eed550b4-d873-4c4d-888b-393c8198e192\" (UID: \"eed550b4-d873-4c4d-888b-393c8198e192\") " Oct 14 13:40:29.747736 master-2 kubenswrapper[4762]: I1014 13:40:29.747674 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eed550b4-d873-4c4d-888b-393c8198e192-kube-api-access-w7txc" (OuterVolumeSpecName: "kube-api-access-w7txc") pod "eed550b4-d873-4c4d-888b-393c8198e192" (UID: "eed550b4-d873-4c4d-888b-393c8198e192"). InnerVolumeSpecName "kube-api-access-w7txc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:40:29.847286 master-2 kubenswrapper[4762]: I1014 13:40:29.847200 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7txc\" (UniqueName: \"kubernetes.io/projected/eed550b4-d873-4c4d-888b-393c8198e192-kube-api-access-w7txc\") on node \"master-2\" DevicePath \"\"" Oct 14 13:40:30.080199 master-2 kubenswrapper[4762]: I1014 13:40:30.080114 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 14 13:40:30.164606 master-2 kubenswrapper[4762]: I1014 13:40:30.164538 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xv8kj\" (UniqueName: \"kubernetes.io/projected/7a37f47c-bdf2-4fc0-89a6-622a690a7a41-kube-api-access-xv8kj\") pod \"7a37f47c-bdf2-4fc0-89a6-622a690a7a41\" (UID: \"7a37f47c-bdf2-4fc0-89a6-622a690a7a41\") " Oct 14 13:40:30.165124 master-2 kubenswrapper[4762]: I1014 13:40:30.165097 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a37f47c-bdf2-4fc0-89a6-622a690a7a41-config-data\") pod \"7a37f47c-bdf2-4fc0-89a6-622a690a7a41\" (UID: \"7a37f47c-bdf2-4fc0-89a6-622a690a7a41\") " Oct 14 13:40:30.165297 master-2 kubenswrapper[4762]: I1014 13:40:30.165277 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a37f47c-bdf2-4fc0-89a6-622a690a7a41-combined-ca-bundle\") pod \"7a37f47c-bdf2-4fc0-89a6-622a690a7a41\" (UID: \"7a37f47c-bdf2-4fc0-89a6-622a690a7a41\") " Oct 14 13:40:30.165523 master-2 kubenswrapper[4762]: I1014 13:40:30.165501 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a37f47c-bdf2-4fc0-89a6-622a690a7a41-logs\") pod \"7a37f47c-bdf2-4fc0-89a6-622a690a7a41\" (UID: \"7a37f47c-bdf2-4fc0-89a6-622a690a7a41\") " Oct 14 13:40:30.167877 master-2 kubenswrapper[4762]: I1014 13:40:30.167848 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a37f47c-bdf2-4fc0-89a6-622a690a7a41-logs" (OuterVolumeSpecName: "logs") pod "7a37f47c-bdf2-4fc0-89a6-622a690a7a41" (UID: "7a37f47c-bdf2-4fc0-89a6-622a690a7a41"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:40:30.169020 master-2 kubenswrapper[4762]: I1014 13:40:30.168924 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a37f47c-bdf2-4fc0-89a6-622a690a7a41-kube-api-access-xv8kj" (OuterVolumeSpecName: "kube-api-access-xv8kj") pod "7a37f47c-bdf2-4fc0-89a6-622a690a7a41" (UID: "7a37f47c-bdf2-4fc0-89a6-622a690a7a41"). InnerVolumeSpecName "kube-api-access-xv8kj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:40:30.191071 master-2 kubenswrapper[4762]: I1014 13:40:30.191007 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a37f47c-bdf2-4fc0-89a6-622a690a7a41-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a37f47c-bdf2-4fc0-89a6-622a690a7a41" (UID: "7a37f47c-bdf2-4fc0-89a6-622a690a7a41"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:40:30.191648 master-2 kubenswrapper[4762]: I1014 13:40:30.191607 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a37f47c-bdf2-4fc0-89a6-622a690a7a41-config-data" (OuterVolumeSpecName: "config-data") pod "7a37f47c-bdf2-4fc0-89a6-622a690a7a41" (UID: "7a37f47c-bdf2-4fc0-89a6-622a690a7a41"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:40:30.268561 master-2 kubenswrapper[4762]: I1014 13:40:30.268465 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a37f47c-bdf2-4fc0-89a6-622a690a7a41-config-data\") on node \"master-2\" DevicePath \"\"" Oct 14 13:40:30.268561 master-2 kubenswrapper[4762]: I1014 13:40:30.268506 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a37f47c-bdf2-4fc0-89a6-622a690a7a41-combined-ca-bundle\") on node \"master-2\" DevicePath \"\"" Oct 14 13:40:30.268561 master-2 kubenswrapper[4762]: I1014 13:40:30.268516 4762 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a37f47c-bdf2-4fc0-89a6-622a690a7a41-logs\") on node \"master-2\" DevicePath \"\"" Oct 14 13:40:30.268561 master-2 kubenswrapper[4762]: I1014 13:40:30.268526 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xv8kj\" (UniqueName: \"kubernetes.io/projected/7a37f47c-bdf2-4fc0-89a6-622a690a7a41-kube-api-access-xv8kj\") on node \"master-2\" DevicePath \"\"" Oct 14 13:40:30.390116 master-2 kubenswrapper[4762]: I1014 13:40:30.389914 4762 generic.go:334] "Generic (PLEG): container finished" podID="7a37f47c-bdf2-4fc0-89a6-622a690a7a41" containerID="fd38518a8d913bd9973e5fad2f3a437b01d5f23e0b0f1fffbdd8b8b2c14b7876" exitCode=0 Oct 14 13:40:30.390116 master-2 kubenswrapper[4762]: I1014 13:40:30.390027 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 14 13:40:30.391120 master-2 kubenswrapper[4762]: I1014 13:40:30.390028 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7a37f47c-bdf2-4fc0-89a6-622a690a7a41","Type":"ContainerDied","Data":"fd38518a8d913bd9973e5fad2f3a437b01d5f23e0b0f1fffbdd8b8b2c14b7876"} Oct 14 13:40:30.391120 master-2 kubenswrapper[4762]: I1014 13:40:30.390202 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7a37f47c-bdf2-4fc0-89a6-622a690a7a41","Type":"ContainerDied","Data":"d5de9068b3772adbdeee5272078590424cbc4f6bdbaebb5f7ed07df02ee4baf6"} Oct 14 13:40:30.391120 master-2 kubenswrapper[4762]: I1014 13:40:30.390233 4762 scope.go:117] "RemoveContainer" containerID="fd38518a8d913bd9973e5fad2f3a437b01d5f23e0b0f1fffbdd8b8b2c14b7876" Oct 14 13:40:30.393621 master-2 kubenswrapper[4762]: I1014 13:40:30.393348 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-nr9l7" event={"ID":"eed550b4-d873-4c4d-888b-393c8198e192","Type":"ContainerDied","Data":"f11ba692eac9f97162483e6c3eedb5ee57ceec47960a7086767539e8c83743a7"} Oct 14 13:40:30.393621 master-2 kubenswrapper[4762]: I1014 13:40:30.393398 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f11ba692eac9f97162483e6c3eedb5ee57ceec47960a7086767539e8c83743a7" Oct 14 13:40:30.393621 master-2 kubenswrapper[4762]: I1014 13:40:30.393429 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 14 13:40:30.393621 master-2 kubenswrapper[4762]: I1014 13:40:30.393471 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-nr9l7" Oct 14 13:40:30.421835 master-2 kubenswrapper[4762]: I1014 13:40:30.421774 4762 scope.go:117] "RemoveContainer" containerID="962acd3ff0b6ad48e4b9d2b9df75fe1ea84024e2516befa75202ec1546a88f25" Oct 14 13:40:30.439058 master-2 kubenswrapper[4762]: I1014 13:40:30.438784 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 14 13:40:30.443864 master-2 kubenswrapper[4762]: I1014 13:40:30.443811 4762 scope.go:117] "RemoveContainer" containerID="fd38518a8d913bd9973e5fad2f3a437b01d5f23e0b0f1fffbdd8b8b2c14b7876" Oct 14 13:40:30.444337 master-2 kubenswrapper[4762]: E1014 13:40:30.444263 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd38518a8d913bd9973e5fad2f3a437b01d5f23e0b0f1fffbdd8b8b2c14b7876\": container with ID starting with fd38518a8d913bd9973e5fad2f3a437b01d5f23e0b0f1fffbdd8b8b2c14b7876 not found: ID does not exist" containerID="fd38518a8d913bd9973e5fad2f3a437b01d5f23e0b0f1fffbdd8b8b2c14b7876" Oct 14 13:40:30.444337 master-2 kubenswrapper[4762]: I1014 13:40:30.444321 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd38518a8d913bd9973e5fad2f3a437b01d5f23e0b0f1fffbdd8b8b2c14b7876"} err="failed to get container status \"fd38518a8d913bd9973e5fad2f3a437b01d5f23e0b0f1fffbdd8b8b2c14b7876\": rpc error: code = NotFound desc = could not find container \"fd38518a8d913bd9973e5fad2f3a437b01d5f23e0b0f1fffbdd8b8b2c14b7876\": container with ID starting with fd38518a8d913bd9973e5fad2f3a437b01d5f23e0b0f1fffbdd8b8b2c14b7876 not found: ID does not exist" Oct 14 13:40:30.444630 master-2 kubenswrapper[4762]: I1014 13:40:30.444356 4762 scope.go:117] "RemoveContainer" 
containerID="962acd3ff0b6ad48e4b9d2b9df75fe1ea84024e2516befa75202ec1546a88f25" Oct 14 13:40:30.444870 master-2 kubenswrapper[4762]: E1014 13:40:30.444771 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"962acd3ff0b6ad48e4b9d2b9df75fe1ea84024e2516befa75202ec1546a88f25\": container with ID starting with 962acd3ff0b6ad48e4b9d2b9df75fe1ea84024e2516befa75202ec1546a88f25 not found: ID does not exist" containerID="962acd3ff0b6ad48e4b9d2b9df75fe1ea84024e2516befa75202ec1546a88f25" Oct 14 13:40:30.444870 master-2 kubenswrapper[4762]: I1014 13:40:30.444844 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"962acd3ff0b6ad48e4b9d2b9df75fe1ea84024e2516befa75202ec1546a88f25"} err="failed to get container status \"962acd3ff0b6ad48e4b9d2b9df75fe1ea84024e2516befa75202ec1546a88f25\": rpc error: code = NotFound desc = could not find container \"962acd3ff0b6ad48e4b9d2b9df75fe1ea84024e2516befa75202ec1546a88f25\": container with ID starting with 962acd3ff0b6ad48e4b9d2b9df75fe1ea84024e2516befa75202ec1546a88f25 not found: ID does not exist" Oct 14 13:40:30.450335 master-2 kubenswrapper[4762]: I1014 13:40:30.450029 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 14 13:40:30.468447 master-2 kubenswrapper[4762]: I1014 13:40:30.468272 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 14 13:40:30.476307 master-2 kubenswrapper[4762]: I1014 13:40:30.476144 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 14 13:40:30.482630 master-2 kubenswrapper[4762]: I1014 13:40:30.482538 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 14 13:40:30.482984 master-2 kubenswrapper[4762]: E1014 13:40:30.482858 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eed550b4-d873-4c4d-888b-393c8198e192" containerName="mariadb-database-create" Oct 14 13:40:30.482984 master-2 kubenswrapper[4762]: I1014 13:40:30.482872 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="eed550b4-d873-4c4d-888b-393c8198e192" containerName="mariadb-database-create" Oct 14 13:40:30.482984 master-2 kubenswrapper[4762]: E1014 13:40:30.482880 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0add25e0-b2b4-43c3-8c55-723601ab9432" containerName="nova-metadata-log" Oct 14 13:40:30.482984 master-2 kubenswrapper[4762]: I1014 13:40:30.482887 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="0add25e0-b2b4-43c3-8c55-723601ab9432" containerName="nova-metadata-log" Oct 14 13:40:30.482984 master-2 kubenswrapper[4762]: E1014 13:40:30.482906 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a37f47c-bdf2-4fc0-89a6-622a690a7a41" containerName="nova-api-api" Oct 14 13:40:30.482984 master-2 kubenswrapper[4762]: I1014 13:40:30.482912 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a37f47c-bdf2-4fc0-89a6-622a690a7a41" containerName="nova-api-api" Oct 14 13:40:30.482984 master-2 kubenswrapper[4762]: E1014 13:40:30.482929 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0add25e0-b2b4-43c3-8c55-723601ab9432" containerName="nova-metadata-metadata" Oct 14 13:40:30.482984 master-2 kubenswrapper[4762]: I1014 13:40:30.482934 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="0add25e0-b2b4-43c3-8c55-723601ab9432" containerName="nova-metadata-metadata" Oct 14 13:40:30.482984 master-2 
kubenswrapper[4762]: E1014 13:40:30.482942 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a37f47c-bdf2-4fc0-89a6-622a690a7a41" containerName="nova-api-log" Oct 14 13:40:30.482984 master-2 kubenswrapper[4762]: I1014 13:40:30.482948 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a37f47c-bdf2-4fc0-89a6-622a690a7a41" containerName="nova-api-log" Oct 14 13:40:30.483333 master-2 kubenswrapper[4762]: I1014 13:40:30.483084 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="0add25e0-b2b4-43c3-8c55-723601ab9432" containerName="nova-metadata-metadata" Oct 14 13:40:30.483333 master-2 kubenswrapper[4762]: I1014 13:40:30.483094 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a37f47c-bdf2-4fc0-89a6-622a690a7a41" containerName="nova-api-log" Oct 14 13:40:30.483333 master-2 kubenswrapper[4762]: I1014 13:40:30.483109 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a37f47c-bdf2-4fc0-89a6-622a690a7a41" containerName="nova-api-api" Oct 14 13:40:30.483333 master-2 kubenswrapper[4762]: I1014 13:40:30.483121 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="0add25e0-b2b4-43c3-8c55-723601ab9432" containerName="nova-metadata-log" Oct 14 13:40:30.483333 master-2 kubenswrapper[4762]: I1014 13:40:30.483129 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="eed550b4-d873-4c4d-888b-393c8198e192" containerName="mariadb-database-create" Oct 14 13:40:30.484420 master-2 kubenswrapper[4762]: I1014 13:40:30.484382 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 14 13:40:30.488493 master-2 kubenswrapper[4762]: I1014 13:40:30.488120 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 14 13:40:30.491401 master-2 kubenswrapper[4762]: I1014 13:40:30.488794 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 14 13:40:30.499402 master-2 kubenswrapper[4762]: I1014 13:40:30.499335 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 14 13:40:30.513318 master-2 kubenswrapper[4762]: I1014 13:40:30.512996 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 14 13:40:30.514708 master-2 kubenswrapper[4762]: I1014 13:40:30.514664 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 14 13:40:30.518768 master-2 kubenswrapper[4762]: I1014 13:40:30.518722 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 14 13:40:30.518980 master-2 kubenswrapper[4762]: I1014 13:40:30.518944 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 14 13:40:30.519108 master-2 kubenswrapper[4762]: I1014 13:40:30.519072 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 14 13:40:30.550948 master-2 kubenswrapper[4762]: I1014 13:40:30.550879 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 14 13:40:30.575891 master-2 kubenswrapper[4762]: I1014 13:40:30.575809 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc7cf7fc-ba40-40a9-9027-cb95f3c7d03a-logs\") pod \"nova-metadata-0\" (UID: \"bc7cf7fc-ba40-40a9-9027-cb95f3c7d03a\") " pod="openstack/nova-metadata-0" Oct 14 13:40:30.575891 master-2 kubenswrapper[4762]: I1014 13:40:30.575890 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d05629bd-0551-437b-ac2a-aa3cf8a20e9f-config-data\") pod \"nova-api-0\" (UID: \"d05629bd-0551-437b-ac2a-aa3cf8a20e9f\") " pod="openstack/nova-api-0" Oct 14 13:40:30.576352 master-2 kubenswrapper[4762]: I1014 13:40:30.576044 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc7cf7fc-ba40-40a9-9027-cb95f3c7d03a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bc7cf7fc-ba40-40a9-9027-cb95f3c7d03a\") " pod="openstack/nova-metadata-0" Oct 14 13:40:30.576352 master-2 kubenswrapper[4762]: I1014 13:40:30.576144 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d05629bd-0551-437b-ac2a-aa3cf8a20e9f-logs\") pod \"nova-api-0\" (UID: \"d05629bd-0551-437b-ac2a-aa3cf8a20e9f\") " pod="openstack/nova-api-0" Oct 14 13:40:30.576518 master-2 kubenswrapper[4762]: I1014 13:40:30.576464 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d05629bd-0551-437b-ac2a-aa3cf8a20e9f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d05629bd-0551-437b-ac2a-aa3cf8a20e9f\") " pod="openstack/nova-api-0" Oct 14 13:40:30.576694 master-2 kubenswrapper[4762]: I1014 13:40:30.576665 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc7cf7fc-ba40-40a9-9027-cb95f3c7d03a-config-data\") pod \"nova-metadata-0\" (UID: \"bc7cf7fc-ba40-40a9-9027-cb95f3c7d03a\") " pod="openstack/nova-metadata-0" Oct 14 13:40:30.576920 master-2 kubenswrapper[4762]: I1014 13:40:30.576874 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zd2pt\" (UniqueName: \"kubernetes.io/projected/bc7cf7fc-ba40-40a9-9027-cb95f3c7d03a-kube-api-access-zd2pt\") pod \"nova-metadata-0\" (UID: \"bc7cf7fc-ba40-40a9-9027-cb95f3c7d03a\") " pod="openstack/nova-metadata-0" Oct 14 13:40:30.576972 master-2 kubenswrapper[4762]: I1014 13:40:30.576961 4762 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6t254\" (UniqueName: \"kubernetes.io/projected/d05629bd-0551-437b-ac2a-aa3cf8a20e9f-kube-api-access-6t254\") pod \"nova-api-0\" (UID: \"d05629bd-0551-437b-ac2a-aa3cf8a20e9f\") " pod="openstack/nova-api-0" Oct 14 13:40:30.577818 master-2 kubenswrapper[4762]: I1014 13:40:30.577780 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc7cf7fc-ba40-40a9-9027-cb95f3c7d03a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bc7cf7fc-ba40-40a9-9027-cb95f3c7d03a\") " pod="openstack/nova-metadata-0" Oct 14 13:40:30.578040 master-2 kubenswrapper[4762]: I1014 13:40:30.578007 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d05629bd-0551-437b-ac2a-aa3cf8a20e9f-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d05629bd-0551-437b-ac2a-aa3cf8a20e9f\") " pod="openstack/nova-api-0" Oct 14 13:40:30.578087 master-2 kubenswrapper[4762]: I1014 13:40:30.578040 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d05629bd-0551-437b-ac2a-aa3cf8a20e9f-public-tls-certs\") pod \"nova-api-0\" (UID: \"d05629bd-0551-437b-ac2a-aa3cf8a20e9f\") " pod="openstack/nova-api-0" Oct 14 13:40:30.680709 master-2 kubenswrapper[4762]: I1014 13:40:30.680604 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6t254\" (UniqueName: \"kubernetes.io/projected/d05629bd-0551-437b-ac2a-aa3cf8a20e9f-kube-api-access-6t254\") pod \"nova-api-0\" (UID: \"d05629bd-0551-437b-ac2a-aa3cf8a20e9f\") " pod="openstack/nova-api-0" Oct 14 13:40:30.680709 master-2 kubenswrapper[4762]: I1014 13:40:30.680720 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc7cf7fc-ba40-40a9-9027-cb95f3c7d03a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bc7cf7fc-ba40-40a9-9027-cb95f3c7d03a\") " pod="openstack/nova-metadata-0" Oct 14 13:40:30.681051 master-2 kubenswrapper[4762]: I1014 13:40:30.680787 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d05629bd-0551-437b-ac2a-aa3cf8a20e9f-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d05629bd-0551-437b-ac2a-aa3cf8a20e9f\") " pod="openstack/nova-api-0" Oct 14 13:40:30.681051 master-2 kubenswrapper[4762]: I1014 13:40:30.680814 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d05629bd-0551-437b-ac2a-aa3cf8a20e9f-public-tls-certs\") pod \"nova-api-0\" (UID: \"d05629bd-0551-437b-ac2a-aa3cf8a20e9f\") " pod="openstack/nova-api-0" Oct 14 13:40:30.681051 master-2 kubenswrapper[4762]: I1014 13:40:30.680857 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc7cf7fc-ba40-40a9-9027-cb95f3c7d03a-logs\") pod \"nova-metadata-0\" (UID: \"bc7cf7fc-ba40-40a9-9027-cb95f3c7d03a\") " pod="openstack/nova-metadata-0" Oct 14 13:40:30.681051 master-2 kubenswrapper[4762]: I1014 13:40:30.680882 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d05629bd-0551-437b-ac2a-aa3cf8a20e9f-config-data\") pod \"nova-api-0\" (UID: \"d05629bd-0551-437b-ac2a-aa3cf8a20e9f\") " pod="openstack/nova-api-0" Oct 14 13:40:30.681051 master-2 kubenswrapper[4762]: I1014 13:40:30.680903 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc7cf7fc-ba40-40a9-9027-cb95f3c7d03a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bc7cf7fc-ba40-40a9-9027-cb95f3c7d03a\") " pod="openstack/nova-metadata-0" Oct 14 13:40:30.681051 master-2 kubenswrapper[4762]: I1014 13:40:30.680938 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d05629bd-0551-437b-ac2a-aa3cf8a20e9f-logs\") pod \"nova-api-0\" (UID: \"d05629bd-0551-437b-ac2a-aa3cf8a20e9f\") " pod="openstack/nova-api-0" Oct 14 13:40:30.681051 master-2 kubenswrapper[4762]: I1014 13:40:30.680967 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d05629bd-0551-437b-ac2a-aa3cf8a20e9f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d05629bd-0551-437b-ac2a-aa3cf8a20e9f\") " pod="openstack/nova-api-0" Oct 14 13:40:30.681051 master-2 kubenswrapper[4762]: I1014 13:40:30.680999 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc7cf7fc-ba40-40a9-9027-cb95f3c7d03a-config-data\") pod \"nova-metadata-0\" (UID: \"bc7cf7fc-ba40-40a9-9027-cb95f3c7d03a\") " pod="openstack/nova-metadata-0" Oct 14 13:40:30.681051 master-2 kubenswrapper[4762]: I1014 13:40:30.681036 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zd2pt\" (UniqueName: \"kubernetes.io/projected/bc7cf7fc-ba40-40a9-9027-cb95f3c7d03a-kube-api-access-zd2pt\") pod \"nova-metadata-0\" (UID: \"bc7cf7fc-ba40-40a9-9027-cb95f3c7d03a\") " pod="openstack/nova-metadata-0" Oct 14 13:40:30.683734 master-2 kubenswrapper[4762]: I1014 13:40:30.683691 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d05629bd-0551-437b-ac2a-aa3cf8a20e9f-logs\") pod \"nova-api-0\" (UID: \"d05629bd-0551-437b-ac2a-aa3cf8a20e9f\") " pod="openstack/nova-api-0" Oct 14 13:40:30.684009 master-2 kubenswrapper[4762]: I1014 13:40:30.683969 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc7cf7fc-ba40-40a9-9027-cb95f3c7d03a-logs\") pod \"nova-metadata-0\" (UID: \"bc7cf7fc-ba40-40a9-9027-cb95f3c7d03a\") " pod="openstack/nova-metadata-0" Oct 14 13:40:30.686723 master-2 kubenswrapper[4762]: I1014 13:40:30.686672 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d05629bd-0551-437b-ac2a-aa3cf8a20e9f-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d05629bd-0551-437b-ac2a-aa3cf8a20e9f\") " pod="openstack/nova-api-0" Oct 14 13:40:30.687633 master-2 kubenswrapper[4762]: I1014 13:40:30.687389 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc7cf7fc-ba40-40a9-9027-cb95f3c7d03a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bc7cf7fc-ba40-40a9-9027-cb95f3c7d03a\") " pod="openstack/nova-metadata-0" Oct 14 13:40:30.687633 master-2 kubenswrapper[4762]: I1014 13:40:30.687578 4762 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc7cf7fc-ba40-40a9-9027-cb95f3c7d03a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bc7cf7fc-ba40-40a9-9027-cb95f3c7d03a\") " pod="openstack/nova-metadata-0" Oct 14 13:40:30.688467 master-2 kubenswrapper[4762]: I1014 13:40:30.688378 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d05629bd-0551-437b-ac2a-aa3cf8a20e9f-config-data\") pod \"nova-api-0\" (UID: \"d05629bd-0551-437b-ac2a-aa3cf8a20e9f\") " pod="openstack/nova-api-0" Oct 14 13:40:30.688779 master-2 kubenswrapper[4762]: I1014 13:40:30.688720 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d05629bd-0551-437b-ac2a-aa3cf8a20e9f-public-tls-certs\") pod \"nova-api-0\" (UID: \"d05629bd-0551-437b-ac2a-aa3cf8a20e9f\") " pod="openstack/nova-api-0" Oct 14 13:40:30.689320 master-2 kubenswrapper[4762]: I1014 13:40:30.689274 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d05629bd-0551-437b-ac2a-aa3cf8a20e9f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d05629bd-0551-437b-ac2a-aa3cf8a20e9f\") " pod="openstack/nova-api-0" Oct 14 13:40:30.689531 master-2 kubenswrapper[4762]: I1014 13:40:30.689483 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc7cf7fc-ba40-40a9-9027-cb95f3c7d03a-config-data\") pod \"nova-metadata-0\" (UID: \"bc7cf7fc-ba40-40a9-9027-cb95f3c7d03a\") " pod="openstack/nova-metadata-0" Oct 14 13:40:30.703800 master-2 kubenswrapper[4762]: I1014 13:40:30.703736 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6t254\" (UniqueName: \"kubernetes.io/projected/d05629bd-0551-437b-ac2a-aa3cf8a20e9f-kube-api-access-6t254\") pod \"nova-api-0\" (UID: \"d05629bd-0551-437b-ac2a-aa3cf8a20e9f\") " pod="openstack/nova-api-0" Oct 14 13:40:30.711372 master-2 kubenswrapper[4762]: I1014 13:40:30.711324 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zd2pt\" (UniqueName: \"kubernetes.io/projected/bc7cf7fc-ba40-40a9-9027-cb95f3c7d03a-kube-api-access-zd2pt\") pod \"nova-metadata-0\" (UID: \"bc7cf7fc-ba40-40a9-9027-cb95f3c7d03a\") " pod="openstack/nova-metadata-0" Oct 14 13:40:30.825809 master-2 kubenswrapper[4762]: I1014 13:40:30.825726 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 14 13:40:30.841117 master-2 kubenswrapper[4762]: I1014 13:40:30.841013 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 14 13:40:31.270347 master-2 kubenswrapper[4762]: I1014 13:40:31.269981 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 14 13:40:31.278184 master-2 kubenswrapper[4762]: W1014 13:40:31.278117 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc7cf7fc_ba40_40a9_9027_cb95f3c7d03a.slice/crio-7ca0312285f6175433258e3c22949195a82721ee217b8a3972a53f530ba1de88 WatchSource:0}: Error finding container 7ca0312285f6175433258e3c22949195a82721ee217b8a3972a53f530ba1de88: Status 404 returned error can't find the container with id 7ca0312285f6175433258e3c22949195a82721ee217b8a3972a53f530ba1de88 Oct 14 13:40:31.403739 master-2 kubenswrapper[4762]: I1014 13:40:31.403673 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 14 13:40:31.409677 master-2 kubenswrapper[4762]: I1014 13:40:31.406214 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bc7cf7fc-ba40-40a9-9027-cb95f3c7d03a","Type":"ContainerStarted","Data":"7ca0312285f6175433258e3c22949195a82721ee217b8a3972a53f530ba1de88"} Oct 14 13:40:31.412729 master-2 kubenswrapper[4762]: W1014 13:40:31.412660 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd05629bd_0551_437b_ac2a_aa3cf8a20e9f.slice/crio-56f98460f6b0bed2460f9f18e0485446d92dabb654248396352a58e388bb425b WatchSource:0}: Error finding container 56f98460f6b0bed2460f9f18e0485446d92dabb654248396352a58e388bb425b: Status 404 returned error can't find the container with id 56f98460f6b0bed2460f9f18e0485446d92dabb654248396352a58e388bb425b Oct 14 13:40:31.566210 master-2 kubenswrapper[4762]: I1014 13:40:31.563726 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0add25e0-b2b4-43c3-8c55-723601ab9432" path="/var/lib/kubelet/pods/0add25e0-b2b4-43c3-8c55-723601ab9432/volumes" Oct 14 13:40:31.566210 master-2 kubenswrapper[4762]: I1014 13:40:31.565378 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a37f47c-bdf2-4fc0-89a6-622a690a7a41" path="/var/lib/kubelet/pods/7a37f47c-bdf2-4fc0-89a6-622a690a7a41/volumes" Oct 14 13:40:32.416400 master-2 kubenswrapper[4762]: I1014 13:40:32.416305 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bc7cf7fc-ba40-40a9-9027-cb95f3c7d03a","Type":"ContainerStarted","Data":"204a0d34c22d6c0ae23bcbb4fa8f87564efabbb1216937fb5c1db8b0bcad9d5f"} Oct 14 13:40:32.417034 master-2 kubenswrapper[4762]: I1014 13:40:32.417014 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bc7cf7fc-ba40-40a9-9027-cb95f3c7d03a","Type":"ContainerStarted","Data":"36d67aef9a1f62077b593bb873b03fed8ed85aa2f860fb9763cbfdaffc2d1762"} Oct 14 13:40:32.421627 master-2 kubenswrapper[4762]: I1014 13:40:32.421569 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d05629bd-0551-437b-ac2a-aa3cf8a20e9f","Type":"ContainerStarted","Data":"197d19219fa9730bdfeeaa7c1f221c304f99450801da3c8980dcc2f0b80ad68a"} Oct 14 13:40:32.421767 master-2 kubenswrapper[4762]: I1014 13:40:32.421626 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"d05629bd-0551-437b-ac2a-aa3cf8a20e9f","Type":"ContainerStarted","Data":"18161e369315540931cf205c198e646bba78327376b1d7d7f8c08c8bc4ca22cf"} Oct 14 13:40:32.421767 master-2 kubenswrapper[4762]: I1014 13:40:32.421647 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d05629bd-0551-437b-ac2a-aa3cf8a20e9f","Type":"ContainerStarted","Data":"56f98460f6b0bed2460f9f18e0485446d92dabb654248396352a58e388bb425b"} Oct 14 13:40:32.455063 master-2 kubenswrapper[4762]: I1014 13:40:32.454978 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.4549525 podStartE2EDuration="2.4549525s" podCreationTimestamp="2025-10-14 13:40:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:40:32.445503265 +0000 UTC m=+2061.689662434" watchObservedRunningTime="2025-10-14 13:40:32.4549525 +0000 UTC m=+2061.699111659" Oct 14 13:40:32.483132 master-2 kubenswrapper[4762]: I1014 13:40:32.483006 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.482976034 podStartE2EDuration="2.482976034s" podCreationTimestamp="2025-10-14 13:40:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:40:32.475859391 +0000 UTC m=+2061.720018560" watchObservedRunningTime="2025-10-14 13:40:32.482976034 +0000 UTC m=+2061.727135203" Oct 14 13:40:34.564821 master-2 kubenswrapper[4762]: I1014 13:40:34.564764 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 14 13:40:35.826004 master-2 kubenswrapper[4762]: I1014 13:40:35.825853 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 14 13:40:35.826004 master-2 kubenswrapper[4762]: I1014 13:40:35.825954 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 14 13:40:37.119396 master-2 kubenswrapper[4762]: I1014 13:40:37.119327 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-c846-account-create-svw8b"] Oct 14 13:40:37.121000 master-2 kubenswrapper[4762]: I1014 13:40:37.120724 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-c846-account-create-svw8b" Oct 14 13:40:37.124204 master-2 kubenswrapper[4762]: I1014 13:40:37.123910 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-db-secret" Oct 14 13:40:37.135274 master-2 kubenswrapper[4762]: I1014 13:40:37.134973 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-c846-account-create-svw8b"] Oct 14 13:40:37.228695 master-2 kubenswrapper[4762]: I1014 13:40:37.228626 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwdbt\" (UniqueName: \"kubernetes.io/projected/680e6d7c-db69-400d-ae0d-48b947c3f9ce-kube-api-access-xwdbt\") pod \"octavia-c846-account-create-svw8b\" (UID: \"680e6d7c-db69-400d-ae0d-48b947c3f9ce\") " pod="openstack/octavia-c846-account-create-svw8b" Oct 14 13:40:37.330934 master-2 kubenswrapper[4762]: I1014 13:40:37.330866 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwdbt\" (UniqueName: \"kubernetes.io/projected/680e6d7c-db69-400d-ae0d-48b947c3f9ce-kube-api-access-xwdbt\") pod \"octavia-c846-account-create-svw8b\" (UID: \"680e6d7c-db69-400d-ae0d-48b947c3f9ce\") " pod="openstack/octavia-c846-account-create-svw8b" Oct 14 13:40:37.351602 master-2 kubenswrapper[4762]: I1014 13:40:37.351522 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwdbt\" (UniqueName: \"kubernetes.io/projected/680e6d7c-db69-400d-ae0d-48b947c3f9ce-kube-api-access-xwdbt\") pod \"octavia-c846-account-create-svw8b\" (UID: \"680e6d7c-db69-400d-ae0d-48b947c3f9ce\") " pod="openstack/octavia-c846-account-create-svw8b" Oct 14 13:40:37.447856 master-2 kubenswrapper[4762]: I1014 13:40:37.447758 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-c846-account-create-svw8b" Oct 14 13:40:37.939149 master-2 kubenswrapper[4762]: I1014 13:40:37.939106 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-c846-account-create-svw8b"] Oct 14 13:40:37.947473 master-2 kubenswrapper[4762]: W1014 13:40:37.947414 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod680e6d7c_db69_400d_ae0d_48b947c3f9ce.slice/crio-7013b58b5d2ecdb20d6a110118a8e5e7963a7b6a6b3e40c9d0fa57a5bf5ebf8a WatchSource:0}: Error finding container 7013b58b5d2ecdb20d6a110118a8e5e7963a7b6a6b3e40c9d0fa57a5bf5ebf8a: Status 404 returned error can't find the container with id 7013b58b5d2ecdb20d6a110118a8e5e7963a7b6a6b3e40c9d0fa57a5bf5ebf8a Oct 14 13:40:38.488180 master-2 kubenswrapper[4762]: I1014 13:40:38.488007 4762 generic.go:334] "Generic (PLEG): container finished" podID="680e6d7c-db69-400d-ae0d-48b947c3f9ce" containerID="338d30a71818f78d0b693c8bea7531f50bff904cec21d1a99cf096d8462a25e6" exitCode=0 Oct 14 13:40:38.488180 master-2 kubenswrapper[4762]: I1014 13:40:38.488069 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-c846-account-create-svw8b" event={"ID":"680e6d7c-db69-400d-ae0d-48b947c3f9ce","Type":"ContainerDied","Data":"338d30a71818f78d0b693c8bea7531f50bff904cec21d1a99cf096d8462a25e6"} Oct 14 13:40:38.488180 master-2 kubenswrapper[4762]: I1014 13:40:38.488096 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-c846-account-create-svw8b" event={"ID":"680e6d7c-db69-400d-ae0d-48b947c3f9ce","Type":"ContainerStarted","Data":"7013b58b5d2ecdb20d6a110118a8e5e7963a7b6a6b3e40c9d0fa57a5bf5ebf8a"} Oct 14 13:40:39.981033 master-2 kubenswrapper[4762]: I1014 13:40:39.980478 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-c846-account-create-svw8b" Oct 14 13:40:40.083760 master-2 kubenswrapper[4762]: I1014 13:40:40.083654 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwdbt\" (UniqueName: \"kubernetes.io/projected/680e6d7c-db69-400d-ae0d-48b947c3f9ce-kube-api-access-xwdbt\") pod \"680e6d7c-db69-400d-ae0d-48b947c3f9ce\" (UID: \"680e6d7c-db69-400d-ae0d-48b947c3f9ce\") " Oct 14 13:40:40.089009 master-2 kubenswrapper[4762]: I1014 13:40:40.088923 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/680e6d7c-db69-400d-ae0d-48b947c3f9ce-kube-api-access-xwdbt" (OuterVolumeSpecName: "kube-api-access-xwdbt") pod "680e6d7c-db69-400d-ae0d-48b947c3f9ce" (UID: "680e6d7c-db69-400d-ae0d-48b947c3f9ce"). InnerVolumeSpecName "kube-api-access-xwdbt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:40:40.187554 master-2 kubenswrapper[4762]: I1014 13:40:40.187457 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwdbt\" (UniqueName: \"kubernetes.io/projected/680e6d7c-db69-400d-ae0d-48b947c3f9ce-kube-api-access-xwdbt\") on node \"master-2\" DevicePath \"\"" Oct 14 13:40:40.507947 master-2 kubenswrapper[4762]: I1014 13:40:40.507827 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-c846-account-create-svw8b" event={"ID":"680e6d7c-db69-400d-ae0d-48b947c3f9ce","Type":"ContainerDied","Data":"7013b58b5d2ecdb20d6a110118a8e5e7963a7b6a6b3e40c9d0fa57a5bf5ebf8a"} Oct 14 13:40:40.508209 master-2 kubenswrapper[4762]: I1014 13:40:40.508191 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7013b58b5d2ecdb20d6a110118a8e5e7963a7b6a6b3e40c9d0fa57a5bf5ebf8a" Oct 14 13:40:40.508304 master-2 kubenswrapper[4762]: I1014 13:40:40.507917 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-c846-account-create-svw8b" Oct 14 13:40:40.826600 master-2 kubenswrapper[4762]: I1014 13:40:40.826444 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 14 13:40:40.827003 master-2 kubenswrapper[4762]: I1014 13:40:40.826979 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 14 13:40:40.843176 master-2 kubenswrapper[4762]: I1014 13:40:40.841777 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 14 13:40:40.843176 master-2 kubenswrapper[4762]: I1014 13:40:40.841839 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 14 13:40:41.844429 master-2 kubenswrapper[4762]: I1014 13:40:41.844340 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="bc7cf7fc-ba40-40a9-9027-cb95f3c7d03a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.129.0.175:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 14 13:40:41.845097 master-2 kubenswrapper[4762]: I1014 13:40:41.844313 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="bc7cf7fc-ba40-40a9-9027-cb95f3c7d03a" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.129.0.175:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 14 13:40:41.860324 master-2 kubenswrapper[4762]: I1014 13:40:41.860264 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d05629bd-0551-437b-ac2a-aa3cf8a20e9f" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.129.0.176:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 14 13:40:41.860324 master-2 kubenswrapper[4762]: I1014 13:40:41.860326 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d05629bd-0551-437b-ac2a-aa3cf8a20e9f" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.129.0.176:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 14 13:40:42.983449 master-2 kubenswrapper[4762]: I1014 13:40:42.983370 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-persistence-db-create-kk5cv"] Oct 14 
13:40:42.983982 master-2 kubenswrapper[4762]: E1014 13:40:42.983679 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="680e6d7c-db69-400d-ae0d-48b947c3f9ce" containerName="mariadb-account-create" Oct 14 13:40:42.983982 master-2 kubenswrapper[4762]: I1014 13:40:42.983693 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="680e6d7c-db69-400d-ae0d-48b947c3f9ce" containerName="mariadb-account-create" Oct 14 13:40:42.983982 master-2 kubenswrapper[4762]: I1014 13:40:42.983849 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="680e6d7c-db69-400d-ae0d-48b947c3f9ce" containerName="mariadb-account-create" Oct 14 13:40:42.984655 master-2 kubenswrapper[4762]: I1014 13:40:42.984625 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-kk5cv" Oct 14 13:40:42.998687 master-2 kubenswrapper[4762]: I1014 13:40:42.998642 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-kk5cv"] Oct 14 13:40:43.053296 master-2 kubenswrapper[4762]: I1014 13:40:43.053188 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmdz9\" (UniqueName: \"kubernetes.io/projected/b793f469-b40b-47d2-a91c-5e8d4e9df87e-kube-api-access-xmdz9\") pod \"octavia-persistence-db-create-kk5cv\" (UID: \"b793f469-b40b-47d2-a91c-5e8d4e9df87e\") " pod="openstack/octavia-persistence-db-create-kk5cv" Oct 14 13:40:43.155580 master-2 kubenswrapper[4762]: I1014 13:40:43.155506 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmdz9\" (UniqueName: \"kubernetes.io/projected/b793f469-b40b-47d2-a91c-5e8d4e9df87e-kube-api-access-xmdz9\") pod \"octavia-persistence-db-create-kk5cv\" (UID: \"b793f469-b40b-47d2-a91c-5e8d4e9df87e\") " pod="openstack/octavia-persistence-db-create-kk5cv" Oct 14 13:40:43.179602 master-2 kubenswrapper[4762]: I1014 13:40:43.179519 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmdz9\" (UniqueName: \"kubernetes.io/projected/b793f469-b40b-47d2-a91c-5e8d4e9df87e-kube-api-access-xmdz9\") pod \"octavia-persistence-db-create-kk5cv\" (UID: \"b793f469-b40b-47d2-a91c-5e8d4e9df87e\") " pod="openstack/octavia-persistence-db-create-kk5cv" Oct 14 13:40:43.300699 master-2 kubenswrapper[4762]: I1014 13:40:43.300493 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-persistence-db-create-kk5cv" Oct 14 13:40:43.767933 master-2 kubenswrapper[4762]: I1014 13:40:43.767875 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-kk5cv"] Oct 14 13:40:44.548618 master-2 kubenswrapper[4762]: I1014 13:40:44.548535 4762 generic.go:334] "Generic (PLEG): container finished" podID="b793f469-b40b-47d2-a91c-5e8d4e9df87e" containerID="d48be3acb914a9f94b263a422ff616d7d49a67feef08588b866f49fad73c772c" exitCode=0 Oct 14 13:40:44.548618 master-2 kubenswrapper[4762]: I1014 13:40:44.548608 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-kk5cv" event={"ID":"b793f469-b40b-47d2-a91c-5e8d4e9df87e","Type":"ContainerDied","Data":"d48be3acb914a9f94b263a422ff616d7d49a67feef08588b866f49fad73c772c"} Oct 14 13:40:44.549305 master-2 kubenswrapper[4762]: I1014 13:40:44.548655 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-kk5cv" event={"ID":"b793f469-b40b-47d2-a91c-5e8d4e9df87e","Type":"ContainerStarted","Data":"3d066e8bfbac4c1262ff9fd5b8b2a4fe9ec71ee87a2385c21896b0749c6e075f"} Oct 14 13:40:45.964926 master-2 kubenswrapper[4762]: I1014 13:40:45.964846 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-kk5cv" Oct 14 13:40:46.022608 master-2 kubenswrapper[4762]: I1014 13:40:46.022518 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmdz9\" (UniqueName: \"kubernetes.io/projected/b793f469-b40b-47d2-a91c-5e8d4e9df87e-kube-api-access-xmdz9\") pod \"b793f469-b40b-47d2-a91c-5e8d4e9df87e\" (UID: \"b793f469-b40b-47d2-a91c-5e8d4e9df87e\") " Oct 14 13:40:46.027519 master-2 kubenswrapper[4762]: I1014 13:40:46.027462 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b793f469-b40b-47d2-a91c-5e8d4e9df87e-kube-api-access-xmdz9" (OuterVolumeSpecName: "kube-api-access-xmdz9") pod "b793f469-b40b-47d2-a91c-5e8d4e9df87e" (UID: "b793f469-b40b-47d2-a91c-5e8d4e9df87e"). InnerVolumeSpecName "kube-api-access-xmdz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:40:46.125721 master-2 kubenswrapper[4762]: I1014 13:40:46.125622 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmdz9\" (UniqueName: \"kubernetes.io/projected/b793f469-b40b-47d2-a91c-5e8d4e9df87e-kube-api-access-xmdz9\") on node \"master-2\" DevicePath \"\"" Oct 14 13:40:46.574114 master-2 kubenswrapper[4762]: I1014 13:40:46.574022 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-kk5cv" event={"ID":"b793f469-b40b-47d2-a91c-5e8d4e9df87e","Type":"ContainerDied","Data":"3d066e8bfbac4c1262ff9fd5b8b2a4fe9ec71ee87a2385c21896b0749c6e075f"} Oct 14 13:40:46.574114 master-2 kubenswrapper[4762]: I1014 13:40:46.574088 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d066e8bfbac4c1262ff9fd5b8b2a4fe9ec71ee87a2385c21896b0749c6e075f" Oct 14 13:40:46.574718 master-2 kubenswrapper[4762]: I1014 13:40:46.574130 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-persistence-db-create-kk5cv" Oct 14 13:40:50.835285 master-2 kubenswrapper[4762]: I1014 13:40:50.835213 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 14 13:40:50.836520 master-2 kubenswrapper[4762]: I1014 13:40:50.835805 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 14 13:40:50.844645 master-2 kubenswrapper[4762]: I1014 13:40:50.844576 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 14 13:40:50.849667 master-2 kubenswrapper[4762]: I1014 13:40:50.849568 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 14 13:40:50.850305 master-2 kubenswrapper[4762]: I1014 13:40:50.850225 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 14 13:40:50.853001 master-2 kubenswrapper[4762]: I1014 13:40:50.852944 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 14 13:40:50.860251 master-2 kubenswrapper[4762]: I1014 13:40:50.860183 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 14 13:40:51.649717 master-2 kubenswrapper[4762]: I1014 13:40:51.649573 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 14 13:40:51.657244 master-2 kubenswrapper[4762]: I1014 13:40:51.657180 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 14 13:40:51.658387 master-2 kubenswrapper[4762]: I1014 13:40:51.658326 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 14 13:40:53.655455 master-2 kubenswrapper[4762]: I1014 13:40:53.655362 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-094b-account-create-82gt8"] Oct 14 13:40:53.656088 master-2 kubenswrapper[4762]: E1014 13:40:53.655881 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b793f469-b40b-47d2-a91c-5e8d4e9df87e" containerName="mariadb-database-create" Oct 14 13:40:53.656088 master-2 kubenswrapper[4762]: I1014 13:40:53.655913 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="b793f469-b40b-47d2-a91c-5e8d4e9df87e" containerName="mariadb-database-create" Oct 14 13:40:53.656293 master-2 kubenswrapper[4762]: I1014 13:40:53.656258 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="b793f469-b40b-47d2-a91c-5e8d4e9df87e" containerName="mariadb-database-create" Oct 14 13:40:53.657562 master-2 kubenswrapper[4762]: I1014 13:40:53.657502 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-094b-account-create-82gt8" Oct 14 13:40:53.660431 master-2 kubenswrapper[4762]: I1014 13:40:53.660364 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-persistence-db-secret" Oct 14 13:40:53.675684 master-2 kubenswrapper[4762]: I1014 13:40:53.675620 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-094b-account-create-82gt8"] Oct 14 13:40:53.784052 master-2 kubenswrapper[4762]: I1014 13:40:53.783971 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snh6b\" (UniqueName: \"kubernetes.io/projected/8a8ac38d-e49d-4b09-a38c-0307bffbf226-kube-api-access-snh6b\") pod \"octavia-094b-account-create-82gt8\" (UID: \"8a8ac38d-e49d-4b09-a38c-0307bffbf226\") " pod="openstack/octavia-094b-account-create-82gt8" Oct 14 13:40:53.886371 master-2 kubenswrapper[4762]: I1014 13:40:53.886320 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snh6b\" (UniqueName: \"kubernetes.io/projected/8a8ac38d-e49d-4b09-a38c-0307bffbf226-kube-api-access-snh6b\") pod \"octavia-094b-account-create-82gt8\" (UID: \"8a8ac38d-e49d-4b09-a38c-0307bffbf226\") " pod="openstack/octavia-094b-account-create-82gt8" Oct 14 13:40:53.920915 master-2 kubenswrapper[4762]: I1014 13:40:53.920848 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snh6b\" (UniqueName: \"kubernetes.io/projected/8a8ac38d-e49d-4b09-a38c-0307bffbf226-kube-api-access-snh6b\") pod \"octavia-094b-account-create-82gt8\" (UID: \"8a8ac38d-e49d-4b09-a38c-0307bffbf226\") " pod="openstack/octavia-094b-account-create-82gt8" Oct 14 13:40:53.981247 master-2 kubenswrapper[4762]: I1014 13:40:53.980436 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-094b-account-create-82gt8" Oct 14 13:40:54.526205 master-2 kubenswrapper[4762]: I1014 13:40:54.524613 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-094b-account-create-82gt8"] Oct 14 13:40:54.530581 master-2 kubenswrapper[4762]: W1014 13:40:54.530523 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a8ac38d_e49d_4b09_a38c_0307bffbf226.slice/crio-b644f2908364cb98fea071e96c3271da070425cb140603d801c504d2a2a23b2a WatchSource:0}: Error finding container b644f2908364cb98fea071e96c3271da070425cb140603d801c504d2a2a23b2a: Status 404 returned error can't find the container with id b644f2908364cb98fea071e96c3271da070425cb140603d801c504d2a2a23b2a Oct 14 13:40:54.708321 master-2 kubenswrapper[4762]: I1014 13:40:54.708242 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-094b-account-create-82gt8" event={"ID":"8a8ac38d-e49d-4b09-a38c-0307bffbf226","Type":"ContainerStarted","Data":"f2ba0ea80781847190c2246217570064f177caf036808698d8ece2f4a8274784"} Oct 14 13:40:54.708321 master-2 kubenswrapper[4762]: I1014 13:40:54.708309 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-094b-account-create-82gt8" event={"ID":"8a8ac38d-e49d-4b09-a38c-0307bffbf226","Type":"ContainerStarted","Data":"b644f2908364cb98fea071e96c3271da070425cb140603d801c504d2a2a23b2a"} Oct 14 13:40:54.737758 master-2 kubenswrapper[4762]: I1014 13:40:54.737482 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-094b-account-create-82gt8" podStartSLOduration=1.7374484730000002 podStartE2EDuration="1.737448473s" podCreationTimestamp="2025-10-14 13:40:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:40:54.728650483 +0000 UTC m=+2083.972809652" watchObservedRunningTime="2025-10-14 13:40:54.737448473 +0000 UTC m=+2083.981607632" Oct 14 13:40:55.717751 master-2 kubenswrapper[4762]: I1014 13:40:55.717663 4762 generic.go:334] "Generic (PLEG): container finished" podID="8a8ac38d-e49d-4b09-a38c-0307bffbf226" containerID="f2ba0ea80781847190c2246217570064f177caf036808698d8ece2f4a8274784" exitCode=0 Oct 14 13:40:55.717751 master-2 kubenswrapper[4762]: I1014 13:40:55.717717 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-094b-account-create-82gt8" event={"ID":"8a8ac38d-e49d-4b09-a38c-0307bffbf226","Type":"ContainerDied","Data":"f2ba0ea80781847190c2246217570064f177caf036808698d8ece2f4a8274784"} Oct 14 13:40:57.193567 master-2 kubenswrapper[4762]: I1014 13:40:57.193486 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-094b-account-create-82gt8" Oct 14 13:40:57.364323 master-2 kubenswrapper[4762]: I1014 13:40:57.364140 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snh6b\" (UniqueName: \"kubernetes.io/projected/8a8ac38d-e49d-4b09-a38c-0307bffbf226-kube-api-access-snh6b\") pod \"8a8ac38d-e49d-4b09-a38c-0307bffbf226\" (UID: \"8a8ac38d-e49d-4b09-a38c-0307bffbf226\") " Oct 14 13:40:57.367433 master-2 kubenswrapper[4762]: I1014 13:40:57.367359 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a8ac38d-e49d-4b09-a38c-0307bffbf226-kube-api-access-snh6b" (OuterVolumeSpecName: "kube-api-access-snh6b") pod "8a8ac38d-e49d-4b09-a38c-0307bffbf226" (UID: "8a8ac38d-e49d-4b09-a38c-0307bffbf226"). InnerVolumeSpecName "kube-api-access-snh6b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:40:57.466503 master-2 kubenswrapper[4762]: I1014 13:40:57.466440 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snh6b\" (UniqueName: \"kubernetes.io/projected/8a8ac38d-e49d-4b09-a38c-0307bffbf226-kube-api-access-snh6b\") on node \"master-2\" DevicePath \"\"" Oct 14 13:40:57.739327 master-2 kubenswrapper[4762]: I1014 13:40:57.739246 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-094b-account-create-82gt8" event={"ID":"8a8ac38d-e49d-4b09-a38c-0307bffbf226","Type":"ContainerDied","Data":"b644f2908364cb98fea071e96c3271da070425cb140603d801c504d2a2a23b2a"} Oct 14 13:40:57.739327 master-2 kubenswrapper[4762]: I1014 13:40:57.739310 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-094b-account-create-82gt8" Oct 14 13:40:57.739614 master-2 kubenswrapper[4762]: I1014 13:40:57.739313 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b644f2908364cb98fea071e96c3271da070425cb140603d801c504d2a2a23b2a" Oct 14 13:40:59.585027 master-2 kubenswrapper[4762]: I1014 13:40:59.584936 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-api-6dc54675fc-lvsb4"] Oct 14 13:40:59.585777 master-2 kubenswrapper[4762]: E1014 13:40:59.585425 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a8ac38d-e49d-4b09-a38c-0307bffbf226" containerName="mariadb-account-create" Oct 14 13:40:59.585777 master-2 kubenswrapper[4762]: I1014 13:40:59.585447 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a8ac38d-e49d-4b09-a38c-0307bffbf226" containerName="mariadb-account-create" Oct 14 13:40:59.585777 master-2 kubenswrapper[4762]: I1014 13:40:59.585694 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a8ac38d-e49d-4b09-a38c-0307bffbf226" containerName="mariadb-account-create" Oct 14 13:40:59.590128 master-2 kubenswrapper[4762]: I1014 13:40:59.590076 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-api-6dc54675fc-lvsb4" Oct 14 13:40:59.616024 master-2 kubenswrapper[4762]: I1014 13:40:59.615947 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-scripts" Oct 14 13:40:59.616313 master-2 kubenswrapper[4762]: I1014 13:40:59.616147 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-config-data" Oct 14 13:40:59.616629 master-2 kubenswrapper[4762]: I1014 13:40:59.616596 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-octavia-ovndbs" Oct 14 13:40:59.631314 master-2 kubenswrapper[4762]: I1014 13:40:59.631203 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-6dc54675fc-lvsb4"] Oct 14 13:40:59.711222 master-2 kubenswrapper[4762]: I1014 13:40:59.711080 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35e45c95-9f29-4266-828c-d8cc7e37c091-scripts\") pod \"octavia-api-6dc54675fc-lvsb4\" (UID: \"35e45c95-9f29-4266-828c-d8cc7e37c091\") " pod="openstack/octavia-api-6dc54675fc-lvsb4" Oct 14 13:40:59.711425 master-2 kubenswrapper[4762]: I1014 13:40:59.711344 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/35e45c95-9f29-4266-828c-d8cc7e37c091-ovndb-tls-certs\") pod \"octavia-api-6dc54675fc-lvsb4\" (UID: \"35e45c95-9f29-4266-828c-d8cc7e37c091\") " pod="openstack/octavia-api-6dc54675fc-lvsb4" Oct 14 13:40:59.711535 master-2 kubenswrapper[4762]: I1014 13:40:59.711501 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/35e45c95-9f29-4266-828c-d8cc7e37c091-config-data-merged\") pod \"octavia-api-6dc54675fc-lvsb4\" (UID: \"35e45c95-9f29-4266-828c-d8cc7e37c091\") " pod="openstack/octavia-api-6dc54675fc-lvsb4" Oct 14 13:40:59.711685 master-2 kubenswrapper[4762]: I1014 13:40:59.711622 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/35e45c95-9f29-4266-828c-d8cc7e37c091-octavia-run\") pod \"octavia-api-6dc54675fc-lvsb4\" (UID: \"35e45c95-9f29-4266-828c-d8cc7e37c091\") " pod="openstack/octavia-api-6dc54675fc-lvsb4" Oct 14 13:40:59.711738 master-2 kubenswrapper[4762]: I1014 13:40:59.711721 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35e45c95-9f29-4266-828c-d8cc7e37c091-combined-ca-bundle\") pod \"octavia-api-6dc54675fc-lvsb4\" (UID: \"35e45c95-9f29-4266-828c-d8cc7e37c091\") " pod="openstack/octavia-api-6dc54675fc-lvsb4" Oct 14 13:40:59.711774 master-2 kubenswrapper[4762]: I1014 13:40:59.711752 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35e45c95-9f29-4266-828c-d8cc7e37c091-config-data\") pod \"octavia-api-6dc54675fc-lvsb4\" (UID: \"35e45c95-9f29-4266-828c-d8cc7e37c091\") " pod="openstack/octavia-api-6dc54675fc-lvsb4" Oct 14 13:40:59.813588 master-2 kubenswrapper[4762]: I1014 13:40:59.813520 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/35e45c95-9f29-4266-828c-d8cc7e37c091-config-data-merged\") 
pod \"octavia-api-6dc54675fc-lvsb4\" (UID: \"35e45c95-9f29-4266-828c-d8cc7e37c091\") " pod="openstack/octavia-api-6dc54675fc-lvsb4" Oct 14 13:40:59.813937 master-2 kubenswrapper[4762]: I1014 13:40:59.813651 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/35e45c95-9f29-4266-828c-d8cc7e37c091-octavia-run\") pod \"octavia-api-6dc54675fc-lvsb4\" (UID: \"35e45c95-9f29-4266-828c-d8cc7e37c091\") " pod="openstack/octavia-api-6dc54675fc-lvsb4" Oct 14 13:40:59.813937 master-2 kubenswrapper[4762]: I1014 13:40:59.813704 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35e45c95-9f29-4266-828c-d8cc7e37c091-combined-ca-bundle\") pod \"octavia-api-6dc54675fc-lvsb4\" (UID: \"35e45c95-9f29-4266-828c-d8cc7e37c091\") " pod="openstack/octavia-api-6dc54675fc-lvsb4" Oct 14 13:40:59.813937 master-2 kubenswrapper[4762]: I1014 13:40:59.813738 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35e45c95-9f29-4266-828c-d8cc7e37c091-config-data\") pod \"octavia-api-6dc54675fc-lvsb4\" (UID: \"35e45c95-9f29-4266-828c-d8cc7e37c091\") " pod="openstack/octavia-api-6dc54675fc-lvsb4" Oct 14 13:40:59.813937 master-2 kubenswrapper[4762]: I1014 13:40:59.813801 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35e45c95-9f29-4266-828c-d8cc7e37c091-scripts\") pod \"octavia-api-6dc54675fc-lvsb4\" (UID: \"35e45c95-9f29-4266-828c-d8cc7e37c091\") " pod="openstack/octavia-api-6dc54675fc-lvsb4" Oct 14 13:40:59.813937 master-2 kubenswrapper[4762]: I1014 13:40:59.813827 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/35e45c95-9f29-4266-828c-d8cc7e37c091-ovndb-tls-certs\") pod \"octavia-api-6dc54675fc-lvsb4\" (UID: \"35e45c95-9f29-4266-828c-d8cc7e37c091\") " pod="openstack/octavia-api-6dc54675fc-lvsb4" Oct 14 13:40:59.817563 master-2 kubenswrapper[4762]: I1014 13:40:59.814188 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/35e45c95-9f29-4266-828c-d8cc7e37c091-config-data-merged\") pod \"octavia-api-6dc54675fc-lvsb4\" (UID: \"35e45c95-9f29-4266-828c-d8cc7e37c091\") " pod="openstack/octavia-api-6dc54675fc-lvsb4" Oct 14 13:40:59.817563 master-2 kubenswrapper[4762]: I1014 13:40:59.814284 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/35e45c95-9f29-4266-828c-d8cc7e37c091-octavia-run\") pod \"octavia-api-6dc54675fc-lvsb4\" (UID: \"35e45c95-9f29-4266-828c-d8cc7e37c091\") " pod="openstack/octavia-api-6dc54675fc-lvsb4" Oct 14 13:40:59.819868 master-2 kubenswrapper[4762]: I1014 13:40:59.818042 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35e45c95-9f29-4266-828c-d8cc7e37c091-config-data\") pod \"octavia-api-6dc54675fc-lvsb4\" (UID: \"35e45c95-9f29-4266-828c-d8cc7e37c091\") " pod="openstack/octavia-api-6dc54675fc-lvsb4" Oct 14 13:40:59.819868 master-2 kubenswrapper[4762]: I1014 13:40:59.819445 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35e45c95-9f29-4266-828c-d8cc7e37c091-scripts\") pod 
\"octavia-api-6dc54675fc-lvsb4\" (UID: \"35e45c95-9f29-4266-828c-d8cc7e37c091\") " pod="openstack/octavia-api-6dc54675fc-lvsb4" Oct 14 13:40:59.819868 master-2 kubenswrapper[4762]: I1014 13:40:59.819505 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35e45c95-9f29-4266-828c-d8cc7e37c091-combined-ca-bundle\") pod \"octavia-api-6dc54675fc-lvsb4\" (UID: \"35e45c95-9f29-4266-828c-d8cc7e37c091\") " pod="openstack/octavia-api-6dc54675fc-lvsb4" Oct 14 13:40:59.821688 master-2 kubenswrapper[4762]: I1014 13:40:59.820229 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/35e45c95-9f29-4266-828c-d8cc7e37c091-ovndb-tls-certs\") pod \"octavia-api-6dc54675fc-lvsb4\" (UID: \"35e45c95-9f29-4266-828c-d8cc7e37c091\") " pod="openstack/octavia-api-6dc54675fc-lvsb4" Oct 14 13:40:59.929831 master-2 kubenswrapper[4762]: I1014 13:40:59.929772 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-6dc54675fc-lvsb4" Oct 14 13:41:00.504976 master-2 kubenswrapper[4762]: I1014 13:41:00.504683 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-6dc54675fc-lvsb4"] Oct 14 13:41:00.508894 master-2 kubenswrapper[4762]: W1014 13:41:00.508841 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35e45c95_9f29_4266_828c_d8cc7e37c091.slice/crio-0bd6d59d366c9df347f00dc7e90ed0f830024777976083a6fb9cc746ce29f464 WatchSource:0}: Error finding container 0bd6d59d366c9df347f00dc7e90ed0f830024777976083a6fb9cc746ce29f464: Status 404 returned error can't find the container with id 0bd6d59d366c9df347f00dc7e90ed0f830024777976083a6fb9cc746ce29f464 Oct 14 13:41:00.767706 master-2 kubenswrapper[4762]: I1014 13:41:00.767659 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-6dc54675fc-lvsb4" event={"ID":"35e45c95-9f29-4266-828c-d8cc7e37c091","Type":"ContainerStarted","Data":"0bd6d59d366c9df347f00dc7e90ed0f830024777976083a6fb9cc746ce29f464"} Oct 14 13:41:09.875482 master-2 kubenswrapper[4762]: I1014 13:41:09.875397 4762 generic.go:334] "Generic (PLEG): container finished" podID="35e45c95-9f29-4266-828c-d8cc7e37c091" containerID="790eec6a1db467198a5c822789307b2b4f5a2ba59c964ef6f73b36113b29ed0c" exitCode=0 Oct 14 13:41:09.875482 master-2 kubenswrapper[4762]: I1014 13:41:09.875488 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-6dc54675fc-lvsb4" event={"ID":"35e45c95-9f29-4266-828c-d8cc7e37c091","Type":"ContainerDied","Data":"790eec6a1db467198a5c822789307b2b4f5a2ba59c964ef6f73b36113b29ed0c"} Oct 14 13:41:10.885026 master-2 kubenswrapper[4762]: I1014 13:41:10.884913 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-6dc54675fc-lvsb4" event={"ID":"35e45c95-9f29-4266-828c-d8cc7e37c091","Type":"ContainerStarted","Data":"fe814d5c498d9742cc2e790ae17ff7d69ca120a5169e2288d15aa5b40a4de78b"} Oct 14 13:41:10.885026 master-2 kubenswrapper[4762]: I1014 13:41:10.884965 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-6dc54675fc-lvsb4" event={"ID":"35e45c95-9f29-4266-828c-d8cc7e37c091","Type":"ContainerStarted","Data":"a819a745f0156344a5c1aceea8457b5a978bf7e08bde3afd077f1b7d88e7e061"} Oct 14 13:41:10.885543 master-2 kubenswrapper[4762]: I1014 13:41:10.885211 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/octavia-api-6dc54675fc-lvsb4" Oct 14 13:41:10.923278 master-2 kubenswrapper[4762]: I1014 13:41:10.923138 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-api-6dc54675fc-lvsb4" podStartSLOduration=2.99100793 podStartE2EDuration="11.923117628s" podCreationTimestamp="2025-10-14 13:40:59 +0000 UTC" firstStartedPulling="2025-10-14 13:41:00.510385666 +0000 UTC m=+2089.754544825" lastFinishedPulling="2025-10-14 13:41:09.442495344 +0000 UTC m=+2098.686654523" observedRunningTime="2025-10-14 13:41:10.92064441 +0000 UTC m=+2100.164803569" watchObservedRunningTime="2025-10-14 13:41:10.923117628 +0000 UTC m=+2100.167276787" Oct 14 13:41:11.894729 master-2 kubenswrapper[4762]: I1014 13:41:11.894637 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-6dc54675fc-lvsb4" Oct 14 13:41:13.060219 master-2 kubenswrapper[4762]: I1014 13:41:13.060096 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:41:13.061297 master-2 kubenswrapper[4762]: I1014 13:41:13.060644 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0" containerName="ceilometer-central-agent" containerID="cri-o://6e3d974e860c109c425547ff4853218cb06d7569f5a6c0da651dde5092900269" gracePeriod=30 Oct 14 13:41:13.061297 master-2 kubenswrapper[4762]: I1014 13:41:13.060766 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0" containerName="sg-core" containerID="cri-o://1c9a03b3d5eb6bb246dd2ebba4e1791b0d240244273ae772094859e45767a627" gracePeriod=30 Oct 14 13:41:13.061297 master-2 kubenswrapper[4762]: I1014 13:41:13.060828 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0" containerName="ceilometer-notification-agent" containerID="cri-o://acda5e89273912eabb9c1a909e49f0b8b3c272ecfc944377d5ff0358e68140e4" gracePeriod=30 Oct 14 13:41:13.061297 master-2 kubenswrapper[4762]: I1014 13:41:13.061123 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0" containerName="proxy-httpd" containerID="cri-o://ead9443357c1544acf6c79eb7feb6b791687ab5e9969c2ed98b431ec88f2792e" gracePeriod=30 Oct 14 13:41:13.924672 master-2 kubenswrapper[4762]: I1014 13:41:13.924576 4762 generic.go:334] "Generic (PLEG): container finished" podID="ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0" containerID="ead9443357c1544acf6c79eb7feb6b791687ab5e9969c2ed98b431ec88f2792e" exitCode=0 Oct 14 13:41:13.924672 master-2 kubenswrapper[4762]: I1014 13:41:13.924638 4762 generic.go:334] "Generic (PLEG): container finished" podID="ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0" containerID="1c9a03b3d5eb6bb246dd2ebba4e1791b0d240244273ae772094859e45767a627" exitCode=2 Oct 14 13:41:13.924672 master-2 kubenswrapper[4762]: I1014 13:41:13.924656 4762 generic.go:334] "Generic (PLEG): container finished" podID="ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0" containerID="6e3d974e860c109c425547ff4853218cb06d7569f5a6c0da651dde5092900269" exitCode=0 Oct 14 13:41:13.925019 master-2 kubenswrapper[4762]: I1014 13:41:13.924686 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0","Type":"ContainerDied","Data":"ead9443357c1544acf6c79eb7feb6b791687ab5e9969c2ed98b431ec88f2792e"} Oct 14 13:41:13.925019 master-2 kubenswrapper[4762]: I1014 13:41:13.924727 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0","Type":"ContainerDied","Data":"1c9a03b3d5eb6bb246dd2ebba4e1791b0d240244273ae772094859e45767a627"} Oct 14 13:41:13.925019 master-2 kubenswrapper[4762]: I1014 13:41:13.924748 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0","Type":"ContainerDied","Data":"6e3d974e860c109c425547ff4853218cb06d7569f5a6c0da651dde5092900269"} Oct 14 13:41:18.191977 master-2 kubenswrapper[4762]: I1014 13:41:18.191917 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-rsyslog-nl86n"] Oct 14 13:41:18.193852 master-2 kubenswrapper[4762]: I1014 13:41:18.193818 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-rsyslog-nl86n" Oct 14 13:41:18.198076 master-2 kubenswrapper[4762]: I1014 13:41:18.198042 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-config-data" Oct 14 13:41:18.200737 master-2 kubenswrapper[4762]: I1014 13:41:18.200710 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-scripts" Oct 14 13:41:18.201348 master-2 kubenswrapper[4762]: I1014 13:41:18.201304 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"octavia-hmport-map" Oct 14 13:41:18.208354 master-2 kubenswrapper[4762]: I1014 13:41:18.208288 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-nl86n"] Oct 14 13:41:18.260649 master-2 kubenswrapper[4762]: I1014 13:41:18.260571 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c022ada2-11d1-49cf-904c-293290d3f201-scripts\") pod \"octavia-rsyslog-nl86n\" (UID: \"c022ada2-11d1-49cf-904c-293290d3f201\") " pod="openstack/octavia-rsyslog-nl86n" Oct 14 13:41:18.260649 master-2 kubenswrapper[4762]: I1014 13:41:18.260647 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/c022ada2-11d1-49cf-904c-293290d3f201-config-data-merged\") pod \"octavia-rsyslog-nl86n\" (UID: \"c022ada2-11d1-49cf-904c-293290d3f201\") " pod="openstack/octavia-rsyslog-nl86n" Oct 14 13:41:18.261211 master-2 kubenswrapper[4762]: I1014 13:41:18.260805 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/c022ada2-11d1-49cf-904c-293290d3f201-hm-ports\") pod \"octavia-rsyslog-nl86n\" (UID: \"c022ada2-11d1-49cf-904c-293290d3f201\") " pod="openstack/octavia-rsyslog-nl86n" Oct 14 13:41:18.261211 master-2 kubenswrapper[4762]: I1014 13:41:18.260971 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c022ada2-11d1-49cf-904c-293290d3f201-config-data\") pod \"octavia-rsyslog-nl86n\" (UID: \"c022ada2-11d1-49cf-904c-293290d3f201\") " pod="openstack/octavia-rsyslog-nl86n" Oct 14 13:41:18.362822 master-2 kubenswrapper[4762]: I1014 13:41:18.362721 4762 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c022ada2-11d1-49cf-904c-293290d3f201-config-data\") pod \"octavia-rsyslog-nl86n\" (UID: \"c022ada2-11d1-49cf-904c-293290d3f201\") " pod="openstack/octavia-rsyslog-nl86n" Oct 14 13:41:18.363286 master-2 kubenswrapper[4762]: I1014 13:41:18.362864 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c022ada2-11d1-49cf-904c-293290d3f201-scripts\") pod \"octavia-rsyslog-nl86n\" (UID: \"c022ada2-11d1-49cf-904c-293290d3f201\") " pod="openstack/octavia-rsyslog-nl86n" Oct 14 13:41:18.363286 master-2 kubenswrapper[4762]: I1014 13:41:18.362960 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/c022ada2-11d1-49cf-904c-293290d3f201-config-data-merged\") pod \"octavia-rsyslog-nl86n\" (UID: \"c022ada2-11d1-49cf-904c-293290d3f201\") " pod="openstack/octavia-rsyslog-nl86n" Oct 14 13:41:18.363286 master-2 kubenswrapper[4762]: I1014 13:41:18.363078 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/c022ada2-11d1-49cf-904c-293290d3f201-hm-ports\") pod \"octavia-rsyslog-nl86n\" (UID: \"c022ada2-11d1-49cf-904c-293290d3f201\") " pod="openstack/octavia-rsyslog-nl86n" Oct 14 13:41:18.364083 master-2 kubenswrapper[4762]: I1014 13:41:18.364006 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/c022ada2-11d1-49cf-904c-293290d3f201-config-data-merged\") pod \"octavia-rsyslog-nl86n\" (UID: \"c022ada2-11d1-49cf-904c-293290d3f201\") " pod="openstack/octavia-rsyslog-nl86n" Oct 14 13:41:18.365479 master-2 kubenswrapper[4762]: I1014 13:41:18.365427 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/c022ada2-11d1-49cf-904c-293290d3f201-hm-ports\") pod \"octavia-rsyslog-nl86n\" (UID: \"c022ada2-11d1-49cf-904c-293290d3f201\") " pod="openstack/octavia-rsyslog-nl86n" Oct 14 13:41:18.368780 master-2 kubenswrapper[4762]: I1014 13:41:18.368710 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c022ada2-11d1-49cf-904c-293290d3f201-config-data\") pod \"octavia-rsyslog-nl86n\" (UID: \"c022ada2-11d1-49cf-904c-293290d3f201\") " pod="openstack/octavia-rsyslog-nl86n" Oct 14 13:41:18.369477 master-2 kubenswrapper[4762]: I1014 13:41:18.369428 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c022ada2-11d1-49cf-904c-293290d3f201-scripts\") pod \"octavia-rsyslog-nl86n\" (UID: \"c022ada2-11d1-49cf-904c-293290d3f201\") " pod="openstack/octavia-rsyslog-nl86n" Oct 14 13:41:18.528787 master-2 kubenswrapper[4762]: I1014 13:41:18.528613 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-rsyslog-nl86n" Oct 14 13:41:19.475179 master-2 kubenswrapper[4762]: I1014 13:41:19.466398 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-678599687f-hwhwd"] Oct 14 13:41:19.475179 master-2 kubenswrapper[4762]: I1014 13:41:19.468624 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-678599687f-hwhwd" Oct 14 13:41:19.475179 master-2 kubenswrapper[4762]: I1014 13:41:19.473343 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data" Oct 14 13:41:19.487193 master-2 kubenswrapper[4762]: I1014 13:41:19.484079 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-678599687f-hwhwd"] Oct 14 13:41:19.586959 master-2 kubenswrapper[4762]: I1014 13:41:19.586879 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/d42f472f-fadd-471b-be3d-cfda97a7e407-amphora-image\") pod \"octavia-image-upload-678599687f-hwhwd\" (UID: \"d42f472f-fadd-471b-be3d-cfda97a7e407\") " pod="openstack/octavia-image-upload-678599687f-hwhwd" Oct 14 13:41:19.587391 master-2 kubenswrapper[4762]: I1014 13:41:19.587304 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d42f472f-fadd-471b-be3d-cfda97a7e407-httpd-config\") pod \"octavia-image-upload-678599687f-hwhwd\" (UID: \"d42f472f-fadd-471b-be3d-cfda97a7e407\") " pod="openstack/octavia-image-upload-678599687f-hwhwd" Oct 14 13:41:19.689611 master-2 kubenswrapper[4762]: I1014 13:41:19.689540 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/d42f472f-fadd-471b-be3d-cfda97a7e407-amphora-image\") pod \"octavia-image-upload-678599687f-hwhwd\" (UID: \"d42f472f-fadd-471b-be3d-cfda97a7e407\") " pod="openstack/octavia-image-upload-678599687f-hwhwd" Oct 14 13:41:19.689769 master-2 kubenswrapper[4762]: I1014 13:41:19.689738 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d42f472f-fadd-471b-be3d-cfda97a7e407-httpd-config\") pod \"octavia-image-upload-678599687f-hwhwd\" (UID: \"d42f472f-fadd-471b-be3d-cfda97a7e407\") " pod="openstack/octavia-image-upload-678599687f-hwhwd" Oct 14 13:41:19.690097 master-2 kubenswrapper[4762]: I1014 13:41:19.690061 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/d42f472f-fadd-471b-be3d-cfda97a7e407-amphora-image\") pod \"octavia-image-upload-678599687f-hwhwd\" (UID: \"d42f472f-fadd-471b-be3d-cfda97a7e407\") " pod="openstack/octavia-image-upload-678599687f-hwhwd" Oct 14 13:41:19.693885 master-2 kubenswrapper[4762]: I1014 13:41:19.693294 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d42f472f-fadd-471b-be3d-cfda97a7e407-httpd-config\") pod \"octavia-image-upload-678599687f-hwhwd\" (UID: \"d42f472f-fadd-471b-be3d-cfda97a7e407\") " pod="openstack/octavia-image-upload-678599687f-hwhwd" Oct 14 13:41:19.795549 master-2 kubenswrapper[4762]: I1014 13:41:19.795466 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-678599687f-hwhwd" Oct 14 13:41:19.840597 master-2 kubenswrapper[4762]: I1014 13:41:19.840531 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-nl86n"] Oct 14 13:41:19.846821 master-2 kubenswrapper[4762]: W1014 13:41:19.846074 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc022ada2_11d1_49cf_904c_293290d3f201.slice/crio-f6b2c47ce4d70c6aaea8492d1221adab715ddbe5654498b809c27680e1fd780d WatchSource:0}: Error finding container f6b2c47ce4d70c6aaea8492d1221adab715ddbe5654498b809c27680e1fd780d: Status 404 returned error can't find the container with id f6b2c47ce4d70c6aaea8492d1221adab715ddbe5654498b809c27680e1fd780d Oct 14 13:41:20.001632 master-2 kubenswrapper[4762]: I1014 13:41:20.001571 4762 generic.go:334] "Generic (PLEG): container finished" podID="ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0" containerID="acda5e89273912eabb9c1a909e49f0b8b3c272ecfc944377d5ff0358e68140e4" exitCode=0 Oct 14 13:41:20.001859 master-2 kubenswrapper[4762]: I1014 13:41:20.001698 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0","Type":"ContainerDied","Data":"acda5e89273912eabb9c1a909e49f0b8b3c272ecfc944377d5ff0358e68140e4"} Oct 14 13:41:20.007785 master-2 kubenswrapper[4762]: I1014 13:41:20.007735 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-nl86n" event={"ID":"c022ada2-11d1-49cf-904c-293290d3f201","Type":"ContainerStarted","Data":"f6b2c47ce4d70c6aaea8492d1221adab715ddbe5654498b809c27680e1fd780d"} Oct 14 13:41:20.102387 master-2 kubenswrapper[4762]: I1014 13:41:20.102261 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:41:20.214027 master-2 kubenswrapper[4762]: I1014 13:41:20.213910 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0-log-httpd\") pod \"ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0\" (UID: \"ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0\") " Oct 14 13:41:20.214417 master-2 kubenswrapper[4762]: I1014 13:41:20.214071 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0-scripts\") pod \"ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0\" (UID: \"ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0\") " Oct 14 13:41:20.214417 master-2 kubenswrapper[4762]: I1014 13:41:20.214115 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0-run-httpd\") pod \"ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0\" (UID: \"ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0\") " Oct 14 13:41:20.214417 master-2 kubenswrapper[4762]: I1014 13:41:20.214192 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0-config-data\") pod \"ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0\" (UID: \"ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0\") " Oct 14 13:41:20.214417 master-2 kubenswrapper[4762]: I1014 13:41:20.214228 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0-combined-ca-bundle\") pod \"ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0\" (UID: \"ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0\") " Oct 14 13:41:20.214417 master-2 kubenswrapper[4762]: I1014 13:41:20.214262 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkvlk\" (UniqueName: \"kubernetes.io/projected/ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0-kube-api-access-mkvlk\") pod \"ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0\" (UID: \"ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0\") " Oct 14 13:41:20.214417 master-2 kubenswrapper[4762]: I1014 13:41:20.214335 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0-sg-core-conf-yaml\") pod \"ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0\" (UID: \"ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0\") " Oct 14 13:41:20.214417 master-2 kubenswrapper[4762]: I1014 13:41:20.214378 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0-ceilometer-tls-certs\") pod \"ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0\" (UID: \"ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0\") " Oct 14 13:41:20.215554 master-2 kubenswrapper[4762]: I1014 13:41:20.215295 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0" (UID: "ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:41:20.215554 master-2 kubenswrapper[4762]: I1014 13:41:20.215350 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0" (UID: "ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:41:20.218057 master-2 kubenswrapper[4762]: I1014 13:41:20.217979 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0-kube-api-access-mkvlk" (OuterVolumeSpecName: "kube-api-access-mkvlk") pod "ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0" (UID: "ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0"). InnerVolumeSpecName "kube-api-access-mkvlk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:41:20.218468 master-2 kubenswrapper[4762]: I1014 13:41:20.218426 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0-scripts" (OuterVolumeSpecName: "scripts") pod "ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0" (UID: "ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:41:20.249761 master-2 kubenswrapper[4762]: I1014 13:41:20.248340 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0" (UID: "ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:41:20.283581 master-2 kubenswrapper[4762]: I1014 13:41:20.283459 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0" (UID: "ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:41:20.294402 master-2 kubenswrapper[4762]: W1014 13:41:20.294343 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd42f472f_fadd_471b_be3d_cfda97a7e407.slice/crio-cee7e67f1de649810aca0a0f0ff3ee7acc72016b20c416d179925fe5750744aa WatchSource:0}: Error finding container cee7e67f1de649810aca0a0f0ff3ee7acc72016b20c416d179925fe5750744aa: Status 404 returned error can't find the container with id cee7e67f1de649810aca0a0f0ff3ee7acc72016b20c416d179925fe5750744aa Oct 14 13:41:20.296473 master-2 kubenswrapper[4762]: I1014 13:41:20.296423 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-678599687f-hwhwd"] Oct 14 13:41:20.303606 master-2 kubenswrapper[4762]: I1014 13:41:20.303463 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0-config-data" (OuterVolumeSpecName: "config-data") pod "ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0" (UID: "ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:41:20.311794 master-2 kubenswrapper[4762]: I1014 13:41:20.311746 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0" (UID: "ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:41:20.317110 master-2 kubenswrapper[4762]: I1014 13:41:20.317069 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0-config-data\") on node \"master-2\" DevicePath \"\"" Oct 14 13:41:20.317110 master-2 kubenswrapper[4762]: I1014 13:41:20.317100 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0-combined-ca-bundle\") on node \"master-2\" DevicePath \"\"" Oct 14 13:41:20.317110 master-2 kubenswrapper[4762]: I1014 13:41:20.317112 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkvlk\" (UniqueName: \"kubernetes.io/projected/ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0-kube-api-access-mkvlk\") on node \"master-2\" DevicePath \"\"" Oct 14 13:41:20.317281 master-2 kubenswrapper[4762]: I1014 13:41:20.317125 4762 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0-sg-core-conf-yaml\") on node \"master-2\" DevicePath \"\"" Oct 14 13:41:20.317281 master-2 kubenswrapper[4762]: I1014 13:41:20.317135 4762 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0-ceilometer-tls-certs\") on node \"master-2\" DevicePath \"\"" Oct 14 13:41:20.317281 master-2 kubenswrapper[4762]: I1014 13:41:20.317144 4762 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0-log-httpd\") on node \"master-2\" DevicePath \"\"" Oct 14 13:41:20.317281 master-2 kubenswrapper[4762]: I1014 13:41:20.317152 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0-scripts\") on node \"master-2\" DevicePath \"\"" Oct 14 13:41:20.317281 master-2 kubenswrapper[4762]: I1014 13:41:20.317174 4762 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0-run-httpd\") on node \"master-2\" DevicePath \"\"" Oct 14 13:41:20.493370 master-2 kubenswrapper[4762]: I1014 13:41:20.493275 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-api-c564bc7f-jdbq7"] Oct 14 13:41:20.493872 master-2 kubenswrapper[4762]: E1014 13:41:20.493721 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0" containerName="proxy-httpd" Oct 14 13:41:20.493872 master-2 kubenswrapper[4762]: I1014 13:41:20.493738 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0" containerName="proxy-httpd" Oct 14 13:41:20.493872 master-2 kubenswrapper[4762]: E1014 13:41:20.493748 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0" 
containerName="sg-core" Oct 14 13:41:20.493872 master-2 kubenswrapper[4762]: I1014 13:41:20.493754 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0" containerName="sg-core" Oct 14 13:41:20.493872 master-2 kubenswrapper[4762]: E1014 13:41:20.493771 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0" containerName="ceilometer-notification-agent" Oct 14 13:41:20.493872 master-2 kubenswrapper[4762]: I1014 13:41:20.493777 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0" containerName="ceilometer-notification-agent" Oct 14 13:41:20.493872 master-2 kubenswrapper[4762]: E1014 13:41:20.493793 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0" containerName="ceilometer-central-agent" Oct 14 13:41:20.493872 master-2 kubenswrapper[4762]: I1014 13:41:20.493799 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0" containerName="ceilometer-central-agent" Oct 14 13:41:20.494108 master-2 kubenswrapper[4762]: I1014 13:41:20.494046 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0" containerName="proxy-httpd" Oct 14 13:41:20.494108 master-2 kubenswrapper[4762]: I1014 13:41:20.494060 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0" containerName="sg-core" Oct 14 13:41:20.494108 master-2 kubenswrapper[4762]: I1014 13:41:20.494072 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0" containerName="ceilometer-central-agent" Oct 14 13:41:20.494108 master-2 kubenswrapper[4762]: I1014 13:41:20.494082 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0" containerName="ceilometer-notification-agent" Oct 14 13:41:20.495535 master-2 kubenswrapper[4762]: I1014 13:41:20.495512 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-api-c564bc7f-jdbq7" Oct 14 13:41:20.503337 master-2 kubenswrapper[4762]: I1014 13:41:20.503272 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-octavia-public-svc" Oct 14 13:41:20.503495 master-2 kubenswrapper[4762]: I1014 13:41:20.503433 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-octavia-internal-svc" Oct 14 13:41:20.515056 master-2 kubenswrapper[4762]: I1014 13:41:20.515010 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-c564bc7f-jdbq7"] Oct 14 13:41:20.622763 master-2 kubenswrapper[4762]: I1014 13:41:20.622647 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49d7a5b7-37c0-45b0-bcd2-a948bf2be235-config-data\") pod \"octavia-api-c564bc7f-jdbq7\" (UID: \"49d7a5b7-37c0-45b0-bcd2-a948bf2be235\") " pod="openstack/octavia-api-c564bc7f-jdbq7" Oct 14 13:41:20.622763 master-2 kubenswrapper[4762]: I1014 13:41:20.622712 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/49d7a5b7-37c0-45b0-bcd2-a948bf2be235-config-data-merged\") pod \"octavia-api-c564bc7f-jdbq7\" (UID: \"49d7a5b7-37c0-45b0-bcd2-a948bf2be235\") " pod="openstack/octavia-api-c564bc7f-jdbq7" Oct 14 13:41:20.623001 master-2 kubenswrapper[4762]: I1014 13:41:20.622762 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/49d7a5b7-37c0-45b0-bcd2-a948bf2be235-public-tls-certs\") pod \"octavia-api-c564bc7f-jdbq7\" (UID: \"49d7a5b7-37c0-45b0-bcd2-a948bf2be235\") " pod="openstack/octavia-api-c564bc7f-jdbq7" Oct 14 13:41:20.623001 master-2 kubenswrapper[4762]: I1014 13:41:20.622794 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49d7a5b7-37c0-45b0-bcd2-a948bf2be235-scripts\") pod \"octavia-api-c564bc7f-jdbq7\" (UID: \"49d7a5b7-37c0-45b0-bcd2-a948bf2be235\") " pod="openstack/octavia-api-c564bc7f-jdbq7" Oct 14 13:41:20.623087 master-2 kubenswrapper[4762]: I1014 13:41:20.623018 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/49d7a5b7-37c0-45b0-bcd2-a948bf2be235-octavia-run\") pod \"octavia-api-c564bc7f-jdbq7\" (UID: \"49d7a5b7-37c0-45b0-bcd2-a948bf2be235\") " pod="openstack/octavia-api-c564bc7f-jdbq7" Oct 14 13:41:20.623178 master-2 kubenswrapper[4762]: I1014 13:41:20.623139 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/49d7a5b7-37c0-45b0-bcd2-a948bf2be235-ovndb-tls-certs\") pod \"octavia-api-c564bc7f-jdbq7\" (UID: \"49d7a5b7-37c0-45b0-bcd2-a948bf2be235\") " pod="openstack/octavia-api-c564bc7f-jdbq7" Oct 14 13:41:20.623307 master-2 kubenswrapper[4762]: I1014 13:41:20.623279 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/49d7a5b7-37c0-45b0-bcd2-a948bf2be235-internal-tls-certs\") pod \"octavia-api-c564bc7f-jdbq7\" (UID: \"49d7a5b7-37c0-45b0-bcd2-a948bf2be235\") " pod="openstack/octavia-api-c564bc7f-jdbq7" Oct 14 13:41:20.623618 master-2 kubenswrapper[4762]: 
I1014 13:41:20.623505 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49d7a5b7-37c0-45b0-bcd2-a948bf2be235-combined-ca-bundle\") pod \"octavia-api-c564bc7f-jdbq7\" (UID: \"49d7a5b7-37c0-45b0-bcd2-a948bf2be235\") " pod="openstack/octavia-api-c564bc7f-jdbq7" Oct 14 13:41:20.727240 master-2 kubenswrapper[4762]: I1014 13:41:20.726539 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49d7a5b7-37c0-45b0-bcd2-a948bf2be235-config-data\") pod \"octavia-api-c564bc7f-jdbq7\" (UID: \"49d7a5b7-37c0-45b0-bcd2-a948bf2be235\") " pod="openstack/octavia-api-c564bc7f-jdbq7" Oct 14 13:41:20.727240 master-2 kubenswrapper[4762]: I1014 13:41:20.726640 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/49d7a5b7-37c0-45b0-bcd2-a948bf2be235-config-data-merged\") pod \"octavia-api-c564bc7f-jdbq7\" (UID: \"49d7a5b7-37c0-45b0-bcd2-a948bf2be235\") " pod="openstack/octavia-api-c564bc7f-jdbq7" Oct 14 13:41:20.727240 master-2 kubenswrapper[4762]: I1014 13:41:20.726726 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/49d7a5b7-37c0-45b0-bcd2-a948bf2be235-public-tls-certs\") pod \"octavia-api-c564bc7f-jdbq7\" (UID: \"49d7a5b7-37c0-45b0-bcd2-a948bf2be235\") " pod="openstack/octavia-api-c564bc7f-jdbq7" Oct 14 13:41:20.727240 master-2 kubenswrapper[4762]: I1014 13:41:20.726764 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49d7a5b7-37c0-45b0-bcd2-a948bf2be235-scripts\") pod \"octavia-api-c564bc7f-jdbq7\" (UID: \"49d7a5b7-37c0-45b0-bcd2-a948bf2be235\") " pod="openstack/octavia-api-c564bc7f-jdbq7" Oct 14 13:41:20.727240 master-2 kubenswrapper[4762]: I1014 13:41:20.726832 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/49d7a5b7-37c0-45b0-bcd2-a948bf2be235-octavia-run\") pod \"octavia-api-c564bc7f-jdbq7\" (UID: \"49d7a5b7-37c0-45b0-bcd2-a948bf2be235\") " pod="openstack/octavia-api-c564bc7f-jdbq7" Oct 14 13:41:20.727240 master-2 kubenswrapper[4762]: I1014 13:41:20.726885 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/49d7a5b7-37c0-45b0-bcd2-a948bf2be235-ovndb-tls-certs\") pod \"octavia-api-c564bc7f-jdbq7\" (UID: \"49d7a5b7-37c0-45b0-bcd2-a948bf2be235\") " pod="openstack/octavia-api-c564bc7f-jdbq7" Oct 14 13:41:20.727240 master-2 kubenswrapper[4762]: I1014 13:41:20.726948 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/49d7a5b7-37c0-45b0-bcd2-a948bf2be235-internal-tls-certs\") pod \"octavia-api-c564bc7f-jdbq7\" (UID: \"49d7a5b7-37c0-45b0-bcd2-a948bf2be235\") " pod="openstack/octavia-api-c564bc7f-jdbq7" Oct 14 13:41:20.727240 master-2 kubenswrapper[4762]: I1014 13:41:20.727005 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49d7a5b7-37c0-45b0-bcd2-a948bf2be235-combined-ca-bundle\") pod \"octavia-api-c564bc7f-jdbq7\" (UID: \"49d7a5b7-37c0-45b0-bcd2-a948bf2be235\") " pod="openstack/octavia-api-c564bc7f-jdbq7" Oct 14 13:41:20.727978 
master-2 kubenswrapper[4762]: I1014 13:41:20.727555 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/49d7a5b7-37c0-45b0-bcd2-a948bf2be235-config-data-merged\") pod \"octavia-api-c564bc7f-jdbq7\" (UID: \"49d7a5b7-37c0-45b0-bcd2-a948bf2be235\") " pod="openstack/octavia-api-c564bc7f-jdbq7" Oct 14 13:41:20.727978 master-2 kubenswrapper[4762]: I1014 13:41:20.727577 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/49d7a5b7-37c0-45b0-bcd2-a948bf2be235-octavia-run\") pod \"octavia-api-c564bc7f-jdbq7\" (UID: \"49d7a5b7-37c0-45b0-bcd2-a948bf2be235\") " pod="openstack/octavia-api-c564bc7f-jdbq7" Oct 14 13:41:20.730591 master-2 kubenswrapper[4762]: I1014 13:41:20.730544 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49d7a5b7-37c0-45b0-bcd2-a948bf2be235-scripts\") pod \"octavia-api-c564bc7f-jdbq7\" (UID: \"49d7a5b7-37c0-45b0-bcd2-a948bf2be235\") " pod="openstack/octavia-api-c564bc7f-jdbq7" Oct 14 13:41:20.731607 master-2 kubenswrapper[4762]: I1014 13:41:20.731549 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49d7a5b7-37c0-45b0-bcd2-a948bf2be235-config-data\") pod \"octavia-api-c564bc7f-jdbq7\" (UID: \"49d7a5b7-37c0-45b0-bcd2-a948bf2be235\") " pod="openstack/octavia-api-c564bc7f-jdbq7" Oct 14 13:41:20.731908 master-2 kubenswrapper[4762]: I1014 13:41:20.731826 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/49d7a5b7-37c0-45b0-bcd2-a948bf2be235-ovndb-tls-certs\") pod \"octavia-api-c564bc7f-jdbq7\" (UID: \"49d7a5b7-37c0-45b0-bcd2-a948bf2be235\") " pod="openstack/octavia-api-c564bc7f-jdbq7" Oct 14 13:41:20.733504 master-2 kubenswrapper[4762]: I1014 13:41:20.732117 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/49d7a5b7-37c0-45b0-bcd2-a948bf2be235-public-tls-certs\") pod \"octavia-api-c564bc7f-jdbq7\" (UID: \"49d7a5b7-37c0-45b0-bcd2-a948bf2be235\") " pod="openstack/octavia-api-c564bc7f-jdbq7" Oct 14 13:41:20.733504 master-2 kubenswrapper[4762]: I1014 13:41:20.732988 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/49d7a5b7-37c0-45b0-bcd2-a948bf2be235-internal-tls-certs\") pod \"octavia-api-c564bc7f-jdbq7\" (UID: \"49d7a5b7-37c0-45b0-bcd2-a948bf2be235\") " pod="openstack/octavia-api-c564bc7f-jdbq7" Oct 14 13:41:20.733504 master-2 kubenswrapper[4762]: I1014 13:41:20.733438 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49d7a5b7-37c0-45b0-bcd2-a948bf2be235-combined-ca-bundle\") pod \"octavia-api-c564bc7f-jdbq7\" (UID: \"49d7a5b7-37c0-45b0-bcd2-a948bf2be235\") " pod="openstack/octavia-api-c564bc7f-jdbq7" Oct 14 13:41:20.876138 master-2 kubenswrapper[4762]: I1014 13:41:20.875923 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-api-c564bc7f-jdbq7" Oct 14 13:41:21.046092 master-2 kubenswrapper[4762]: I1014 13:41:21.046000 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0","Type":"ContainerDied","Data":"a600589d404ff53bf340616ae1bab4f94d4f8b89429306510293a23eaa2305ab"} Oct 14 13:41:21.046092 master-2 kubenswrapper[4762]: I1014 13:41:21.046082 4762 scope.go:117] "RemoveContainer" containerID="ead9443357c1544acf6c79eb7feb6b791687ab5e9969c2ed98b431ec88f2792e" Oct 14 13:41:21.051061 master-2 kubenswrapper[4762]: I1014 13:41:21.046032 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:41:21.054677 master-2 kubenswrapper[4762]: I1014 13:41:21.054324 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-678599687f-hwhwd" event={"ID":"d42f472f-fadd-471b-be3d-cfda97a7e407","Type":"ContainerStarted","Data":"cee7e67f1de649810aca0a0f0ff3ee7acc72016b20c416d179925fe5750744aa"} Oct 14 13:41:21.120216 master-2 kubenswrapper[4762]: I1014 13:41:21.120141 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:41:21.144707 master-2 kubenswrapper[4762]: I1014 13:41:21.144581 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:41:21.189530 master-2 kubenswrapper[4762]: I1014 13:41:21.189491 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:41:21.193446 master-2 kubenswrapper[4762]: I1014 13:41:21.193417 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:41:21.196592 master-2 kubenswrapper[4762]: I1014 13:41:21.196547 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 14 13:41:21.196803 master-2 kubenswrapper[4762]: I1014 13:41:21.196778 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 14 13:41:21.196934 master-2 kubenswrapper[4762]: I1014 13:41:21.196911 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 14 13:41:21.244446 master-2 kubenswrapper[4762]: I1014 13:41:21.244403 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:41:21.341676 master-2 kubenswrapper[4762]: I1014 13:41:21.341593 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1bdc20eb-9391-4fac-84ee-243d246d6d4d-log-httpd\") pod \"ceilometer-0\" (UID: \"1bdc20eb-9391-4fac-84ee-243d246d6d4d\") " pod="openstack/ceilometer-0" Oct 14 13:41:21.341851 master-2 kubenswrapper[4762]: I1014 13:41:21.341711 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1bdc20eb-9391-4fac-84ee-243d246d6d4d-scripts\") pod \"ceilometer-0\" (UID: \"1bdc20eb-9391-4fac-84ee-243d246d6d4d\") " pod="openstack/ceilometer-0" Oct 14 13:41:21.342111 master-2 kubenswrapper[4762]: I1014 13:41:21.342062 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bdc20eb-9391-4fac-84ee-243d246d6d4d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"1bdc20eb-9391-4fac-84ee-243d246d6d4d\") " pod="openstack/ceilometer-0" Oct 14 13:41:21.342301 master-2 kubenswrapper[4762]: I1014 13:41:21.342270 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcxrc\" (UniqueName: \"kubernetes.io/projected/1bdc20eb-9391-4fac-84ee-243d246d6d4d-kube-api-access-zcxrc\") pod \"ceilometer-0\" (UID: \"1bdc20eb-9391-4fac-84ee-243d246d6d4d\") " pod="openstack/ceilometer-0" Oct 14 13:41:21.342477 master-2 kubenswrapper[4762]: I1014 13:41:21.342448 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bdc20eb-9391-4fac-84ee-243d246d6d4d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1bdc20eb-9391-4fac-84ee-243d246d6d4d\") " pod="openstack/ceilometer-0" Oct 14 13:41:21.342540 master-2 kubenswrapper[4762]: I1014 13:41:21.342491 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1bdc20eb-9391-4fac-84ee-243d246d6d4d-run-httpd\") pod \"ceilometer-0\" (UID: \"1bdc20eb-9391-4fac-84ee-243d246d6d4d\") " pod="openstack/ceilometer-0" Oct 14 13:41:21.342593 master-2 kubenswrapper[4762]: I1014 13:41:21.342569 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bdc20eb-9391-4fac-84ee-243d246d6d4d-config-data\") pod \"ceilometer-0\" (UID: \"1bdc20eb-9391-4fac-84ee-243d246d6d4d\") " pod="openstack/ceilometer-0" Oct 14 13:41:21.342730 master-2 kubenswrapper[4762]: I1014 13:41:21.342705 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1bdc20eb-9391-4fac-84ee-243d246d6d4d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1bdc20eb-9391-4fac-84ee-243d246d6d4d\") " pod="openstack/ceilometer-0" Oct 14 13:41:21.414246 master-2 kubenswrapper[4762]: I1014 13:41:21.413856 4762 scope.go:117] "RemoveContainer" containerID="1c9a03b3d5eb6bb246dd2ebba4e1791b0d240244273ae772094859e45767a627" Oct 14 13:41:21.445518 master-2 kubenswrapper[4762]: I1014 13:41:21.444489 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bdc20eb-9391-4fac-84ee-243d246d6d4d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1bdc20eb-9391-4fac-84ee-243d246d6d4d\") " pod="openstack/ceilometer-0" Oct 14 13:41:21.445518 master-2 kubenswrapper[4762]: I1014 13:41:21.444568 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1bdc20eb-9391-4fac-84ee-243d246d6d4d-run-httpd\") pod \"ceilometer-0\" (UID: \"1bdc20eb-9391-4fac-84ee-243d246d6d4d\") " pod="openstack/ceilometer-0" Oct 14 13:41:21.445518 master-2 kubenswrapper[4762]: I1014 13:41:21.444625 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bdc20eb-9391-4fac-84ee-243d246d6d4d-config-data\") pod \"ceilometer-0\" (UID: \"1bdc20eb-9391-4fac-84ee-243d246d6d4d\") " pod="openstack/ceilometer-0" Oct 14 13:41:21.445518 master-2 kubenswrapper[4762]: I1014 13:41:21.444652 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/1bdc20eb-9391-4fac-84ee-243d246d6d4d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1bdc20eb-9391-4fac-84ee-243d246d6d4d\") " pod="openstack/ceilometer-0" Oct 14 13:41:21.445518 master-2 kubenswrapper[4762]: I1014 13:41:21.444687 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1bdc20eb-9391-4fac-84ee-243d246d6d4d-log-httpd\") pod \"ceilometer-0\" (UID: \"1bdc20eb-9391-4fac-84ee-243d246d6d4d\") " pod="openstack/ceilometer-0" Oct 14 13:41:21.445518 master-2 kubenswrapper[4762]: I1014 13:41:21.444728 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1bdc20eb-9391-4fac-84ee-243d246d6d4d-scripts\") pod \"ceilometer-0\" (UID: \"1bdc20eb-9391-4fac-84ee-243d246d6d4d\") " pod="openstack/ceilometer-0" Oct 14 13:41:21.445518 master-2 kubenswrapper[4762]: I1014 13:41:21.444813 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bdc20eb-9391-4fac-84ee-243d246d6d4d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1bdc20eb-9391-4fac-84ee-243d246d6d4d\") " pod="openstack/ceilometer-0" Oct 14 13:41:21.445518 master-2 kubenswrapper[4762]: I1014 13:41:21.444875 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcxrc\" (UniqueName: \"kubernetes.io/projected/1bdc20eb-9391-4fac-84ee-243d246d6d4d-kube-api-access-zcxrc\") pod \"ceilometer-0\" (UID: \"1bdc20eb-9391-4fac-84ee-243d246d6d4d\") " pod="openstack/ceilometer-0" Oct 14 13:41:21.445824 master-2 kubenswrapper[4762]: I1014 13:41:21.445771 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1bdc20eb-9391-4fac-84ee-243d246d6d4d-run-httpd\") pod \"ceilometer-0\" (UID: \"1bdc20eb-9391-4fac-84ee-243d246d6d4d\") " pod="openstack/ceilometer-0" Oct 14 13:41:21.445985 master-2 kubenswrapper[4762]: I1014 13:41:21.445953 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1bdc20eb-9391-4fac-84ee-243d246d6d4d-log-httpd\") pod \"ceilometer-0\" (UID: \"1bdc20eb-9391-4fac-84ee-243d246d6d4d\") " pod="openstack/ceilometer-0" Oct 14 13:41:21.455719 master-2 kubenswrapper[4762]: I1014 13:41:21.454409 4762 scope.go:117] "RemoveContainer" containerID="acda5e89273912eabb9c1a909e49f0b8b3c272ecfc944377d5ff0358e68140e4" Oct 14 13:41:21.455719 master-2 kubenswrapper[4762]: I1014 13:41:21.455599 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bdc20eb-9391-4fac-84ee-243d246d6d4d-config-data\") pod \"ceilometer-0\" (UID: \"1bdc20eb-9391-4fac-84ee-243d246d6d4d\") " pod="openstack/ceilometer-0" Oct 14 13:41:21.455719 master-2 kubenswrapper[4762]: I1014 13:41:21.455701 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1bdc20eb-9391-4fac-84ee-243d246d6d4d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1bdc20eb-9391-4fac-84ee-243d246d6d4d\") " pod="openstack/ceilometer-0" Oct 14 13:41:21.456149 master-2 kubenswrapper[4762]: I1014 13:41:21.456078 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bdc20eb-9391-4fac-84ee-243d246d6d4d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"1bdc20eb-9391-4fac-84ee-243d246d6d4d\") " pod="openstack/ceilometer-0" Oct 14 13:41:21.456477 master-2 kubenswrapper[4762]: I1014 13:41:21.456449 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1bdc20eb-9391-4fac-84ee-243d246d6d4d-scripts\") pod \"ceilometer-0\" (UID: \"1bdc20eb-9391-4fac-84ee-243d246d6d4d\") " pod="openstack/ceilometer-0" Oct 14 13:41:21.460139 master-2 kubenswrapper[4762]: I1014 13:41:21.459876 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bdc20eb-9391-4fac-84ee-243d246d6d4d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1bdc20eb-9391-4fac-84ee-243d246d6d4d\") " pod="openstack/ceilometer-0" Oct 14 13:41:21.474958 master-2 kubenswrapper[4762]: I1014 13:41:21.474813 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcxrc\" (UniqueName: \"kubernetes.io/projected/1bdc20eb-9391-4fac-84ee-243d246d6d4d-kube-api-access-zcxrc\") pod \"ceilometer-0\" (UID: \"1bdc20eb-9391-4fac-84ee-243d246d6d4d\") " pod="openstack/ceilometer-0" Oct 14 13:41:21.496068 master-2 kubenswrapper[4762]: I1014 13:41:21.495818 4762 scope.go:117] "RemoveContainer" containerID="6e3d974e860c109c425547ff4853218cb06d7569f5a6c0da651dde5092900269" Oct 14 13:41:21.515078 master-2 kubenswrapper[4762]: I1014 13:41:21.515024 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:41:21.570533 master-2 kubenswrapper[4762]: I1014 13:41:21.570216 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0" path="/var/lib/kubelet/pods/ae5b4b0d-66c2-4e55-b23e-0f29af1a80b0/volumes" Oct 14 13:41:21.585318 master-2 kubenswrapper[4762]: I1014 13:41:21.585267 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-c564bc7f-jdbq7"] Oct 14 13:41:21.590183 master-2 kubenswrapper[4762]: W1014 13:41:21.589623 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49d7a5b7_37c0_45b0_bcd2_a948bf2be235.slice/crio-4c57695436f8008ac5de48b6a35efce09352b0097d2da8addd4fa92ff3586eac WatchSource:0}: Error finding container 4c57695436f8008ac5de48b6a35efce09352b0097d2da8addd4fa92ff3586eac: Status 404 returned error can't find the container with id 4c57695436f8008ac5de48b6a35efce09352b0097d2da8addd4fa92ff3586eac Oct 14 13:41:22.100103 master-2 kubenswrapper[4762]: I1014 13:41:22.100032 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:41:22.104948 master-2 kubenswrapper[4762]: I1014 13:41:22.104444 4762 generic.go:334] "Generic (PLEG): container finished" podID="49d7a5b7-37c0-45b0-bcd2-a948bf2be235" containerID="f3b0fa51ef5fc2e224afbb14967b6189390fcc4c451efb64f0fd715cc8dac9c7" exitCode=0 Oct 14 13:41:22.104948 master-2 kubenswrapper[4762]: I1014 13:41:22.104533 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-c564bc7f-jdbq7" event={"ID":"49d7a5b7-37c0-45b0-bcd2-a948bf2be235","Type":"ContainerDied","Data":"f3b0fa51ef5fc2e224afbb14967b6189390fcc4c451efb64f0fd715cc8dac9c7"} Oct 14 13:41:22.104948 master-2 kubenswrapper[4762]: I1014 13:41:22.104564 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-c564bc7f-jdbq7" 
event={"ID":"49d7a5b7-37c0-45b0-bcd2-a948bf2be235","Type":"ContainerStarted","Data":"4c57695436f8008ac5de48b6a35efce09352b0097d2da8addd4fa92ff3586eac"} Oct 14 13:41:22.130615 master-2 kubenswrapper[4762]: W1014 13:41:22.130562 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1bdc20eb_9391_4fac_84ee_243d246d6d4d.slice/crio-5187788177d3cd8858fb44918152b32b2d5e12d469d5f73258827d18475f97c3 WatchSource:0}: Error finding container 5187788177d3cd8858fb44918152b32b2d5e12d469d5f73258827d18475f97c3: Status 404 returned error can't find the container with id 5187788177d3cd8858fb44918152b32b2d5e12d469d5f73258827d18475f97c3 Oct 14 13:41:22.296745 master-2 kubenswrapper[4762]: I1014 13:41:22.296665 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:41:23.126864 master-2 kubenswrapper[4762]: I1014 13:41:23.126658 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1bdc20eb-9391-4fac-84ee-243d246d6d4d","Type":"ContainerStarted","Data":"5187788177d3cd8858fb44918152b32b2d5e12d469d5f73258827d18475f97c3"} Oct 14 13:41:23.130008 master-2 kubenswrapper[4762]: I1014 13:41:23.129953 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-c564bc7f-jdbq7" event={"ID":"49d7a5b7-37c0-45b0-bcd2-a948bf2be235","Type":"ContainerStarted","Data":"d37025e2c9b28363d00a2abcc8b57e079dd80701940836c1fa2aec73f1d9db3d"} Oct 14 13:41:23.130115 master-2 kubenswrapper[4762]: I1014 13:41:23.130027 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-c564bc7f-jdbq7" event={"ID":"49d7a5b7-37c0-45b0-bcd2-a948bf2be235","Type":"ContainerStarted","Data":"65f47fd63d00159756ce05d7841bd1c6fc5a8b09d78b3ed8727b10592d571e4b"} Oct 14 13:41:23.131323 master-2 kubenswrapper[4762]: I1014 13:41:23.131284 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-c564bc7f-jdbq7" Oct 14 13:41:23.131401 master-2 kubenswrapper[4762]: I1014 13:41:23.131346 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-c564bc7f-jdbq7" Oct 14 13:41:23.132929 master-2 kubenswrapper[4762]: I1014 13:41:23.132866 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-nl86n" event={"ID":"c022ada2-11d1-49cf-904c-293290d3f201","Type":"ContainerStarted","Data":"1c25b86cb4efc13dfebb0972beb8fef1ffaeb11180c188baaa1fecf04c3e1099"} Oct 14 13:41:23.227219 master-2 kubenswrapper[4762]: I1014 13:41:23.227080 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-api-c564bc7f-jdbq7" podStartSLOduration=3.227053415 podStartE2EDuration="3.227053415s" podCreationTimestamp="2025-10-14 13:41:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:41:23.201622949 +0000 UTC m=+2112.445782158" watchObservedRunningTime="2025-10-14 13:41:23.227053415 +0000 UTC m=+2112.471212574" Oct 14 13:41:23.293137 master-2 kubenswrapper[4762]: I1014 13:41:23.293102 4762 scope.go:117] "RemoveContainer" containerID="28f1251bdd543d9279723f209f9af929cafbfc888787b806329edf838eed9e3b" Oct 14 13:41:24.144132 master-2 kubenswrapper[4762]: I1014 13:41:24.144054 4762 generic.go:334] "Generic (PLEG): container finished" podID="c022ada2-11d1-49cf-904c-293290d3f201" containerID="1c25b86cb4efc13dfebb0972beb8fef1ffaeb11180c188baaa1fecf04c3e1099" 
exitCode=0 Oct 14 13:41:24.144711 master-2 kubenswrapper[4762]: I1014 13:41:24.144143 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-nl86n" event={"ID":"c022ada2-11d1-49cf-904c-293290d3f201","Type":"ContainerDied","Data":"1c25b86cb4efc13dfebb0972beb8fef1ffaeb11180c188baaa1fecf04c3e1099"} Oct 14 13:41:24.146871 master-2 kubenswrapper[4762]: I1014 13:41:24.146789 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1bdc20eb-9391-4fac-84ee-243d246d6d4d","Type":"ContainerStarted","Data":"f98bb0911443e43f33ea9e1cbf5a914001044976ddc0ca3d5ed58798906fc422"} Oct 14 13:41:25.255196 master-2 kubenswrapper[4762]: I1014 13:41:25.247723 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1bdc20eb-9391-4fac-84ee-243d246d6d4d","Type":"ContainerStarted","Data":"d4ed29c8b457a897a3935eaef1e6558797987517e3b453962f50ecc4e7e79365"} Oct 14 13:41:26.261299 master-2 kubenswrapper[4762]: I1014 13:41:26.260989 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1bdc20eb-9391-4fac-84ee-243d246d6d4d","Type":"ContainerStarted","Data":"c56cb253ccf46b6cee587e34ce87b8c4d33c5e4e643c47bcb178c0cf35e0a373"} Oct 14 13:41:26.264388 master-2 kubenswrapper[4762]: I1014 13:41:26.264342 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-nl86n" event={"ID":"c022ada2-11d1-49cf-904c-293290d3f201","Type":"ContainerStarted","Data":"e1de9b602f34ea6f0f73a6bb52d3c569b98ffddaa355a78c998bb7b734afc24e"} Oct 14 13:41:26.264606 master-2 kubenswrapper[4762]: I1014 13:41:26.264570 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-rsyslog-nl86n" Oct 14 13:41:26.297210 master-2 kubenswrapper[4762]: I1014 13:41:26.297045 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-rsyslog-nl86n" podStartSLOduration=2.377143537 podStartE2EDuration="8.297025781s" podCreationTimestamp="2025-10-14 13:41:18 +0000 UTC" firstStartedPulling="2025-10-14 13:41:19.850167826 +0000 UTC m=+2109.094326985" lastFinishedPulling="2025-10-14 13:41:25.77005007 +0000 UTC m=+2115.014209229" observedRunningTime="2025-10-14 13:41:26.294329725 +0000 UTC m=+2115.538488904" watchObservedRunningTime="2025-10-14 13:41:26.297025781 +0000 UTC m=+2115.541184940" Oct 14 13:41:27.895406 master-2 kubenswrapper[4762]: I1014 13:41:27.895334 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-sync-r24zb"] Oct 14 13:41:27.898812 master-2 kubenswrapper[4762]: I1014 13:41:27.898789 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-r24zb" Oct 14 13:41:27.921840 master-2 kubenswrapper[4762]: I1014 13:41:27.921766 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-scripts" Oct 14 13:41:28.021889 master-2 kubenswrapper[4762]: I1014 13:41:28.021777 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/55e89c7d-f953-4fa3-95af-2abba3a06439-config-data-merged\") pod \"octavia-db-sync-r24zb\" (UID: \"55e89c7d-f953-4fa3-95af-2abba3a06439\") " pod="openstack/octavia-db-sync-r24zb" Oct 14 13:41:28.022471 master-2 kubenswrapper[4762]: I1014 13:41:28.022444 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55e89c7d-f953-4fa3-95af-2abba3a06439-scripts\") pod \"octavia-db-sync-r24zb\" (UID: \"55e89c7d-f953-4fa3-95af-2abba3a06439\") " pod="openstack/octavia-db-sync-r24zb" Oct 14 13:41:28.022632 master-2 kubenswrapper[4762]: I1014 13:41:28.022617 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55e89c7d-f953-4fa3-95af-2abba3a06439-config-data\") pod \"octavia-db-sync-r24zb\" (UID: \"55e89c7d-f953-4fa3-95af-2abba3a06439\") " pod="openstack/octavia-db-sync-r24zb" Oct 14 13:41:28.022849 master-2 kubenswrapper[4762]: I1014 13:41:28.022830 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55e89c7d-f953-4fa3-95af-2abba3a06439-combined-ca-bundle\") pod \"octavia-db-sync-r24zb\" (UID: \"55e89c7d-f953-4fa3-95af-2abba3a06439\") " pod="openstack/octavia-db-sync-r24zb" Oct 14 13:41:28.125596 master-2 kubenswrapper[4762]: I1014 13:41:28.125503 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/55e89c7d-f953-4fa3-95af-2abba3a06439-config-data-merged\") pod \"octavia-db-sync-r24zb\" (UID: \"55e89c7d-f953-4fa3-95af-2abba3a06439\") " pod="openstack/octavia-db-sync-r24zb" Oct 14 13:41:28.125596 master-2 kubenswrapper[4762]: I1014 13:41:28.125606 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55e89c7d-f953-4fa3-95af-2abba3a06439-scripts\") pod \"octavia-db-sync-r24zb\" (UID: \"55e89c7d-f953-4fa3-95af-2abba3a06439\") " pod="openstack/octavia-db-sync-r24zb" Oct 14 13:41:28.126044 master-2 kubenswrapper[4762]: I1014 13:41:28.125646 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55e89c7d-f953-4fa3-95af-2abba3a06439-config-data\") pod \"octavia-db-sync-r24zb\" (UID: \"55e89c7d-f953-4fa3-95af-2abba3a06439\") " pod="openstack/octavia-db-sync-r24zb" Oct 14 13:41:28.126044 master-2 kubenswrapper[4762]: I1014 13:41:28.125711 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55e89c7d-f953-4fa3-95af-2abba3a06439-combined-ca-bundle\") pod \"octavia-db-sync-r24zb\" (UID: \"55e89c7d-f953-4fa3-95af-2abba3a06439\") " pod="openstack/octavia-db-sync-r24zb" Oct 14 13:41:28.126372 master-2 kubenswrapper[4762]: I1014 13:41:28.126257 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" 
(UniqueName: \"kubernetes.io/empty-dir/55e89c7d-f953-4fa3-95af-2abba3a06439-config-data-merged\") pod \"octavia-db-sync-r24zb\" (UID: \"55e89c7d-f953-4fa3-95af-2abba3a06439\") " pod="openstack/octavia-db-sync-r24zb" Oct 14 13:41:28.129728 master-2 kubenswrapper[4762]: I1014 13:41:28.129684 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55e89c7d-f953-4fa3-95af-2abba3a06439-scripts\") pod \"octavia-db-sync-r24zb\" (UID: \"55e89c7d-f953-4fa3-95af-2abba3a06439\") " pod="openstack/octavia-db-sync-r24zb" Oct 14 13:41:28.131189 master-2 kubenswrapper[4762]: I1014 13:41:28.131107 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55e89c7d-f953-4fa3-95af-2abba3a06439-config-data\") pod \"octavia-db-sync-r24zb\" (UID: \"55e89c7d-f953-4fa3-95af-2abba3a06439\") " pod="openstack/octavia-db-sync-r24zb" Oct 14 13:41:28.132136 master-2 kubenswrapper[4762]: I1014 13:41:28.132094 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55e89c7d-f953-4fa3-95af-2abba3a06439-combined-ca-bundle\") pod \"octavia-db-sync-r24zb\" (UID: \"55e89c7d-f953-4fa3-95af-2abba3a06439\") " pod="openstack/octavia-db-sync-r24zb" Oct 14 13:41:28.225196 master-2 kubenswrapper[4762]: I1014 13:41:28.225120 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-r24zb" Oct 14 13:41:28.333507 master-2 kubenswrapper[4762]: I1014 13:41:28.333424 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-r24zb"] Oct 14 13:41:28.897132 master-2 kubenswrapper[4762]: I1014 13:41:28.891193 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-r24zb"] Oct 14 13:41:28.897132 master-2 kubenswrapper[4762]: W1014 13:41:28.894227 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55e89c7d_f953_4fa3_95af_2abba3a06439.slice/crio-5d18e674d039ac8e6359d05c3a9a2d1e11e48e022f596f3805a1e6a50fd22933 WatchSource:0}: Error finding container 5d18e674d039ac8e6359d05c3a9a2d1e11e48e022f596f3805a1e6a50fd22933: Status 404 returned error can't find the container with id 5d18e674d039ac8e6359d05c3a9a2d1e11e48e022f596f3805a1e6a50fd22933 Oct 14 13:41:29.311225 master-2 kubenswrapper[4762]: I1014 13:41:29.310837 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1bdc20eb-9391-4fac-84ee-243d246d6d4d","Type":"ContainerStarted","Data":"940d3697b0655dc74e009502872049035daacf4467864fdb411dbdc015b73ed2"} Oct 14 13:41:29.311225 master-2 kubenswrapper[4762]: I1014 13:41:29.310869 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1bdc20eb-9391-4fac-84ee-243d246d6d4d" containerName="ceilometer-central-agent" containerID="cri-o://f98bb0911443e43f33ea9e1cbf5a914001044976ddc0ca3d5ed58798906fc422" gracePeriod=30 Oct 14 13:41:29.311225 master-2 kubenswrapper[4762]: I1014 13:41:29.310971 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1bdc20eb-9391-4fac-84ee-243d246d6d4d" containerName="ceilometer-notification-agent" containerID="cri-o://d4ed29c8b457a897a3935eaef1e6558797987517e3b453962f50ecc4e7e79365" gracePeriod=30 Oct 14 13:41:29.311225 master-2 kubenswrapper[4762]: I1014 13:41:29.310983 4762 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1bdc20eb-9391-4fac-84ee-243d246d6d4d" containerName="sg-core" containerID="cri-o://c56cb253ccf46b6cee587e34ce87b8c4d33c5e4e643c47bcb178c0cf35e0a373" gracePeriod=30 Oct 14 13:41:29.311225 master-2 kubenswrapper[4762]: I1014 13:41:29.310931 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 14 13:41:29.311225 master-2 kubenswrapper[4762]: I1014 13:41:29.310934 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1bdc20eb-9391-4fac-84ee-243d246d6d4d" containerName="proxy-httpd" containerID="cri-o://940d3697b0655dc74e009502872049035daacf4467864fdb411dbdc015b73ed2" gracePeriod=30 Oct 14 13:41:29.314261 master-2 kubenswrapper[4762]: I1014 13:41:29.314228 4762 generic.go:334] "Generic (PLEG): container finished" podID="55e89c7d-f953-4fa3-95af-2abba3a06439" containerID="abde3099d7ce6ea462d0be2ae870df7f40c0a6f23f58bc92df4c9d971cc6bd53" exitCode=0 Oct 14 13:41:29.314327 master-2 kubenswrapper[4762]: I1014 13:41:29.314270 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-r24zb" event={"ID":"55e89c7d-f953-4fa3-95af-2abba3a06439","Type":"ContainerDied","Data":"abde3099d7ce6ea462d0be2ae870df7f40c0a6f23f58bc92df4c9d971cc6bd53"} Oct 14 13:41:29.314327 master-2 kubenswrapper[4762]: I1014 13:41:29.314308 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-r24zb" event={"ID":"55e89c7d-f953-4fa3-95af-2abba3a06439","Type":"ContainerStarted","Data":"5d18e674d039ac8e6359d05c3a9a2d1e11e48e022f596f3805a1e6a50fd22933"} Oct 14 13:41:29.352426 master-2 kubenswrapper[4762]: I1014 13:41:29.352321 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.198661255 podStartE2EDuration="8.35226392s" podCreationTimestamp="2025-10-14 13:41:21 +0000 UTC" firstStartedPulling="2025-10-14 13:41:22.269058685 +0000 UTC m=+2111.513217854" lastFinishedPulling="2025-10-14 13:41:28.42266135 +0000 UTC m=+2117.666820519" observedRunningTime="2025-10-14 13:41:29.340436895 +0000 UTC m=+2118.584596054" watchObservedRunningTime="2025-10-14 13:41:29.35226392 +0000 UTC m=+2118.596423079" Oct 14 13:41:30.338523 master-2 kubenswrapper[4762]: I1014 13:41:30.338450 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:41:30.343242 master-2 kubenswrapper[4762]: I1014 13:41:30.343180 4762 generic.go:334] "Generic (PLEG): container finished" podID="1bdc20eb-9391-4fac-84ee-243d246d6d4d" containerID="940d3697b0655dc74e009502872049035daacf4467864fdb411dbdc015b73ed2" exitCode=0 Oct 14 13:41:30.343377 master-2 kubenswrapper[4762]: I1014 13:41:30.343267 4762 generic.go:334] "Generic (PLEG): container finished" podID="1bdc20eb-9391-4fac-84ee-243d246d6d4d" containerID="c56cb253ccf46b6cee587e34ce87b8c4d33c5e4e643c47bcb178c0cf35e0a373" exitCode=2 Oct 14 13:41:30.343377 master-2 kubenswrapper[4762]: I1014 13:41:30.343284 4762 generic.go:334] "Generic (PLEG): container finished" podID="1bdc20eb-9391-4fac-84ee-243d246d6d4d" containerID="d4ed29c8b457a897a3935eaef1e6558797987517e3b453962f50ecc4e7e79365" exitCode=0 Oct 14 13:41:30.343377 master-2 kubenswrapper[4762]: I1014 13:41:30.343297 4762 generic.go:334] "Generic (PLEG): container finished" podID="1bdc20eb-9391-4fac-84ee-243d246d6d4d" containerID="f98bb0911443e43f33ea9e1cbf5a914001044976ddc0ca3d5ed58798906fc422" exitCode=0 Oct 14 13:41:30.343556 master-2 kubenswrapper[4762]: I1014 13:41:30.343469 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1bdc20eb-9391-4fac-84ee-243d246d6d4d","Type":"ContainerDied","Data":"940d3697b0655dc74e009502872049035daacf4467864fdb411dbdc015b73ed2"} Oct 14 13:41:30.343655 master-2 kubenswrapper[4762]: I1014 13:41:30.343556 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1bdc20eb-9391-4fac-84ee-243d246d6d4d","Type":"ContainerDied","Data":"c56cb253ccf46b6cee587e34ce87b8c4d33c5e4e643c47bcb178c0cf35e0a373"} Oct 14 13:41:30.343655 master-2 kubenswrapper[4762]: I1014 13:41:30.343582 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1bdc20eb-9391-4fac-84ee-243d246d6d4d","Type":"ContainerDied","Data":"d4ed29c8b457a897a3935eaef1e6558797987517e3b453962f50ecc4e7e79365"} Oct 14 13:41:30.343655 master-2 kubenswrapper[4762]: I1014 13:41:30.343601 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1bdc20eb-9391-4fac-84ee-243d246d6d4d","Type":"ContainerDied","Data":"f98bb0911443e43f33ea9e1cbf5a914001044976ddc0ca3d5ed58798906fc422"} Oct 14 13:41:30.343655 master-2 kubenswrapper[4762]: I1014 13:41:30.343659 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1bdc20eb-9391-4fac-84ee-243d246d6d4d","Type":"ContainerDied","Data":"5187788177d3cd8858fb44918152b32b2d5e12d469d5f73258827d18475f97c3"} Oct 14 13:41:30.344998 master-2 kubenswrapper[4762]: I1014 13:41:30.344851 4762 scope.go:117] "RemoveContainer" containerID="940d3697b0655dc74e009502872049035daacf4467864fdb411dbdc015b73ed2" Oct 14 13:41:30.351425 master-2 kubenswrapper[4762]: I1014 13:41:30.351206 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-r24zb" event={"ID":"55e89c7d-f953-4fa3-95af-2abba3a06439","Type":"ContainerStarted","Data":"a74c81431913c6f9f5e45c4fb724bdd236fda3473562ecb486a8725ea8ac0e23"} Oct 14 13:41:30.373593 master-2 kubenswrapper[4762]: I1014 13:41:30.373547 4762 scope.go:117] "RemoveContainer" containerID="c56cb253ccf46b6cee587e34ce87b8c4d33c5e4e643c47bcb178c0cf35e0a373" Oct 14 13:41:30.401105 master-2 kubenswrapper[4762]: I1014 13:41:30.400930 4762 scope.go:117] "RemoveContainer" 
containerID="d4ed29c8b457a897a3935eaef1e6558797987517e3b453962f50ecc4e7e79365" Oct 14 13:41:30.435444 master-2 kubenswrapper[4762]: I1014 13:41:30.435402 4762 scope.go:117] "RemoveContainer" containerID="f98bb0911443e43f33ea9e1cbf5a914001044976ddc0ca3d5ed58798906fc422" Oct 14 13:41:30.458537 master-2 kubenswrapper[4762]: I1014 13:41:30.458483 4762 scope.go:117] "RemoveContainer" containerID="940d3697b0655dc74e009502872049035daacf4467864fdb411dbdc015b73ed2" Oct 14 13:41:30.459089 master-2 kubenswrapper[4762]: E1014 13:41:30.459034 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"940d3697b0655dc74e009502872049035daacf4467864fdb411dbdc015b73ed2\": container with ID starting with 940d3697b0655dc74e009502872049035daacf4467864fdb411dbdc015b73ed2 not found: ID does not exist" containerID="940d3697b0655dc74e009502872049035daacf4467864fdb411dbdc015b73ed2" Oct 14 13:41:30.459162 master-2 kubenswrapper[4762]: I1014 13:41:30.459103 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"940d3697b0655dc74e009502872049035daacf4467864fdb411dbdc015b73ed2"} err="failed to get container status \"940d3697b0655dc74e009502872049035daacf4467864fdb411dbdc015b73ed2\": rpc error: code = NotFound desc = could not find container \"940d3697b0655dc74e009502872049035daacf4467864fdb411dbdc015b73ed2\": container with ID starting with 940d3697b0655dc74e009502872049035daacf4467864fdb411dbdc015b73ed2 not found: ID does not exist" Oct 14 13:41:30.459254 master-2 kubenswrapper[4762]: I1014 13:41:30.459141 4762 scope.go:117] "RemoveContainer" containerID="c56cb253ccf46b6cee587e34ce87b8c4d33c5e4e643c47bcb178c0cf35e0a373" Oct 14 13:41:30.459720 master-2 kubenswrapper[4762]: E1014 13:41:30.459674 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c56cb253ccf46b6cee587e34ce87b8c4d33c5e4e643c47bcb178c0cf35e0a373\": container with ID starting with c56cb253ccf46b6cee587e34ce87b8c4d33c5e4e643c47bcb178c0cf35e0a373 not found: ID does not exist" containerID="c56cb253ccf46b6cee587e34ce87b8c4d33c5e4e643c47bcb178c0cf35e0a373" Oct 14 13:41:30.459804 master-2 kubenswrapper[4762]: I1014 13:41:30.459719 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c56cb253ccf46b6cee587e34ce87b8c4d33c5e4e643c47bcb178c0cf35e0a373"} err="failed to get container status \"c56cb253ccf46b6cee587e34ce87b8c4d33c5e4e643c47bcb178c0cf35e0a373\": rpc error: code = NotFound desc = could not find container \"c56cb253ccf46b6cee587e34ce87b8c4d33c5e4e643c47bcb178c0cf35e0a373\": container with ID starting with c56cb253ccf46b6cee587e34ce87b8c4d33c5e4e643c47bcb178c0cf35e0a373 not found: ID does not exist" Oct 14 13:41:30.459804 master-2 kubenswrapper[4762]: I1014 13:41:30.459747 4762 scope.go:117] "RemoveContainer" containerID="d4ed29c8b457a897a3935eaef1e6558797987517e3b453962f50ecc4e7e79365" Oct 14 13:41:30.460927 master-2 kubenswrapper[4762]: E1014 13:41:30.460867 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4ed29c8b457a897a3935eaef1e6558797987517e3b453962f50ecc4e7e79365\": container with ID starting with d4ed29c8b457a897a3935eaef1e6558797987517e3b453962f50ecc4e7e79365 not found: ID does not exist" containerID="d4ed29c8b457a897a3935eaef1e6558797987517e3b453962f50ecc4e7e79365" Oct 14 13:41:30.460927 master-2 kubenswrapper[4762]: I1014 
13:41:30.460902 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4ed29c8b457a897a3935eaef1e6558797987517e3b453962f50ecc4e7e79365"} err="failed to get container status \"d4ed29c8b457a897a3935eaef1e6558797987517e3b453962f50ecc4e7e79365\": rpc error: code = NotFound desc = could not find container \"d4ed29c8b457a897a3935eaef1e6558797987517e3b453962f50ecc4e7e79365\": container with ID starting with d4ed29c8b457a897a3935eaef1e6558797987517e3b453962f50ecc4e7e79365 not found: ID does not exist" Oct 14 13:41:30.460927 master-2 kubenswrapper[4762]: I1014 13:41:30.460923 4762 scope.go:117] "RemoveContainer" containerID="f98bb0911443e43f33ea9e1cbf5a914001044976ddc0ca3d5ed58798906fc422" Oct 14 13:41:30.461322 master-2 kubenswrapper[4762]: E1014 13:41:30.461281 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f98bb0911443e43f33ea9e1cbf5a914001044976ddc0ca3d5ed58798906fc422\": container with ID starting with f98bb0911443e43f33ea9e1cbf5a914001044976ddc0ca3d5ed58798906fc422 not found: ID does not exist" containerID="f98bb0911443e43f33ea9e1cbf5a914001044976ddc0ca3d5ed58798906fc422" Oct 14 13:41:30.461380 master-2 kubenswrapper[4762]: I1014 13:41:30.461338 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f98bb0911443e43f33ea9e1cbf5a914001044976ddc0ca3d5ed58798906fc422"} err="failed to get container status \"f98bb0911443e43f33ea9e1cbf5a914001044976ddc0ca3d5ed58798906fc422\": rpc error: code = NotFound desc = could not find container \"f98bb0911443e43f33ea9e1cbf5a914001044976ddc0ca3d5ed58798906fc422\": container with ID starting with f98bb0911443e43f33ea9e1cbf5a914001044976ddc0ca3d5ed58798906fc422 not found: ID does not exist" Oct 14 13:41:30.461380 master-2 kubenswrapper[4762]: I1014 13:41:30.461374 4762 scope.go:117] "RemoveContainer" containerID="940d3697b0655dc74e009502872049035daacf4467864fdb411dbdc015b73ed2" Oct 14 13:41:30.461992 master-2 kubenswrapper[4762]: I1014 13:41:30.461948 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"940d3697b0655dc74e009502872049035daacf4467864fdb411dbdc015b73ed2"} err="failed to get container status \"940d3697b0655dc74e009502872049035daacf4467864fdb411dbdc015b73ed2\": rpc error: code = NotFound desc = could not find container \"940d3697b0655dc74e009502872049035daacf4467864fdb411dbdc015b73ed2\": container with ID starting with 940d3697b0655dc74e009502872049035daacf4467864fdb411dbdc015b73ed2 not found: ID does not exist" Oct 14 13:41:30.461992 master-2 kubenswrapper[4762]: I1014 13:41:30.461985 4762 scope.go:117] "RemoveContainer" containerID="c56cb253ccf46b6cee587e34ce87b8c4d33c5e4e643c47bcb178c0cf35e0a373" Oct 14 13:41:30.462307 master-2 kubenswrapper[4762]: I1014 13:41:30.462266 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c56cb253ccf46b6cee587e34ce87b8c4d33c5e4e643c47bcb178c0cf35e0a373"} err="failed to get container status \"c56cb253ccf46b6cee587e34ce87b8c4d33c5e4e643c47bcb178c0cf35e0a373\": rpc error: code = NotFound desc = could not find container \"c56cb253ccf46b6cee587e34ce87b8c4d33c5e4e643c47bcb178c0cf35e0a373\": container with ID starting with c56cb253ccf46b6cee587e34ce87b8c4d33c5e4e643c47bcb178c0cf35e0a373 not found: ID does not exist" Oct 14 13:41:30.462307 master-2 kubenswrapper[4762]: I1014 13:41:30.462286 4762 scope.go:117] "RemoveContainer" 
containerID="d4ed29c8b457a897a3935eaef1e6558797987517e3b453962f50ecc4e7e79365" Oct 14 13:41:30.462754 master-2 kubenswrapper[4762]: I1014 13:41:30.462676 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4ed29c8b457a897a3935eaef1e6558797987517e3b453962f50ecc4e7e79365"} err="failed to get container status \"d4ed29c8b457a897a3935eaef1e6558797987517e3b453962f50ecc4e7e79365\": rpc error: code = NotFound desc = could not find container \"d4ed29c8b457a897a3935eaef1e6558797987517e3b453962f50ecc4e7e79365\": container with ID starting with d4ed29c8b457a897a3935eaef1e6558797987517e3b453962f50ecc4e7e79365 not found: ID does not exist" Oct 14 13:41:30.462754 master-2 kubenswrapper[4762]: I1014 13:41:30.462698 4762 scope.go:117] "RemoveContainer" containerID="f98bb0911443e43f33ea9e1cbf5a914001044976ddc0ca3d5ed58798906fc422" Oct 14 13:41:30.463777 master-2 kubenswrapper[4762]: I1014 13:41:30.463349 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f98bb0911443e43f33ea9e1cbf5a914001044976ddc0ca3d5ed58798906fc422"} err="failed to get container status \"f98bb0911443e43f33ea9e1cbf5a914001044976ddc0ca3d5ed58798906fc422\": rpc error: code = NotFound desc = could not find container \"f98bb0911443e43f33ea9e1cbf5a914001044976ddc0ca3d5ed58798906fc422\": container with ID starting with f98bb0911443e43f33ea9e1cbf5a914001044976ddc0ca3d5ed58798906fc422 not found: ID does not exist" Oct 14 13:41:30.463777 master-2 kubenswrapper[4762]: I1014 13:41:30.463496 4762 scope.go:117] "RemoveContainer" containerID="940d3697b0655dc74e009502872049035daacf4467864fdb411dbdc015b73ed2" Oct 14 13:41:30.464069 master-2 kubenswrapper[4762]: I1014 13:41:30.464027 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"940d3697b0655dc74e009502872049035daacf4467864fdb411dbdc015b73ed2"} err="failed to get container status \"940d3697b0655dc74e009502872049035daacf4467864fdb411dbdc015b73ed2\": rpc error: code = NotFound desc = could not find container \"940d3697b0655dc74e009502872049035daacf4467864fdb411dbdc015b73ed2\": container with ID starting with 940d3697b0655dc74e009502872049035daacf4467864fdb411dbdc015b73ed2 not found: ID does not exist" Oct 14 13:41:30.464069 master-2 kubenswrapper[4762]: I1014 13:41:30.464058 4762 scope.go:117] "RemoveContainer" containerID="c56cb253ccf46b6cee587e34ce87b8c4d33c5e4e643c47bcb178c0cf35e0a373" Oct 14 13:41:30.466106 master-2 kubenswrapper[4762]: I1014 13:41:30.465943 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c56cb253ccf46b6cee587e34ce87b8c4d33c5e4e643c47bcb178c0cf35e0a373"} err="failed to get container status \"c56cb253ccf46b6cee587e34ce87b8c4d33c5e4e643c47bcb178c0cf35e0a373\": rpc error: code = NotFound desc = could not find container \"c56cb253ccf46b6cee587e34ce87b8c4d33c5e4e643c47bcb178c0cf35e0a373\": container with ID starting with c56cb253ccf46b6cee587e34ce87b8c4d33c5e4e643c47bcb178c0cf35e0a373 not found: ID does not exist" Oct 14 13:41:30.466106 master-2 kubenswrapper[4762]: I1014 13:41:30.465981 4762 scope.go:117] "RemoveContainer" containerID="d4ed29c8b457a897a3935eaef1e6558797987517e3b453962f50ecc4e7e79365" Oct 14 13:41:30.466346 master-2 kubenswrapper[4762]: I1014 13:41:30.466312 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4ed29c8b457a897a3935eaef1e6558797987517e3b453962f50ecc4e7e79365"} err="failed to get container status 
\"d4ed29c8b457a897a3935eaef1e6558797987517e3b453962f50ecc4e7e79365\": rpc error: code = NotFound desc = could not find container \"d4ed29c8b457a897a3935eaef1e6558797987517e3b453962f50ecc4e7e79365\": container with ID starting with d4ed29c8b457a897a3935eaef1e6558797987517e3b453962f50ecc4e7e79365 not found: ID does not exist" Oct 14 13:41:30.466346 master-2 kubenswrapper[4762]: I1014 13:41:30.466344 4762 scope.go:117] "RemoveContainer" containerID="f98bb0911443e43f33ea9e1cbf5a914001044976ddc0ca3d5ed58798906fc422" Oct 14 13:41:30.466902 master-2 kubenswrapper[4762]: I1014 13:41:30.466800 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f98bb0911443e43f33ea9e1cbf5a914001044976ddc0ca3d5ed58798906fc422"} err="failed to get container status \"f98bb0911443e43f33ea9e1cbf5a914001044976ddc0ca3d5ed58798906fc422\": rpc error: code = NotFound desc = could not find container \"f98bb0911443e43f33ea9e1cbf5a914001044976ddc0ca3d5ed58798906fc422\": container with ID starting with f98bb0911443e43f33ea9e1cbf5a914001044976ddc0ca3d5ed58798906fc422 not found: ID does not exist" Oct 14 13:41:30.466902 master-2 kubenswrapper[4762]: I1014 13:41:30.466826 4762 scope.go:117] "RemoveContainer" containerID="940d3697b0655dc74e009502872049035daacf4467864fdb411dbdc015b73ed2" Oct 14 13:41:30.467169 master-2 kubenswrapper[4762]: I1014 13:41:30.467121 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"940d3697b0655dc74e009502872049035daacf4467864fdb411dbdc015b73ed2"} err="failed to get container status \"940d3697b0655dc74e009502872049035daacf4467864fdb411dbdc015b73ed2\": rpc error: code = NotFound desc = could not find container \"940d3697b0655dc74e009502872049035daacf4467864fdb411dbdc015b73ed2\": container with ID starting with 940d3697b0655dc74e009502872049035daacf4467864fdb411dbdc015b73ed2 not found: ID does not exist" Oct 14 13:41:30.467221 master-2 kubenswrapper[4762]: I1014 13:41:30.467167 4762 scope.go:117] "RemoveContainer" containerID="c56cb253ccf46b6cee587e34ce87b8c4d33c5e4e643c47bcb178c0cf35e0a373" Oct 14 13:41:30.467679 master-2 kubenswrapper[4762]: I1014 13:41:30.467610 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c56cb253ccf46b6cee587e34ce87b8c4d33c5e4e643c47bcb178c0cf35e0a373"} err="failed to get container status \"c56cb253ccf46b6cee587e34ce87b8c4d33c5e4e643c47bcb178c0cf35e0a373\": rpc error: code = NotFound desc = could not find container \"c56cb253ccf46b6cee587e34ce87b8c4d33c5e4e643c47bcb178c0cf35e0a373\": container with ID starting with c56cb253ccf46b6cee587e34ce87b8c4d33c5e4e643c47bcb178c0cf35e0a373 not found: ID does not exist" Oct 14 13:41:30.467679 master-2 kubenswrapper[4762]: I1014 13:41:30.467641 4762 scope.go:117] "RemoveContainer" containerID="d4ed29c8b457a897a3935eaef1e6558797987517e3b453962f50ecc4e7e79365" Oct 14 13:41:30.468054 master-2 kubenswrapper[4762]: I1014 13:41:30.467981 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4ed29c8b457a897a3935eaef1e6558797987517e3b453962f50ecc4e7e79365"} err="failed to get container status \"d4ed29c8b457a897a3935eaef1e6558797987517e3b453962f50ecc4e7e79365\": rpc error: code = NotFound desc = could not find container \"d4ed29c8b457a897a3935eaef1e6558797987517e3b453962f50ecc4e7e79365\": container with ID starting with d4ed29c8b457a897a3935eaef1e6558797987517e3b453962f50ecc4e7e79365 not found: ID does not exist" Oct 14 13:41:30.468127 master-2 
kubenswrapper[4762]: I1014 13:41:30.468055 4762 scope.go:117] "RemoveContainer" containerID="f98bb0911443e43f33ea9e1cbf5a914001044976ddc0ca3d5ed58798906fc422" Oct 14 13:41:30.468533 master-2 kubenswrapper[4762]: I1014 13:41:30.468497 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f98bb0911443e43f33ea9e1cbf5a914001044976ddc0ca3d5ed58798906fc422"} err="failed to get container status \"f98bb0911443e43f33ea9e1cbf5a914001044976ddc0ca3d5ed58798906fc422\": rpc error: code = NotFound desc = could not find container \"f98bb0911443e43f33ea9e1cbf5a914001044976ddc0ca3d5ed58798906fc422\": container with ID starting with f98bb0911443e43f33ea9e1cbf5a914001044976ddc0ca3d5ed58798906fc422 not found: ID does not exist" Oct 14 13:41:30.516677 master-2 kubenswrapper[4762]: I1014 13:41:30.516099 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bdc20eb-9391-4fac-84ee-243d246d6d4d-combined-ca-bundle\") pod \"1bdc20eb-9391-4fac-84ee-243d246d6d4d\" (UID: \"1bdc20eb-9391-4fac-84ee-243d246d6d4d\") " Oct 14 13:41:30.516677 master-2 kubenswrapper[4762]: I1014 13:41:30.516241 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1bdc20eb-9391-4fac-84ee-243d246d6d4d-scripts\") pod \"1bdc20eb-9391-4fac-84ee-243d246d6d4d\" (UID: \"1bdc20eb-9391-4fac-84ee-243d246d6d4d\") " Oct 14 13:41:30.516677 master-2 kubenswrapper[4762]: I1014 13:41:30.516311 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1bdc20eb-9391-4fac-84ee-243d246d6d4d-sg-core-conf-yaml\") pod \"1bdc20eb-9391-4fac-84ee-243d246d6d4d\" (UID: \"1bdc20eb-9391-4fac-84ee-243d246d6d4d\") " Oct 14 13:41:30.516677 master-2 kubenswrapper[4762]: I1014 13:41:30.516384 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bdc20eb-9391-4fac-84ee-243d246d6d4d-ceilometer-tls-certs\") pod \"1bdc20eb-9391-4fac-84ee-243d246d6d4d\" (UID: \"1bdc20eb-9391-4fac-84ee-243d246d6d4d\") " Oct 14 13:41:30.516677 master-2 kubenswrapper[4762]: I1014 13:41:30.516423 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bdc20eb-9391-4fac-84ee-243d246d6d4d-config-data\") pod \"1bdc20eb-9391-4fac-84ee-243d246d6d4d\" (UID: \"1bdc20eb-9391-4fac-84ee-243d246d6d4d\") " Oct 14 13:41:30.516677 master-2 kubenswrapper[4762]: I1014 13:41:30.516450 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1bdc20eb-9391-4fac-84ee-243d246d6d4d-run-httpd\") pod \"1bdc20eb-9391-4fac-84ee-243d246d6d4d\" (UID: \"1bdc20eb-9391-4fac-84ee-243d246d6d4d\") " Oct 14 13:41:30.516677 master-2 kubenswrapper[4762]: I1014 13:41:30.516522 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1bdc20eb-9391-4fac-84ee-243d246d6d4d-log-httpd\") pod \"1bdc20eb-9391-4fac-84ee-243d246d6d4d\" (UID: \"1bdc20eb-9391-4fac-84ee-243d246d6d4d\") " Oct 14 13:41:30.516677 master-2 kubenswrapper[4762]: I1014 13:41:30.516545 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcxrc\" (UniqueName: 
\"kubernetes.io/projected/1bdc20eb-9391-4fac-84ee-243d246d6d4d-kube-api-access-zcxrc\") pod \"1bdc20eb-9391-4fac-84ee-243d246d6d4d\" (UID: \"1bdc20eb-9391-4fac-84ee-243d246d6d4d\") " Oct 14 13:41:30.519038 master-2 kubenswrapper[4762]: I1014 13:41:30.518989 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1bdc20eb-9391-4fac-84ee-243d246d6d4d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1bdc20eb-9391-4fac-84ee-243d246d6d4d" (UID: "1bdc20eb-9391-4fac-84ee-243d246d6d4d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:41:30.519229 master-2 kubenswrapper[4762]: I1014 13:41:30.519191 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1bdc20eb-9391-4fac-84ee-243d246d6d4d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1bdc20eb-9391-4fac-84ee-243d246d6d4d" (UID: "1bdc20eb-9391-4fac-84ee-243d246d6d4d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:41:30.576040 master-2 kubenswrapper[4762]: I1014 13:41:30.575266 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bdc20eb-9391-4fac-84ee-243d246d6d4d-kube-api-access-zcxrc" (OuterVolumeSpecName: "kube-api-access-zcxrc") pod "1bdc20eb-9391-4fac-84ee-243d246d6d4d" (UID: "1bdc20eb-9391-4fac-84ee-243d246d6d4d"). InnerVolumeSpecName "kube-api-access-zcxrc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:41:30.586333 master-2 kubenswrapper[4762]: I1014 13:41:30.584063 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bdc20eb-9391-4fac-84ee-243d246d6d4d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1bdc20eb-9391-4fac-84ee-243d246d6d4d" (UID: "1bdc20eb-9391-4fac-84ee-243d246d6d4d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:41:30.592624 master-2 kubenswrapper[4762]: I1014 13:41:30.592564 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bdc20eb-9391-4fac-84ee-243d246d6d4d-scripts" (OuterVolumeSpecName: "scripts") pod "1bdc20eb-9391-4fac-84ee-243d246d6d4d" (UID: "1bdc20eb-9391-4fac-84ee-243d246d6d4d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:41:30.618045 master-2 kubenswrapper[4762]: I1014 13:41:30.617967 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1bdc20eb-9391-4fac-84ee-243d246d6d4d-scripts\") on node \"master-2\" DevicePath \"\"" Oct 14 13:41:30.618045 master-2 kubenswrapper[4762]: I1014 13:41:30.618012 4762 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1bdc20eb-9391-4fac-84ee-243d246d6d4d-sg-core-conf-yaml\") on node \"master-2\" DevicePath \"\"" Oct 14 13:41:30.618045 master-2 kubenswrapper[4762]: I1014 13:41:30.618026 4762 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1bdc20eb-9391-4fac-84ee-243d246d6d4d-run-httpd\") on node \"master-2\" DevicePath \"\"" Oct 14 13:41:30.618045 master-2 kubenswrapper[4762]: I1014 13:41:30.618038 4762 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1bdc20eb-9391-4fac-84ee-243d246d6d4d-log-httpd\") on node \"master-2\" DevicePath \"\"" Oct 14 13:41:30.618045 master-2 kubenswrapper[4762]: I1014 13:41:30.618053 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcxrc\" (UniqueName: \"kubernetes.io/projected/1bdc20eb-9391-4fac-84ee-243d246d6d4d-kube-api-access-zcxrc\") on node \"master-2\" DevicePath \"\"" Oct 14 13:41:30.625783 master-2 kubenswrapper[4762]: I1014 13:41:30.625705 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bdc20eb-9391-4fac-84ee-243d246d6d4d-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "1bdc20eb-9391-4fac-84ee-243d246d6d4d" (UID: "1bdc20eb-9391-4fac-84ee-243d246d6d4d"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:41:30.648329 master-2 kubenswrapper[4762]: I1014 13:41:30.648238 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bdc20eb-9391-4fac-84ee-243d246d6d4d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1bdc20eb-9391-4fac-84ee-243d246d6d4d" (UID: "1bdc20eb-9391-4fac-84ee-243d246d6d4d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:41:30.654060 master-2 kubenswrapper[4762]: I1014 13:41:30.654002 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bdc20eb-9391-4fac-84ee-243d246d6d4d-config-data" (OuterVolumeSpecName: "config-data") pod "1bdc20eb-9391-4fac-84ee-243d246d6d4d" (UID: "1bdc20eb-9391-4fac-84ee-243d246d6d4d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:41:30.719889 master-2 kubenswrapper[4762]: I1014 13:41:30.719651 4762 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bdc20eb-9391-4fac-84ee-243d246d6d4d-ceilometer-tls-certs\") on node \"master-2\" DevicePath \"\"" Oct 14 13:41:30.719889 master-2 kubenswrapper[4762]: I1014 13:41:30.719700 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bdc20eb-9391-4fac-84ee-243d246d6d4d-config-data\") on node \"master-2\" DevicePath \"\"" Oct 14 13:41:30.719889 master-2 kubenswrapper[4762]: I1014 13:41:30.719710 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bdc20eb-9391-4fac-84ee-243d246d6d4d-combined-ca-bundle\") on node \"master-2\" DevicePath \"\"" Oct 14 13:41:31.372438 master-2 kubenswrapper[4762]: I1014 13:41:31.372387 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:41:31.406868 master-2 kubenswrapper[4762]: I1014 13:41:31.406764 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-db-sync-r24zb" podStartSLOduration=4.406729242 podStartE2EDuration="4.406729242s" podCreationTimestamp="2025-10-14 13:41:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:41:30.673559381 +0000 UTC m=+2119.917718550" watchObservedRunningTime="2025-10-14 13:41:31.406729242 +0000 UTC m=+2120.650888401" Oct 14 13:41:31.418436 master-2 kubenswrapper[4762]: I1014 13:41:31.418280 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:41:31.428938 master-2 kubenswrapper[4762]: I1014 13:41:31.428849 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:41:31.467130 master-2 kubenswrapper[4762]: I1014 13:41:31.467064 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:41:31.467681 master-2 kubenswrapper[4762]: E1014 13:41:31.467650 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bdc20eb-9391-4fac-84ee-243d246d6d4d" containerName="sg-core" Oct 14 13:41:31.467681 master-2 kubenswrapper[4762]: I1014 13:41:31.467671 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bdc20eb-9391-4fac-84ee-243d246d6d4d" containerName="sg-core" Oct 14 13:41:31.467992 master-2 kubenswrapper[4762]: E1014 13:41:31.467696 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bdc20eb-9391-4fac-84ee-243d246d6d4d" containerName="ceilometer-central-agent" Oct 14 13:41:31.467992 master-2 kubenswrapper[4762]: I1014 13:41:31.467706 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bdc20eb-9391-4fac-84ee-243d246d6d4d" containerName="ceilometer-central-agent" Oct 14 13:41:31.467992 master-2 kubenswrapper[4762]: E1014 13:41:31.467731 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bdc20eb-9391-4fac-84ee-243d246d6d4d" containerName="ceilometer-notification-agent" Oct 14 13:41:31.467992 master-2 kubenswrapper[4762]: I1014 13:41:31.467742 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bdc20eb-9391-4fac-84ee-243d246d6d4d" containerName="ceilometer-notification-agent" Oct 14 13:41:31.467992 master-2 kubenswrapper[4762]: E1014 13:41:31.467757 4762 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="1bdc20eb-9391-4fac-84ee-243d246d6d4d" containerName="proxy-httpd" Oct 14 13:41:31.467992 master-2 kubenswrapper[4762]: I1014 13:41:31.467765 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bdc20eb-9391-4fac-84ee-243d246d6d4d" containerName="proxy-httpd" Oct 14 13:41:31.467992 master-2 kubenswrapper[4762]: I1014 13:41:31.467971 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bdc20eb-9391-4fac-84ee-243d246d6d4d" containerName="ceilometer-notification-agent" Oct 14 13:41:31.467992 master-2 kubenswrapper[4762]: I1014 13:41:31.467993 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bdc20eb-9391-4fac-84ee-243d246d6d4d" containerName="ceilometer-central-agent" Oct 14 13:41:31.468813 master-2 kubenswrapper[4762]: I1014 13:41:31.468011 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bdc20eb-9391-4fac-84ee-243d246d6d4d" containerName="proxy-httpd" Oct 14 13:41:31.468813 master-2 kubenswrapper[4762]: I1014 13:41:31.468018 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bdc20eb-9391-4fac-84ee-243d246d6d4d" containerName="sg-core" Oct 14 13:41:31.476962 master-2 kubenswrapper[4762]: I1014 13:41:31.473461 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:41:31.478363 master-2 kubenswrapper[4762]: I1014 13:41:31.478330 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 14 13:41:31.479576 master-2 kubenswrapper[4762]: I1014 13:41:31.479530 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 14 13:41:31.479789 master-2 kubenswrapper[4762]: I1014 13:41:31.479555 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 14 13:41:31.500615 master-2 kubenswrapper[4762]: I1014 13:41:31.490429 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:41:31.567953 master-2 kubenswrapper[4762]: I1014 13:41:31.562300 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bdc20eb-9391-4fac-84ee-243d246d6d4d" path="/var/lib/kubelet/pods/1bdc20eb-9391-4fac-84ee-243d246d6d4d/volumes" Oct 14 13:41:31.643249 master-2 kubenswrapper[4762]: I1014 13:41:31.643181 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcw2s\" (UniqueName: \"kubernetes.io/projected/332f6683-954c-4eb5-8590-5a37a96edbde-kube-api-access-tcw2s\") pod \"ceilometer-0\" (UID: \"332f6683-954c-4eb5-8590-5a37a96edbde\") " pod="openstack/ceilometer-0" Oct 14 13:41:31.643249 master-2 kubenswrapper[4762]: I1014 13:41:31.643258 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/332f6683-954c-4eb5-8590-5a37a96edbde-log-httpd\") pod \"ceilometer-0\" (UID: \"332f6683-954c-4eb5-8590-5a37a96edbde\") " pod="openstack/ceilometer-0" Oct 14 13:41:31.643541 master-2 kubenswrapper[4762]: I1014 13:41:31.643321 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/332f6683-954c-4eb5-8590-5a37a96edbde-config-data\") pod \"ceilometer-0\" (UID: \"332f6683-954c-4eb5-8590-5a37a96edbde\") " pod="openstack/ceilometer-0" Oct 14 13:41:31.643541 master-2 kubenswrapper[4762]: I1014 
13:41:31.643519 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/332f6683-954c-4eb5-8590-5a37a96edbde-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"332f6683-954c-4eb5-8590-5a37a96edbde\") " pod="openstack/ceilometer-0" Oct 14 13:41:31.643607 master-2 kubenswrapper[4762]: I1014 13:41:31.643559 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/332f6683-954c-4eb5-8590-5a37a96edbde-run-httpd\") pod \"ceilometer-0\" (UID: \"332f6683-954c-4eb5-8590-5a37a96edbde\") " pod="openstack/ceilometer-0" Oct 14 13:41:31.643653 master-2 kubenswrapper[4762]: I1014 13:41:31.643630 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/332f6683-954c-4eb5-8590-5a37a96edbde-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"332f6683-954c-4eb5-8590-5a37a96edbde\") " pod="openstack/ceilometer-0" Oct 14 13:41:31.643799 master-2 kubenswrapper[4762]: I1014 13:41:31.643760 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/332f6683-954c-4eb5-8590-5a37a96edbde-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"332f6683-954c-4eb5-8590-5a37a96edbde\") " pod="openstack/ceilometer-0" Oct 14 13:41:31.643852 master-2 kubenswrapper[4762]: I1014 13:41:31.643800 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/332f6683-954c-4eb5-8590-5a37a96edbde-scripts\") pod \"ceilometer-0\" (UID: \"332f6683-954c-4eb5-8590-5a37a96edbde\") " pod="openstack/ceilometer-0" Oct 14 13:41:31.746977 master-2 kubenswrapper[4762]: I1014 13:41:31.745817 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/332f6683-954c-4eb5-8590-5a37a96edbde-run-httpd\") pod \"ceilometer-0\" (UID: \"332f6683-954c-4eb5-8590-5a37a96edbde\") " pod="openstack/ceilometer-0" Oct 14 13:41:31.746977 master-2 kubenswrapper[4762]: I1014 13:41:31.745896 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/332f6683-954c-4eb5-8590-5a37a96edbde-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"332f6683-954c-4eb5-8590-5a37a96edbde\") " pod="openstack/ceilometer-0" Oct 14 13:41:31.746977 master-2 kubenswrapper[4762]: I1014 13:41:31.746324 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/332f6683-954c-4eb5-8590-5a37a96edbde-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"332f6683-954c-4eb5-8590-5a37a96edbde\") " pod="openstack/ceilometer-0" Oct 14 13:41:31.746977 master-2 kubenswrapper[4762]: I1014 13:41:31.746357 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/332f6683-954c-4eb5-8590-5a37a96edbde-scripts\") pod \"ceilometer-0\" (UID: \"332f6683-954c-4eb5-8590-5a37a96edbde\") " pod="openstack/ceilometer-0" Oct 14 13:41:31.746977 master-2 kubenswrapper[4762]: I1014 13:41:31.746380 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/332f6683-954c-4eb5-8590-5a37a96edbde-run-httpd\") pod \"ceilometer-0\" (UID: \"332f6683-954c-4eb5-8590-5a37a96edbde\") " pod="openstack/ceilometer-0" Oct 14 13:41:31.746977 master-2 kubenswrapper[4762]: I1014 13:41:31.746649 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcw2s\" (UniqueName: \"kubernetes.io/projected/332f6683-954c-4eb5-8590-5a37a96edbde-kube-api-access-tcw2s\") pod \"ceilometer-0\" (UID: \"332f6683-954c-4eb5-8590-5a37a96edbde\") " pod="openstack/ceilometer-0" Oct 14 13:41:31.746977 master-2 kubenswrapper[4762]: I1014 13:41:31.746702 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/332f6683-954c-4eb5-8590-5a37a96edbde-log-httpd\") pod \"ceilometer-0\" (UID: \"332f6683-954c-4eb5-8590-5a37a96edbde\") " pod="openstack/ceilometer-0" Oct 14 13:41:31.746977 master-2 kubenswrapper[4762]: I1014 13:41:31.746787 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/332f6683-954c-4eb5-8590-5a37a96edbde-config-data\") pod \"ceilometer-0\" (UID: \"332f6683-954c-4eb5-8590-5a37a96edbde\") " pod="openstack/ceilometer-0" Oct 14 13:41:31.747719 master-2 kubenswrapper[4762]: I1014 13:41:31.747607 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/332f6683-954c-4eb5-8590-5a37a96edbde-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"332f6683-954c-4eb5-8590-5a37a96edbde\") " pod="openstack/ceilometer-0" Oct 14 13:41:31.747719 master-2 kubenswrapper[4762]: I1014 13:41:31.747700 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/332f6683-954c-4eb5-8590-5a37a96edbde-log-httpd\") pod \"ceilometer-0\" (UID: \"332f6683-954c-4eb5-8590-5a37a96edbde\") " pod="openstack/ceilometer-0" Oct 14 13:41:31.756322 master-2 kubenswrapper[4762]: I1014 13:41:31.755959 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/332f6683-954c-4eb5-8590-5a37a96edbde-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"332f6683-954c-4eb5-8590-5a37a96edbde\") " pod="openstack/ceilometer-0" Oct 14 13:41:31.756322 master-2 kubenswrapper[4762]: I1014 13:41:31.756034 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/332f6683-954c-4eb5-8590-5a37a96edbde-config-data\") pod \"ceilometer-0\" (UID: \"332f6683-954c-4eb5-8590-5a37a96edbde\") " pod="openstack/ceilometer-0" Oct 14 13:41:31.756322 master-2 kubenswrapper[4762]: I1014 13:41:31.756307 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/332f6683-954c-4eb5-8590-5a37a96edbde-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"332f6683-954c-4eb5-8590-5a37a96edbde\") " pod="openstack/ceilometer-0" Oct 14 13:41:31.757047 master-2 kubenswrapper[4762]: I1014 13:41:31.756759 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/332f6683-954c-4eb5-8590-5a37a96edbde-scripts\") pod \"ceilometer-0\" (UID: \"332f6683-954c-4eb5-8590-5a37a96edbde\") " pod="openstack/ceilometer-0" Oct 14 13:41:31.762058 master-2 kubenswrapper[4762]: I1014 13:41:31.762013 4762 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/332f6683-954c-4eb5-8590-5a37a96edbde-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"332f6683-954c-4eb5-8590-5a37a96edbde\") " pod="openstack/ceilometer-0" Oct 14 13:41:31.771146 master-2 kubenswrapper[4762]: I1014 13:41:31.771089 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcw2s\" (UniqueName: \"kubernetes.io/projected/332f6683-954c-4eb5-8590-5a37a96edbde-kube-api-access-tcw2s\") pod \"ceilometer-0\" (UID: \"332f6683-954c-4eb5-8590-5a37a96edbde\") " pod="openstack/ceilometer-0" Oct 14 13:41:31.856814 master-2 kubenswrapper[4762]: I1014 13:41:31.856747 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 14 13:41:32.296670 master-2 kubenswrapper[4762]: I1014 13:41:32.296618 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 14 13:41:33.560761 master-2 kubenswrapper[4762]: I1014 13:41:33.560691 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-rsyslog-nl86n" Oct 14 13:41:34.014198 master-2 kubenswrapper[4762]: I1014 13:41:34.013962 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-6dc54675fc-lvsb4" Oct 14 13:41:34.241842 master-2 kubenswrapper[4762]: I1014 13:41:34.241776 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-6dc54675fc-lvsb4" Oct 14 13:41:35.409304 master-2 kubenswrapper[4762]: I1014 13:41:35.409261 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"332f6683-954c-4eb5-8590-5a37a96edbde","Type":"ContainerStarted","Data":"ccad0b57837c9da6096a6001d0fd7aa4b65a7d56c731be7ee344aa7d6adfe417"} Oct 14 13:41:36.422024 master-2 kubenswrapper[4762]: I1014 13:41:36.421971 4762 generic.go:334] "Generic (PLEG): container finished" podID="d42f472f-fadd-471b-be3d-cfda97a7e407" containerID="8206845d513a97ea7ea3356aafcf850050fff92b8f572dee9ac3734a67b08308" exitCode=0 Oct 14 13:41:36.422991 master-2 kubenswrapper[4762]: I1014 13:41:36.422060 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-678599687f-hwhwd" event={"ID":"d42f472f-fadd-471b-be3d-cfda97a7e407","Type":"ContainerDied","Data":"8206845d513a97ea7ea3356aafcf850050fff92b8f572dee9ac3734a67b08308"} Oct 14 13:41:36.424187 master-2 kubenswrapper[4762]: I1014 13:41:36.424137 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"332f6683-954c-4eb5-8590-5a37a96edbde","Type":"ContainerStarted","Data":"9457d2ad20a178c4ef54193e3e1ee0bf4714101b420579af96cf292acfb344ba"} Oct 14 13:41:37.435847 master-2 kubenswrapper[4762]: I1014 13:41:37.435789 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-678599687f-hwhwd" event={"ID":"d42f472f-fadd-471b-be3d-cfda97a7e407","Type":"ContainerStarted","Data":"98beb9b7c0dcf041ba2c732d9d1ab3a2cfad551891c3b7ace52c9fbfe84f9f70"} Oct 14 13:41:37.821915 master-2 kubenswrapper[4762]: I1014 13:41:37.821713 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-678599687f-hwhwd" podStartSLOduration=3.755985676 podStartE2EDuration="18.821693965s" podCreationTimestamp="2025-10-14 13:41:19 +0000 UTC" firstStartedPulling="2025-10-14 13:41:20.298165333 +0000 UTC m=+2109.542324492" lastFinishedPulling="2025-10-14 
13:41:35.363873602 +0000 UTC m=+2124.608032781" observedRunningTime="2025-10-14 13:41:37.461326697 +0000 UTC m=+2126.705485896" watchObservedRunningTime="2025-10-14 13:41:37.821693965 +0000 UTC m=+2127.065853124" Oct 14 13:41:38.448112 master-2 kubenswrapper[4762]: I1014 13:41:38.448044 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"332f6683-954c-4eb5-8590-5a37a96edbde","Type":"ContainerStarted","Data":"9c0b366400fba6e0daf7f63ca2fa9f3dd59acb4cb5dfdffa2fc7c7fcbef76d31"} Oct 14 13:41:39.458548 master-2 kubenswrapper[4762]: I1014 13:41:39.458413 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"332f6683-954c-4eb5-8590-5a37a96edbde","Type":"ContainerStarted","Data":"406f23bdb870fb164459c393dab9fcdefa20672c712e7c8ced123ac15e60ef7f"} Oct 14 13:41:39.906334 master-2 kubenswrapper[4762]: I1014 13:41:39.905570 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-c564bc7f-jdbq7" Oct 14 13:41:40.144317 master-2 kubenswrapper[4762]: I1014 13:41:40.144237 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-c564bc7f-jdbq7" Oct 14 13:41:40.272496 master-2 kubenswrapper[4762]: I1014 13:41:40.272447 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-api-6dc54675fc-lvsb4"] Oct 14 13:41:40.273081 master-2 kubenswrapper[4762]: I1014 13:41:40.272748 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/octavia-api-6dc54675fc-lvsb4" podUID="35e45c95-9f29-4266-828c-d8cc7e37c091" containerName="octavia-api" containerID="cri-o://a819a745f0156344a5c1aceea8457b5a978bf7e08bde3afd077f1b7d88e7e061" gracePeriod=30 Oct 14 13:41:40.273081 master-2 kubenswrapper[4762]: I1014 13:41:40.272891 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/octavia-api-6dc54675fc-lvsb4" podUID="35e45c95-9f29-4266-828c-d8cc7e37c091" containerName="octavia-api-provider-agent" containerID="cri-o://fe814d5c498d9742cc2e790ae17ff7d69ca120a5169e2288d15aa5b40a4de78b" gracePeriod=30 Oct 14 13:41:40.476372 master-2 kubenswrapper[4762]: I1014 13:41:40.476307 4762 generic.go:334] "Generic (PLEG): container finished" podID="55e89c7d-f953-4fa3-95af-2abba3a06439" containerID="a74c81431913c6f9f5e45c4fb724bdd236fda3473562ecb486a8725ea8ac0e23" exitCode=0 Oct 14 13:41:40.477869 master-2 kubenswrapper[4762]: I1014 13:41:40.477821 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-r24zb" event={"ID":"55e89c7d-f953-4fa3-95af-2abba3a06439","Type":"ContainerDied","Data":"a74c81431913c6f9f5e45c4fb724bdd236fda3473562ecb486a8725ea8ac0e23"} Oct 14 13:41:41.492724 master-2 kubenswrapper[4762]: I1014 13:41:41.492519 4762 generic.go:334] "Generic (PLEG): container finished" podID="35e45c95-9f29-4266-828c-d8cc7e37c091" containerID="fe814d5c498d9742cc2e790ae17ff7d69ca120a5169e2288d15aa5b40a4de78b" exitCode=0 Oct 14 13:41:41.492724 master-2 kubenswrapper[4762]: I1014 13:41:41.492635 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-6dc54675fc-lvsb4" event={"ID":"35e45c95-9f29-4266-828c-d8cc7e37c091","Type":"ContainerDied","Data":"fe814d5c498d9742cc2e790ae17ff7d69ca120a5169e2288d15aa5b40a4de78b"} Oct 14 13:41:41.499432 master-2 kubenswrapper[4762]: I1014 13:41:41.499378 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"332f6683-954c-4eb5-8590-5a37a96edbde","Type":"ContainerStarted","Data":"cfb0db467181ae89f46aaf91c4494ee319811be81f28f0da7e90ecb5a4d96d61"} Oct 14 13:41:41.499561 master-2 kubenswrapper[4762]: I1014 13:41:41.499473 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 14 13:41:41.605972 master-2 kubenswrapper[4762]: I1014 13:41:41.605834 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.873918906 podStartE2EDuration="10.605805158s" podCreationTimestamp="2025-10-14 13:41:31 +0000 UTC" firstStartedPulling="2025-10-14 13:41:34.9807017 +0000 UTC m=+2124.224860859" lastFinishedPulling="2025-10-14 13:41:40.712587952 +0000 UTC m=+2129.956747111" observedRunningTime="2025-10-14 13:41:41.586554558 +0000 UTC m=+2130.830713737" watchObservedRunningTime="2025-10-14 13:41:41.605805158 +0000 UTC m=+2130.849964317" Oct 14 13:41:42.369836 master-2 kubenswrapper[4762]: I1014 13:41:42.369786 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-r24zb" Oct 14 13:41:42.467514 master-2 kubenswrapper[4762]: I1014 13:41:42.467440 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55e89c7d-f953-4fa3-95af-2abba3a06439-scripts\") pod \"55e89c7d-f953-4fa3-95af-2abba3a06439\" (UID: \"55e89c7d-f953-4fa3-95af-2abba3a06439\") " Oct 14 13:41:42.467769 master-2 kubenswrapper[4762]: I1014 13:41:42.467734 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55e89c7d-f953-4fa3-95af-2abba3a06439-combined-ca-bundle\") pod \"55e89c7d-f953-4fa3-95af-2abba3a06439\" (UID: \"55e89c7d-f953-4fa3-95af-2abba3a06439\") " Oct 14 13:41:42.467819 master-2 kubenswrapper[4762]: I1014 13:41:42.467793 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/55e89c7d-f953-4fa3-95af-2abba3a06439-config-data-merged\") pod \"55e89c7d-f953-4fa3-95af-2abba3a06439\" (UID: \"55e89c7d-f953-4fa3-95af-2abba3a06439\") " Oct 14 13:41:42.467902 master-2 kubenswrapper[4762]: I1014 13:41:42.467873 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55e89c7d-f953-4fa3-95af-2abba3a06439-config-data\") pod \"55e89c7d-f953-4fa3-95af-2abba3a06439\" (UID: \"55e89c7d-f953-4fa3-95af-2abba3a06439\") " Oct 14 13:41:42.472036 master-2 kubenswrapper[4762]: I1014 13:41:42.471971 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55e89c7d-f953-4fa3-95af-2abba3a06439-scripts" (OuterVolumeSpecName: "scripts") pod "55e89c7d-f953-4fa3-95af-2abba3a06439" (UID: "55e89c7d-f953-4fa3-95af-2abba3a06439"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:41:42.473509 master-2 kubenswrapper[4762]: I1014 13:41:42.473445 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55e89c7d-f953-4fa3-95af-2abba3a06439-config-data" (OuterVolumeSpecName: "config-data") pod "55e89c7d-f953-4fa3-95af-2abba3a06439" (UID: "55e89c7d-f953-4fa3-95af-2abba3a06439"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:41:42.487051 master-2 kubenswrapper[4762]: I1014 13:41:42.487000 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55e89c7d-f953-4fa3-95af-2abba3a06439-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "55e89c7d-f953-4fa3-95af-2abba3a06439" (UID: "55e89c7d-f953-4fa3-95af-2abba3a06439"). InnerVolumeSpecName "config-data-merged". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:41:42.493956 master-2 kubenswrapper[4762]: I1014 13:41:42.493695 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55e89c7d-f953-4fa3-95af-2abba3a06439-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "55e89c7d-f953-4fa3-95af-2abba3a06439" (UID: "55e89c7d-f953-4fa3-95af-2abba3a06439"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:41:42.509052 master-2 kubenswrapper[4762]: I1014 13:41:42.508997 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-r24zb" event={"ID":"55e89c7d-f953-4fa3-95af-2abba3a06439","Type":"ContainerDied","Data":"5d18e674d039ac8e6359d05c3a9a2d1e11e48e022f596f3805a1e6a50fd22933"} Oct 14 13:41:42.509252 master-2 kubenswrapper[4762]: I1014 13:41:42.509050 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-r24zb" Oct 14 13:41:42.509252 master-2 kubenswrapper[4762]: I1014 13:41:42.509062 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d18e674d039ac8e6359d05c3a9a2d1e11e48e022f596f3805a1e6a50fd22933" Oct 14 13:41:42.571116 master-2 kubenswrapper[4762]: I1014 13:41:42.570443 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55e89c7d-f953-4fa3-95af-2abba3a06439-scripts\") on node \"master-2\" DevicePath \"\"" Oct 14 13:41:42.571116 master-2 kubenswrapper[4762]: I1014 13:41:42.570479 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55e89c7d-f953-4fa3-95af-2abba3a06439-combined-ca-bundle\") on node \"master-2\" DevicePath \"\"" Oct 14 13:41:42.571116 master-2 kubenswrapper[4762]: I1014 13:41:42.570493 4762 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/55e89c7d-f953-4fa3-95af-2abba3a06439-config-data-merged\") on node \"master-2\" DevicePath \"\"" Oct 14 13:41:42.571116 master-2 kubenswrapper[4762]: I1014 13:41:42.570505 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55e89c7d-f953-4fa3-95af-2abba3a06439-config-data\") on node \"master-2\" DevicePath \"\"" Oct 14 13:41:43.525001 master-2 kubenswrapper[4762]: I1014 13:41:43.524950 4762 generic.go:334] "Generic (PLEG): container finished" podID="35e45c95-9f29-4266-828c-d8cc7e37c091" containerID="a819a745f0156344a5c1aceea8457b5a978bf7e08bde3afd077f1b7d88e7e061" exitCode=0 Oct 14 13:41:43.527440 master-2 kubenswrapper[4762]: I1014 13:41:43.525025 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-6dc54675fc-lvsb4" event={"ID":"35e45c95-9f29-4266-828c-d8cc7e37c091","Type":"ContainerDied","Data":"a819a745f0156344a5c1aceea8457b5a978bf7e08bde3afd077f1b7d88e7e061"} Oct 14 13:41:44.104073 master-2 kubenswrapper[4762]: I1014 13:41:44.104016 4762 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/octavia-api-6dc54675fc-lvsb4" Oct 14 13:41:44.214428 master-2 kubenswrapper[4762]: I1014 13:41:44.214327 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/35e45c95-9f29-4266-828c-d8cc7e37c091-octavia-run\") pod \"35e45c95-9f29-4266-828c-d8cc7e37c091\" (UID: \"35e45c95-9f29-4266-828c-d8cc7e37c091\") " Oct 14 13:41:44.214428 master-2 kubenswrapper[4762]: I1014 13:41:44.214405 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35e45c95-9f29-4266-828c-d8cc7e37c091-combined-ca-bundle\") pod \"35e45c95-9f29-4266-828c-d8cc7e37c091\" (UID: \"35e45c95-9f29-4266-828c-d8cc7e37c091\") " Oct 14 13:41:44.214761 master-2 kubenswrapper[4762]: I1014 13:41:44.214466 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/35e45c95-9f29-4266-828c-d8cc7e37c091-config-data-merged\") pod \"35e45c95-9f29-4266-828c-d8cc7e37c091\" (UID: \"35e45c95-9f29-4266-828c-d8cc7e37c091\") " Oct 14 13:41:44.214761 master-2 kubenswrapper[4762]: I1014 13:41:44.214555 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35e45c95-9f29-4266-828c-d8cc7e37c091-scripts\") pod \"35e45c95-9f29-4266-828c-d8cc7e37c091\" (UID: \"35e45c95-9f29-4266-828c-d8cc7e37c091\") " Oct 14 13:41:44.214761 master-2 kubenswrapper[4762]: I1014 13:41:44.214621 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35e45c95-9f29-4266-828c-d8cc7e37c091-config-data\") pod \"35e45c95-9f29-4266-828c-d8cc7e37c091\" (UID: \"35e45c95-9f29-4266-828c-d8cc7e37c091\") " Oct 14 13:41:44.214761 master-2 kubenswrapper[4762]: I1014 13:41:44.214714 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/35e45c95-9f29-4266-828c-d8cc7e37c091-ovndb-tls-certs\") pod \"35e45c95-9f29-4266-828c-d8cc7e37c091\" (UID: \"35e45c95-9f29-4266-828c-d8cc7e37c091\") " Oct 14 13:41:44.214979 master-2 kubenswrapper[4762]: I1014 13:41:44.214780 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35e45c95-9f29-4266-828c-d8cc7e37c091-octavia-run" (OuterVolumeSpecName: "octavia-run") pod "35e45c95-9f29-4266-828c-d8cc7e37c091" (UID: "35e45c95-9f29-4266-828c-d8cc7e37c091"). InnerVolumeSpecName "octavia-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:41:44.216236 master-2 kubenswrapper[4762]: I1014 13:41:44.216191 4762 reconciler_common.go:293] "Volume detached for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/35e45c95-9f29-4266-828c-d8cc7e37c091-octavia-run\") on node \"master-2\" DevicePath \"\"" Oct 14 13:41:44.218405 master-2 kubenswrapper[4762]: I1014 13:41:44.218336 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35e45c95-9f29-4266-828c-d8cc7e37c091-scripts" (OuterVolumeSpecName: "scripts") pod "35e45c95-9f29-4266-828c-d8cc7e37c091" (UID: "35e45c95-9f29-4266-828c-d8cc7e37c091"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:41:44.218508 master-2 kubenswrapper[4762]: I1014 13:41:44.218389 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35e45c95-9f29-4266-828c-d8cc7e37c091-config-data" (OuterVolumeSpecName: "config-data") pod "35e45c95-9f29-4266-828c-d8cc7e37c091" (UID: "35e45c95-9f29-4266-828c-d8cc7e37c091"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:41:44.256301 master-2 kubenswrapper[4762]: I1014 13:41:44.256250 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35e45c95-9f29-4266-828c-d8cc7e37c091-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "35e45c95-9f29-4266-828c-d8cc7e37c091" (UID: "35e45c95-9f29-4266-828c-d8cc7e37c091"). InnerVolumeSpecName "config-data-merged". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:41:44.257704 master-2 kubenswrapper[4762]: I1014 13:41:44.257651 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35e45c95-9f29-4266-828c-d8cc7e37c091-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "35e45c95-9f29-4266-828c-d8cc7e37c091" (UID: "35e45c95-9f29-4266-828c-d8cc7e37c091"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:41:44.318216 master-2 kubenswrapper[4762]: I1014 13:41:44.318128 4762 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35e45c95-9f29-4266-828c-d8cc7e37c091-combined-ca-bundle\") on node \"master-2\" DevicePath \"\"" Oct 14 13:41:44.318216 master-2 kubenswrapper[4762]: I1014 13:41:44.318179 4762 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/35e45c95-9f29-4266-828c-d8cc7e37c091-config-data-merged\") on node \"master-2\" DevicePath \"\"" Oct 14 13:41:44.318216 master-2 kubenswrapper[4762]: I1014 13:41:44.318190 4762 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35e45c95-9f29-4266-828c-d8cc7e37c091-scripts\") on node \"master-2\" DevicePath \"\"" Oct 14 13:41:44.318216 master-2 kubenswrapper[4762]: I1014 13:41:44.318200 4762 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35e45c95-9f29-4266-828c-d8cc7e37c091-config-data\") on node \"master-2\" DevicePath \"\"" Oct 14 13:41:44.389263 master-2 kubenswrapper[4762]: I1014 13:41:44.389197 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35e45c95-9f29-4266-828c-d8cc7e37c091-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "35e45c95-9f29-4266-828c-d8cc7e37c091" (UID: "35e45c95-9f29-4266-828c-d8cc7e37c091"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:41:44.419991 master-2 kubenswrapper[4762]: I1014 13:41:44.419888 4762 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/35e45c95-9f29-4266-828c-d8cc7e37c091-ovndb-tls-certs\") on node \"master-2\" DevicePath \"\"" Oct 14 13:41:44.539991 master-2 kubenswrapper[4762]: I1014 13:41:44.539849 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-6dc54675fc-lvsb4" event={"ID":"35e45c95-9f29-4266-828c-d8cc7e37c091","Type":"ContainerDied","Data":"0bd6d59d366c9df347f00dc7e90ed0f830024777976083a6fb9cc746ce29f464"} Oct 14 13:41:44.540975 master-2 kubenswrapper[4762]: I1014 13:41:44.540935 4762 scope.go:117] "RemoveContainer" containerID="fe814d5c498d9742cc2e790ae17ff7d69ca120a5169e2288d15aa5b40a4de78b" Oct 14 13:41:44.541375 master-2 kubenswrapper[4762]: I1014 13:41:44.540203 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-6dc54675fc-lvsb4" Oct 14 13:41:44.590371 master-2 kubenswrapper[4762]: I1014 13:41:44.590307 4762 scope.go:117] "RemoveContainer" containerID="a819a745f0156344a5c1aceea8457b5a978bf7e08bde3afd077f1b7d88e7e061" Oct 14 13:41:44.614027 master-2 kubenswrapper[4762]: I1014 13:41:44.613947 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-api-6dc54675fc-lvsb4"] Oct 14 13:41:44.622672 master-2 kubenswrapper[4762]: I1014 13:41:44.622602 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-api-6dc54675fc-lvsb4"] Oct 14 13:41:44.628041 master-2 kubenswrapper[4762]: I1014 13:41:44.628006 4762 scope.go:117] "RemoveContainer" containerID="790eec6a1db467198a5c822789307b2b4f5a2ba59c964ef6f73b36113b29ed0c" Oct 14 13:41:45.559871 master-2 kubenswrapper[4762]: I1014 13:41:45.559754 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35e45c95-9f29-4266-828c-d8cc7e37c091" path="/var/lib/kubelet/pods/35e45c95-9f29-4266-828c-d8cc7e37c091/volumes" Oct 14 13:42:01.866019 master-2 kubenswrapper[4762]: I1014 13:42:01.865950 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 14 13:42:03.519011 master-2 kubenswrapper[4762]: I1014 13:42:03.518936 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-678599687f-hwhwd"] Oct 14 13:42:03.519768 master-2 kubenswrapper[4762]: I1014 13:42:03.519489 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/octavia-image-upload-678599687f-hwhwd" podUID="d42f472f-fadd-471b-be3d-cfda97a7e407" containerName="octavia-amphora-httpd" containerID="cri-o://98beb9b7c0dcf041ba2c732d9d1ab3a2cfad551891c3b7ace52c9fbfe84f9f70" gracePeriod=30 Oct 14 13:42:03.756721 master-2 kubenswrapper[4762]: I1014 13:42:03.756643 4762 generic.go:334] "Generic (PLEG): container finished" podID="d42f472f-fadd-471b-be3d-cfda97a7e407" containerID="98beb9b7c0dcf041ba2c732d9d1ab3a2cfad551891c3b7ace52c9fbfe84f9f70" exitCode=0 Oct 14 13:42:03.756721 master-2 kubenswrapper[4762]: I1014 13:42:03.756710 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-678599687f-hwhwd" event={"ID":"d42f472f-fadd-471b-be3d-cfda97a7e407","Type":"ContainerDied","Data":"98beb9b7c0dcf041ba2c732d9d1ab3a2cfad551891c3b7ace52c9fbfe84f9f70"} Oct 14 13:42:04.193492 master-2 kubenswrapper[4762]: I1014 13:42:04.193452 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-678599687f-hwhwd" Oct 14 13:42:04.266497 master-2 kubenswrapper[4762]: I1014 13:42:04.266049 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d42f472f-fadd-471b-be3d-cfda97a7e407-httpd-config\") pod \"d42f472f-fadd-471b-be3d-cfda97a7e407\" (UID: \"d42f472f-fadd-471b-be3d-cfda97a7e407\") " Oct 14 13:42:04.266497 master-2 kubenswrapper[4762]: I1014 13:42:04.266137 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/d42f472f-fadd-471b-be3d-cfda97a7e407-amphora-image\") pod \"d42f472f-fadd-471b-be3d-cfda97a7e407\" (UID: \"d42f472f-fadd-471b-be3d-cfda97a7e407\") " Oct 14 13:42:04.296317 master-2 kubenswrapper[4762]: I1014 13:42:04.296249 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d42f472f-fadd-471b-be3d-cfda97a7e407-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "d42f472f-fadd-471b-be3d-cfda97a7e407" (UID: "d42f472f-fadd-471b-be3d-cfda97a7e407"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 14 13:42:04.360380 master-2 kubenswrapper[4762]: I1014 13:42:04.360248 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d42f472f-fadd-471b-be3d-cfda97a7e407-amphora-image" (OuterVolumeSpecName: "amphora-image") pod "d42f472f-fadd-471b-be3d-cfda97a7e407" (UID: "d42f472f-fadd-471b-be3d-cfda97a7e407"). InnerVolumeSpecName "amphora-image". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:42:04.372918 master-2 kubenswrapper[4762]: I1014 13:42:04.372862 4762 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d42f472f-fadd-471b-be3d-cfda97a7e407-httpd-config\") on node \"master-2\" DevicePath \"\"" Oct 14 13:42:04.372918 master-2 kubenswrapper[4762]: I1014 13:42:04.372901 4762 reconciler_common.go:293] "Volume detached for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/d42f472f-fadd-471b-be3d-cfda97a7e407-amphora-image\") on node \"master-2\" DevicePath \"\"" Oct 14 13:42:04.770089 master-2 kubenswrapper[4762]: I1014 13:42:04.770009 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-678599687f-hwhwd" event={"ID":"d42f472f-fadd-471b-be3d-cfda97a7e407","Type":"ContainerDied","Data":"cee7e67f1de649810aca0a0f0ff3ee7acc72016b20c416d179925fe5750744aa"} Oct 14 13:42:04.770985 master-2 kubenswrapper[4762]: I1014 13:42:04.770106 4762 scope.go:117] "RemoveContainer" containerID="98beb9b7c0dcf041ba2c732d9d1ab3a2cfad551891c3b7ace52c9fbfe84f9f70" Oct 14 13:42:04.771231 master-2 kubenswrapper[4762]: I1014 13:42:04.771184 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-678599687f-hwhwd" Oct 14 13:42:04.813625 master-2 kubenswrapper[4762]: I1014 13:42:04.813358 4762 scope.go:117] "RemoveContainer" containerID="8206845d513a97ea7ea3356aafcf850050fff92b8f572dee9ac3734a67b08308" Oct 14 13:42:04.836076 master-2 kubenswrapper[4762]: I1014 13:42:04.835982 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-678599687f-hwhwd"] Oct 14 13:42:04.844312 master-2 kubenswrapper[4762]: I1014 13:42:04.843559 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-image-upload-678599687f-hwhwd"] Oct 14 13:42:05.562686 master-2 kubenswrapper[4762]: I1014 13:42:05.562598 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d42f472f-fadd-471b-be3d-cfda97a7e407" path="/var/lib/kubelet/pods/d42f472f-fadd-471b-be3d-cfda97a7e407/volumes" Oct 14 13:42:09.351259 master-2 kubenswrapper[4762]: I1014 13:42:09.351135 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-678599687f-xzrp9"] Oct 14 13:42:09.352179 master-2 kubenswrapper[4762]: E1014 13:42:09.351680 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35e45c95-9f29-4266-828c-d8cc7e37c091" containerName="init" Oct 14 13:42:09.352179 master-2 kubenswrapper[4762]: I1014 13:42:09.351701 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="35e45c95-9f29-4266-828c-d8cc7e37c091" containerName="init" Oct 14 13:42:09.352179 master-2 kubenswrapper[4762]: E1014 13:42:09.351719 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35e45c95-9f29-4266-828c-d8cc7e37c091" containerName="octavia-api" Oct 14 13:42:09.352179 master-2 kubenswrapper[4762]: I1014 13:42:09.351729 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="35e45c95-9f29-4266-828c-d8cc7e37c091" containerName="octavia-api" Oct 14 13:42:09.352179 master-2 kubenswrapper[4762]: E1014 13:42:09.351754 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d42f472f-fadd-471b-be3d-cfda97a7e407" containerName="octavia-amphora-httpd" Oct 14 13:42:09.352179 master-2 kubenswrapper[4762]: I1014 13:42:09.351763 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="d42f472f-fadd-471b-be3d-cfda97a7e407" containerName="octavia-amphora-httpd" Oct 14 13:42:09.352179 master-2 kubenswrapper[4762]: E1014 13:42:09.351775 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d42f472f-fadd-471b-be3d-cfda97a7e407" containerName="init" Oct 14 13:42:09.352179 master-2 kubenswrapper[4762]: I1014 13:42:09.351783 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="d42f472f-fadd-471b-be3d-cfda97a7e407" containerName="init" Oct 14 13:42:09.352179 master-2 kubenswrapper[4762]: E1014 13:42:09.351803 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35e45c95-9f29-4266-828c-d8cc7e37c091" containerName="octavia-api-provider-agent" Oct 14 13:42:09.352179 master-2 kubenswrapper[4762]: I1014 13:42:09.351813 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="35e45c95-9f29-4266-828c-d8cc7e37c091" containerName="octavia-api-provider-agent" Oct 14 13:42:09.352179 master-2 kubenswrapper[4762]: E1014 13:42:09.351894 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55e89c7d-f953-4fa3-95af-2abba3a06439" containerName="init" Oct 14 13:42:09.352179 master-2 kubenswrapper[4762]: I1014 13:42:09.351905 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="55e89c7d-f953-4fa3-95af-2abba3a06439" 
containerName="init" Oct 14 13:42:09.352179 master-2 kubenswrapper[4762]: E1014 13:42:09.351916 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55e89c7d-f953-4fa3-95af-2abba3a06439" containerName="octavia-db-sync" Oct 14 13:42:09.352179 master-2 kubenswrapper[4762]: I1014 13:42:09.351926 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="55e89c7d-f953-4fa3-95af-2abba3a06439" containerName="octavia-db-sync" Oct 14 13:42:09.352179 master-2 kubenswrapper[4762]: I1014 13:42:09.352120 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="35e45c95-9f29-4266-828c-d8cc7e37c091" containerName="octavia-api-provider-agent" Oct 14 13:42:09.352179 master-2 kubenswrapper[4762]: I1014 13:42:09.352134 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="35e45c95-9f29-4266-828c-d8cc7e37c091" containerName="octavia-api" Oct 14 13:42:09.352179 master-2 kubenswrapper[4762]: I1014 13:42:09.352178 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="55e89c7d-f953-4fa3-95af-2abba3a06439" containerName="octavia-db-sync" Oct 14 13:42:09.352179 master-2 kubenswrapper[4762]: I1014 13:42:09.352197 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="d42f472f-fadd-471b-be3d-cfda97a7e407" containerName="octavia-amphora-httpd" Oct 14 13:42:09.354972 master-2 kubenswrapper[4762]: I1014 13:42:09.354622 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-678599687f-xzrp9" Oct 14 13:42:09.362338 master-2 kubenswrapper[4762]: I1014 13:42:09.362268 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data" Oct 14 13:42:09.376817 master-2 kubenswrapper[4762]: I1014 13:42:09.376739 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-678599687f-xzrp9"] Oct 14 13:42:09.491269 master-2 kubenswrapper[4762]: I1014 13:42:09.491146 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/96b9f0dc-cfe9-4e14-8267-7dd3fb6e3d86-httpd-config\") pod \"octavia-image-upload-678599687f-xzrp9\" (UID: \"96b9f0dc-cfe9-4e14-8267-7dd3fb6e3d86\") " pod="openstack/octavia-image-upload-678599687f-xzrp9" Oct 14 13:42:09.491845 master-2 kubenswrapper[4762]: I1014 13:42:09.491349 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/96b9f0dc-cfe9-4e14-8267-7dd3fb6e3d86-amphora-image\") pod \"octavia-image-upload-678599687f-xzrp9\" (UID: \"96b9f0dc-cfe9-4e14-8267-7dd3fb6e3d86\") " pod="openstack/octavia-image-upload-678599687f-xzrp9" Oct 14 13:42:09.592845 master-2 kubenswrapper[4762]: I1014 13:42:09.592794 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/96b9f0dc-cfe9-4e14-8267-7dd3fb6e3d86-amphora-image\") pod \"octavia-image-upload-678599687f-xzrp9\" (UID: \"96b9f0dc-cfe9-4e14-8267-7dd3fb6e3d86\") " pod="openstack/octavia-image-upload-678599687f-xzrp9" Oct 14 13:42:09.593118 master-2 kubenswrapper[4762]: I1014 13:42:09.592933 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/96b9f0dc-cfe9-4e14-8267-7dd3fb6e3d86-httpd-config\") pod \"octavia-image-upload-678599687f-xzrp9\" (UID: \"96b9f0dc-cfe9-4e14-8267-7dd3fb6e3d86\") " 
pod="openstack/octavia-image-upload-678599687f-xzrp9" Oct 14 13:42:09.595134 master-2 kubenswrapper[4762]: I1014 13:42:09.595070 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/96b9f0dc-cfe9-4e14-8267-7dd3fb6e3d86-amphora-image\") pod \"octavia-image-upload-678599687f-xzrp9\" (UID: \"96b9f0dc-cfe9-4e14-8267-7dd3fb6e3d86\") " pod="openstack/octavia-image-upload-678599687f-xzrp9" Oct 14 13:42:09.622961 master-2 kubenswrapper[4762]: I1014 13:42:09.622841 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/96b9f0dc-cfe9-4e14-8267-7dd3fb6e3d86-httpd-config\") pod \"octavia-image-upload-678599687f-xzrp9\" (UID: \"96b9f0dc-cfe9-4e14-8267-7dd3fb6e3d86\") " pod="openstack/octavia-image-upload-678599687f-xzrp9" Oct 14 13:42:09.702181 master-2 kubenswrapper[4762]: I1014 13:42:09.702111 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-678599687f-xzrp9" Oct 14 13:42:10.193376 master-2 kubenswrapper[4762]: I1014 13:42:10.193320 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-678599687f-xzrp9"] Oct 14 13:42:10.200587 master-2 kubenswrapper[4762]: W1014 13:42:10.200537 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96b9f0dc_cfe9_4e14_8267_7dd3fb6e3d86.slice/crio-3d642cc381e6a1214e48c0a43fc0e1fcf93fe3a0209ff3b7bbeff2b87a0d0125 WatchSource:0}: Error finding container 3d642cc381e6a1214e48c0a43fc0e1fcf93fe3a0209ff3b7bbeff2b87a0d0125: Status 404 returned error can't find the container with id 3d642cc381e6a1214e48c0a43fc0e1fcf93fe3a0209ff3b7bbeff2b87a0d0125 Oct 14 13:42:10.842805 master-2 kubenswrapper[4762]: I1014 13:42:10.842712 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-678599687f-xzrp9" event={"ID":"96b9f0dc-cfe9-4e14-8267-7dd3fb6e3d86","Type":"ContainerStarted","Data":"3d642cc381e6a1214e48c0a43fc0e1fcf93fe3a0209ff3b7bbeff2b87a0d0125"} Oct 14 13:42:11.853884 master-2 kubenswrapper[4762]: I1014 13:42:11.853804 4762 generic.go:334] "Generic (PLEG): container finished" podID="96b9f0dc-cfe9-4e14-8267-7dd3fb6e3d86" containerID="d9a8650449b9b67b17bf3376a26fb576f528aacb1791a62cd38746f330a37278" exitCode=0 Oct 14 13:42:11.853884 master-2 kubenswrapper[4762]: I1014 13:42:11.853855 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-678599687f-xzrp9" event={"ID":"96b9f0dc-cfe9-4e14-8267-7dd3fb6e3d86","Type":"ContainerDied","Data":"d9a8650449b9b67b17bf3376a26fb576f528aacb1791a62cd38746f330a37278"} Oct 14 13:42:12.866053 master-2 kubenswrapper[4762]: I1014 13:42:12.866002 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-678599687f-xzrp9" event={"ID":"96b9f0dc-cfe9-4e14-8267-7dd3fb6e3d86","Type":"ContainerStarted","Data":"954cfc422334f6e6bf52c047633edd29bdebf11d715e08aab093251f24b25943"} Oct 14 13:42:12.905946 master-2 kubenswrapper[4762]: I1014 13:42:12.905809 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-678599687f-xzrp9" podStartSLOduration=3.454808762 podStartE2EDuration="3.905746332s" podCreationTimestamp="2025-10-14 13:42:09 +0000 UTC" firstStartedPulling="2025-10-14 13:42:10.20419121 +0000 UTC m=+2159.448350369" lastFinishedPulling="2025-10-14 13:42:10.65512878 +0000 UTC 
m=+2159.899287939" observedRunningTime="2025-10-14 13:42:12.89210561 +0000 UTC m=+2162.136264789" watchObservedRunningTime="2025-10-14 13:42:12.905746332 +0000 UTC m=+2162.149905521" Oct 14 13:42:23.441582 master-2 kubenswrapper[4762]: I1014 13:42:23.441517 4762 scope.go:117] "RemoveContainer" containerID="0dc728929be840cf856fc9818c135fbd54ffa7a02abc7eb031c207658c929aca" Oct 14 13:42:23.994482 master-2 kubenswrapper[4762]: I1014 13:42:23.994418 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-healthmanager-d89pd"] Oct 14 13:42:23.995871 master-2 kubenswrapper[4762]: I1014 13:42:23.995836 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-healthmanager-d89pd" Oct 14 13:42:23.999450 master-2 kubenswrapper[4762]: I1014 13:42:23.999401 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-certs-secret" Oct 14 13:42:23.999572 master-2 kubenswrapper[4762]: I1014 13:42:23.999531 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-config-data" Oct 14 13:42:24.000274 master-2 kubenswrapper[4762]: I1014 13:42:24.000244 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-scripts" Oct 14 13:42:24.016646 master-2 kubenswrapper[4762]: I1014 13:42:24.016543 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-d89pd"] Oct 14 13:42:24.111077 master-2 kubenswrapper[4762]: I1014 13:42:24.110994 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/8e890300-f642-422f-9b89-0d7387f1ea34-config-data-merged\") pod \"octavia-healthmanager-d89pd\" (UID: \"8e890300-f642-422f-9b89-0d7387f1ea34\") " pod="openstack/octavia-healthmanager-d89pd" Oct 14 13:42:24.111355 master-2 kubenswrapper[4762]: I1014 13:42:24.111091 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/8e890300-f642-422f-9b89-0d7387f1ea34-hm-ports\") pod \"octavia-healthmanager-d89pd\" (UID: \"8e890300-f642-422f-9b89-0d7387f1ea34\") " pod="openstack/octavia-healthmanager-d89pd" Oct 14 13:42:24.111355 master-2 kubenswrapper[4762]: I1014 13:42:24.111132 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e890300-f642-422f-9b89-0d7387f1ea34-scripts\") pod \"octavia-healthmanager-d89pd\" (UID: \"8e890300-f642-422f-9b89-0d7387f1ea34\") " pod="openstack/octavia-healthmanager-d89pd" Oct 14 13:42:24.111355 master-2 kubenswrapper[4762]: I1014 13:42:24.111180 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/8e890300-f642-422f-9b89-0d7387f1ea34-amphora-certs\") pod \"octavia-healthmanager-d89pd\" (UID: \"8e890300-f642-422f-9b89-0d7387f1ea34\") " pod="openstack/octavia-healthmanager-d89pd" Oct 14 13:42:24.111355 master-2 kubenswrapper[4762]: I1014 13:42:24.111199 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e890300-f642-422f-9b89-0d7387f1ea34-combined-ca-bundle\") pod \"octavia-healthmanager-d89pd\" (UID: \"8e890300-f642-422f-9b89-0d7387f1ea34\") " pod="openstack/octavia-healthmanager-d89pd" Oct 14 
13:42:24.111355 master-2 kubenswrapper[4762]: I1014 13:42:24.111242 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e890300-f642-422f-9b89-0d7387f1ea34-config-data\") pod \"octavia-healthmanager-d89pd\" (UID: \"8e890300-f642-422f-9b89-0d7387f1ea34\") " pod="openstack/octavia-healthmanager-d89pd" Oct 14 13:42:24.214483 master-2 kubenswrapper[4762]: I1014 13:42:24.213245 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/8e890300-f642-422f-9b89-0d7387f1ea34-hm-ports\") pod \"octavia-healthmanager-d89pd\" (UID: \"8e890300-f642-422f-9b89-0d7387f1ea34\") " pod="openstack/octavia-healthmanager-d89pd" Oct 14 13:42:24.214746 master-2 kubenswrapper[4762]: I1014 13:42:24.214562 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e890300-f642-422f-9b89-0d7387f1ea34-scripts\") pod \"octavia-healthmanager-d89pd\" (UID: \"8e890300-f642-422f-9b89-0d7387f1ea34\") " pod="openstack/octavia-healthmanager-d89pd" Oct 14 13:42:24.214746 master-2 kubenswrapper[4762]: I1014 13:42:24.214693 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/8e890300-f642-422f-9b89-0d7387f1ea34-amphora-certs\") pod \"octavia-healthmanager-d89pd\" (UID: \"8e890300-f642-422f-9b89-0d7387f1ea34\") " pod="openstack/octavia-healthmanager-d89pd" Oct 14 13:42:24.214843 master-2 kubenswrapper[4762]: I1014 13:42:24.214756 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e890300-f642-422f-9b89-0d7387f1ea34-combined-ca-bundle\") pod \"octavia-healthmanager-d89pd\" (UID: \"8e890300-f642-422f-9b89-0d7387f1ea34\") " pod="openstack/octavia-healthmanager-d89pd" Oct 14 13:42:24.214984 master-2 kubenswrapper[4762]: I1014 13:42:24.214911 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e890300-f642-422f-9b89-0d7387f1ea34-config-data\") pod \"octavia-healthmanager-d89pd\" (UID: \"8e890300-f642-422f-9b89-0d7387f1ea34\") " pod="openstack/octavia-healthmanager-d89pd" Oct 14 13:42:24.215182 master-2 kubenswrapper[4762]: I1014 13:42:24.214699 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/8e890300-f642-422f-9b89-0d7387f1ea34-hm-ports\") pod \"octavia-healthmanager-d89pd\" (UID: \"8e890300-f642-422f-9b89-0d7387f1ea34\") " pod="openstack/octavia-healthmanager-d89pd" Oct 14 13:42:24.215292 master-2 kubenswrapper[4762]: I1014 13:42:24.215108 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/8e890300-f642-422f-9b89-0d7387f1ea34-config-data-merged\") pod \"octavia-healthmanager-d89pd\" (UID: \"8e890300-f642-422f-9b89-0d7387f1ea34\") " pod="openstack/octavia-healthmanager-d89pd" Oct 14 13:42:24.215871 master-2 kubenswrapper[4762]: I1014 13:42:24.215812 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/8e890300-f642-422f-9b89-0d7387f1ea34-config-data-merged\") pod \"octavia-healthmanager-d89pd\" (UID: \"8e890300-f642-422f-9b89-0d7387f1ea34\") " pod="openstack/octavia-healthmanager-d89pd" Oct 14 
13:42:24.220460 master-2 kubenswrapper[4762]: I1014 13:42:24.220409 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/8e890300-f642-422f-9b89-0d7387f1ea34-amphora-certs\") pod \"octavia-healthmanager-d89pd\" (UID: \"8e890300-f642-422f-9b89-0d7387f1ea34\") " pod="openstack/octavia-healthmanager-d89pd" Oct 14 13:42:24.220722 master-2 kubenswrapper[4762]: I1014 13:42:24.220665 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e890300-f642-422f-9b89-0d7387f1ea34-config-data\") pod \"octavia-healthmanager-d89pd\" (UID: \"8e890300-f642-422f-9b89-0d7387f1ea34\") " pod="openstack/octavia-healthmanager-d89pd" Oct 14 13:42:24.221921 master-2 kubenswrapper[4762]: I1014 13:42:24.221863 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e890300-f642-422f-9b89-0d7387f1ea34-scripts\") pod \"octavia-healthmanager-d89pd\" (UID: \"8e890300-f642-422f-9b89-0d7387f1ea34\") " pod="openstack/octavia-healthmanager-d89pd" Oct 14 13:42:24.224745 master-2 kubenswrapper[4762]: I1014 13:42:24.224680 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e890300-f642-422f-9b89-0d7387f1ea34-combined-ca-bundle\") pod \"octavia-healthmanager-d89pd\" (UID: \"8e890300-f642-422f-9b89-0d7387f1ea34\") " pod="openstack/octavia-healthmanager-d89pd" Oct 14 13:42:24.321259 master-2 kubenswrapper[4762]: I1014 13:42:24.321043 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-healthmanager-d89pd" Oct 14 13:42:25.587099 master-2 kubenswrapper[4762]: W1014 13:42:25.587028 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e890300_f642_422f_9b89_0d7387f1ea34.slice/crio-d8e7bb393e5f29beef6c271a74c59d525800910b6e4cfb1670bf2a4db72b1c44 WatchSource:0}: Error finding container d8e7bb393e5f29beef6c271a74c59d525800910b6e4cfb1670bf2a4db72b1c44: Status 404 returned error can't find the container with id d8e7bb393e5f29beef6c271a74c59d525800910b6e4cfb1670bf2a4db72b1c44 Oct 14 13:42:25.587578 master-2 kubenswrapper[4762]: I1014 13:42:25.587424 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-d89pd"] Oct 14 13:42:25.737796 master-2 kubenswrapper[4762]: I1014 13:42:25.737638 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-housekeeping-n8xhp"] Oct 14 13:42:25.746344 master-2 kubenswrapper[4762]: I1014 13:42:25.746277 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-housekeeping-n8xhp" Oct 14 13:42:25.781194 master-2 kubenswrapper[4762]: W1014 13:42:25.781134 4762 reflector.go:561] object-"openstack"/"octavia-housekeeping-config-data": failed to list *v1.Secret: secrets "octavia-housekeeping-config-data" is forbidden: User "system:node:master-2" cannot list resource "secrets" in API group "" in the namespace "openstack": no relationship found between node 'master-2' and this object Oct 14 13:42:25.781535 master-2 kubenswrapper[4762]: E1014 13:42:25.781514 4762 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"octavia-housekeeping-config-data\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"octavia-housekeeping-config-data\" is forbidden: User \"system:node:master-2\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'master-2' and this object" logger="UnhandledError" Oct 14 13:42:25.781628 master-2 kubenswrapper[4762]: W1014 13:42:25.781145 4762 reflector.go:561] object-"openstack"/"octavia-housekeeping-scripts": failed to list *v1.Secret: secrets "octavia-housekeeping-scripts" is forbidden: User "system:node:master-2" cannot list resource "secrets" in API group "" in the namespace "openstack": no relationship found between node 'master-2' and this object Oct 14 13:42:25.781715 master-2 kubenswrapper[4762]: E1014 13:42:25.781698 4762 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"octavia-housekeeping-scripts\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"octavia-housekeeping-scripts\" is forbidden: User \"system:node:master-2\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'master-2' and this object" logger="UnhandledError" Oct 14 13:42:25.804574 master-2 kubenswrapper[4762]: I1014 13:42:25.802251 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-n8xhp"] Oct 14 13:42:25.857383 master-2 kubenswrapper[4762]: I1014 13:42:25.857319 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bac81500-a3af-432d-8664-466b3590f2f6-config-data\") pod \"octavia-housekeeping-n8xhp\" (UID: \"bac81500-a3af-432d-8664-466b3590f2f6\") " pod="openstack/octavia-housekeeping-n8xhp" Oct 14 13:42:25.857383 master-2 kubenswrapper[4762]: I1014 13:42:25.857380 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/bac81500-a3af-432d-8664-466b3590f2f6-config-data-merged\") pod \"octavia-housekeeping-n8xhp\" (UID: \"bac81500-a3af-432d-8664-466b3590f2f6\") " pod="openstack/octavia-housekeeping-n8xhp" Oct 14 13:42:25.857798 master-2 kubenswrapper[4762]: I1014 13:42:25.857416 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/bac81500-a3af-432d-8664-466b3590f2f6-amphora-certs\") pod \"octavia-housekeeping-n8xhp\" (UID: \"bac81500-a3af-432d-8664-466b3590f2f6\") " pod="openstack/octavia-housekeeping-n8xhp" Oct 14 13:42:25.857798 master-2 kubenswrapper[4762]: I1014 13:42:25.857471 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bac81500-a3af-432d-8664-466b3590f2f6-combined-ca-bundle\") pod \"octavia-housekeeping-n8xhp\" (UID: \"bac81500-a3af-432d-8664-466b3590f2f6\") " pod="openstack/octavia-housekeeping-n8xhp" Oct 14 13:42:25.857798 master-2 kubenswrapper[4762]: I1014 13:42:25.857491 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bac81500-a3af-432d-8664-466b3590f2f6-scripts\") pod \"octavia-housekeeping-n8xhp\" (UID: \"bac81500-a3af-432d-8664-466b3590f2f6\") " pod="openstack/octavia-housekeeping-n8xhp" Oct 14 13:42:25.857798 master-2 kubenswrapper[4762]: I1014 13:42:25.857692 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/bac81500-a3af-432d-8664-466b3590f2f6-hm-ports\") pod \"octavia-housekeeping-n8xhp\" (UID: \"bac81500-a3af-432d-8664-466b3590f2f6\") " pod="openstack/octavia-housekeeping-n8xhp" Oct 14 13:42:25.960725 master-2 kubenswrapper[4762]: I1014 13:42:25.960632 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bac81500-a3af-432d-8664-466b3590f2f6-combined-ca-bundle\") pod \"octavia-housekeeping-n8xhp\" (UID: \"bac81500-a3af-432d-8664-466b3590f2f6\") " pod="openstack/octavia-housekeeping-n8xhp" Oct 14 13:42:25.961220 master-2 kubenswrapper[4762]: I1014 13:42:25.961194 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bac81500-a3af-432d-8664-466b3590f2f6-scripts\") pod \"octavia-housekeeping-n8xhp\" (UID: \"bac81500-a3af-432d-8664-466b3590f2f6\") " pod="openstack/octavia-housekeeping-n8xhp" Oct 14 13:42:25.961415 master-2 kubenswrapper[4762]: I1014 13:42:25.961394 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/bac81500-a3af-432d-8664-466b3590f2f6-hm-ports\") pod \"octavia-housekeeping-n8xhp\" (UID: \"bac81500-a3af-432d-8664-466b3590f2f6\") " pod="openstack/octavia-housekeeping-n8xhp" Oct 14 13:42:25.961605 master-2 kubenswrapper[4762]: I1014 13:42:25.961587 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bac81500-a3af-432d-8664-466b3590f2f6-config-data\") pod \"octavia-housekeeping-n8xhp\" (UID: \"bac81500-a3af-432d-8664-466b3590f2f6\") " pod="openstack/octavia-housekeeping-n8xhp" Oct 14 13:42:25.961743 master-2 kubenswrapper[4762]: I1014 13:42:25.961729 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/bac81500-a3af-432d-8664-466b3590f2f6-config-data-merged\") pod \"octavia-housekeeping-n8xhp\" (UID: \"bac81500-a3af-432d-8664-466b3590f2f6\") " pod="openstack/octavia-housekeeping-n8xhp" Oct 14 13:42:25.961856 master-2 kubenswrapper[4762]: I1014 13:42:25.961841 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/bac81500-a3af-432d-8664-466b3590f2f6-amphora-certs\") pod \"octavia-housekeeping-n8xhp\" (UID: \"bac81500-a3af-432d-8664-466b3590f2f6\") " pod="openstack/octavia-housekeeping-n8xhp" Oct 14 13:42:25.963257 master-2 kubenswrapper[4762]: I1014 13:42:25.963203 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: 
\"kubernetes.io/configmap/bac81500-a3af-432d-8664-466b3590f2f6-hm-ports\") pod \"octavia-housekeeping-n8xhp\" (UID: \"bac81500-a3af-432d-8664-466b3590f2f6\") " pod="openstack/octavia-housekeeping-n8xhp" Oct 14 13:42:25.963500 master-2 kubenswrapper[4762]: I1014 13:42:25.963445 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/bac81500-a3af-432d-8664-466b3590f2f6-config-data-merged\") pod \"octavia-housekeeping-n8xhp\" (UID: \"bac81500-a3af-432d-8664-466b3590f2f6\") " pod="openstack/octavia-housekeeping-n8xhp" Oct 14 13:42:25.975103 master-2 kubenswrapper[4762]: I1014 13:42:25.975028 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/bac81500-a3af-432d-8664-466b3590f2f6-amphora-certs\") pod \"octavia-housekeeping-n8xhp\" (UID: \"bac81500-a3af-432d-8664-466b3590f2f6\") " pod="openstack/octavia-housekeeping-n8xhp" Oct 14 13:42:25.975103 master-2 kubenswrapper[4762]: I1014 13:42:25.975046 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bac81500-a3af-432d-8664-466b3590f2f6-combined-ca-bundle\") pod \"octavia-housekeeping-n8xhp\" (UID: \"bac81500-a3af-432d-8664-466b3590f2f6\") " pod="openstack/octavia-housekeeping-n8xhp" Oct 14 13:42:26.008981 master-2 kubenswrapper[4762]: I1014 13:42:26.008923 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-d89pd" event={"ID":"8e890300-f642-422f-9b89-0d7387f1ea34","Type":"ContainerStarted","Data":"d8e7bb393e5f29beef6c271a74c59d525800910b6e4cfb1670bf2a4db72b1c44"} Oct 14 13:42:26.887502 master-2 kubenswrapper[4762]: I1014 13:42:26.887417 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-scripts" Oct 14 13:42:26.895706 master-2 kubenswrapper[4762]: I1014 13:42:26.895637 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bac81500-a3af-432d-8664-466b3590f2f6-scripts\") pod \"octavia-housekeeping-n8xhp\" (UID: \"bac81500-a3af-432d-8664-466b3590f2f6\") " pod="openstack/octavia-housekeeping-n8xhp" Oct 14 13:42:26.961432 master-2 kubenswrapper[4762]: I1014 13:42:26.961347 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-worker-mt69p"] Oct 14 13:42:26.962846 master-2 kubenswrapper[4762]: I1014 13:42:26.962808 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-worker-mt69p" Oct 14 13:42:26.962846 master-2 kubenswrapper[4762]: E1014 13:42:26.962815 4762 secret.go:189] Couldn't get secret openstack/octavia-housekeeping-config-data: failed to sync secret cache: timed out waiting for the condition Oct 14 13:42:26.962981 master-2 kubenswrapper[4762]: E1014 13:42:26.962926 4762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bac81500-a3af-432d-8664-466b3590f2f6-config-data podName:bac81500-a3af-432d-8664-466b3590f2f6 nodeName:}" failed. No retries permitted until 2025-10-14 13:42:27.462896838 +0000 UTC m=+2176.707055997 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/bac81500-a3af-432d-8664-466b3590f2f6-config-data") pod "octavia-housekeeping-n8xhp" (UID: "bac81500-a3af-432d-8664-466b3590f2f6") : failed to sync secret cache: timed out waiting for the condition Oct 14 13:42:26.967641 master-2 kubenswrapper[4762]: I1014 13:42:26.967570 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-config-data" Oct 14 13:42:26.967641 master-2 kubenswrapper[4762]: I1014 13:42:26.967616 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-scripts" Oct 14 13:42:26.997450 master-2 kubenswrapper[4762]: I1014 13:42:26.997322 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-mt69p"] Oct 14 13:42:27.029810 master-2 kubenswrapper[4762]: I1014 13:42:27.029762 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-d89pd" event={"ID":"8e890300-f642-422f-9b89-0d7387f1ea34","Type":"ContainerStarted","Data":"0debe1793f4cbc0f1b78826190d1460be84e71f336bdaf574a23670560ffa0b9"} Oct 14 13:42:27.085837 master-2 kubenswrapper[4762]: I1014 13:42:27.085767 4762 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-config-data" Oct 14 13:42:27.089954 master-2 kubenswrapper[4762]: I1014 13:42:27.089900 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/083824da-e520-4858-86be-e2ffe26c0610-combined-ca-bundle\") pod \"octavia-worker-mt69p\" (UID: \"083824da-e520-4858-86be-e2ffe26c0610\") " pod="openstack/octavia-worker-mt69p" Oct 14 13:42:27.090139 master-2 kubenswrapper[4762]: I1014 13:42:27.090104 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/083824da-e520-4858-86be-e2ffe26c0610-scripts\") pod \"octavia-worker-mt69p\" (UID: \"083824da-e520-4858-86be-e2ffe26c0610\") " pod="openstack/octavia-worker-mt69p" Oct 14 13:42:27.090222 master-2 kubenswrapper[4762]: I1014 13:42:27.090172 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/083824da-e520-4858-86be-e2ffe26c0610-amphora-certs\") pod \"octavia-worker-mt69p\" (UID: \"083824da-e520-4858-86be-e2ffe26c0610\") " pod="openstack/octavia-worker-mt69p" Oct 14 13:42:27.090271 master-2 kubenswrapper[4762]: I1014 13:42:27.090221 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/083824da-e520-4858-86be-e2ffe26c0610-config-data-merged\") pod \"octavia-worker-mt69p\" (UID: \"083824da-e520-4858-86be-e2ffe26c0610\") " pod="openstack/octavia-worker-mt69p" Oct 14 13:42:27.090378 master-2 kubenswrapper[4762]: I1014 13:42:27.090315 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/083824da-e520-4858-86be-e2ffe26c0610-hm-ports\") pod \"octavia-worker-mt69p\" (UID: \"083824da-e520-4858-86be-e2ffe26c0610\") " pod="openstack/octavia-worker-mt69p" Oct 14 13:42:27.090481 master-2 kubenswrapper[4762]: I1014 13:42:27.090444 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/083824da-e520-4858-86be-e2ffe26c0610-config-data\") pod \"octavia-worker-mt69p\" (UID: \"083824da-e520-4858-86be-e2ffe26c0610\") " pod="openstack/octavia-worker-mt69p" Oct 14 13:42:27.193797 master-2 kubenswrapper[4762]: I1014 13:42:27.193670 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/083824da-e520-4858-86be-e2ffe26c0610-combined-ca-bundle\") pod \"octavia-worker-mt69p\" (UID: \"083824da-e520-4858-86be-e2ffe26c0610\") " pod="openstack/octavia-worker-mt69p" Oct 14 13:42:27.194734 master-2 kubenswrapper[4762]: I1014 13:42:27.194669 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/083824da-e520-4858-86be-e2ffe26c0610-scripts\") pod \"octavia-worker-mt69p\" (UID: \"083824da-e520-4858-86be-e2ffe26c0610\") " pod="openstack/octavia-worker-mt69p" Oct 14 13:42:27.195552 master-2 kubenswrapper[4762]: I1014 13:42:27.195337 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/083824da-e520-4858-86be-e2ffe26c0610-amphora-certs\") pod \"octavia-worker-mt69p\" (UID: \"083824da-e520-4858-86be-e2ffe26c0610\") " pod="openstack/octavia-worker-mt69p" Oct 14 13:42:27.195552 master-2 kubenswrapper[4762]: I1014 13:42:27.195416 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/083824da-e520-4858-86be-e2ffe26c0610-config-data-merged\") pod \"octavia-worker-mt69p\" (UID: \"083824da-e520-4858-86be-e2ffe26c0610\") " pod="openstack/octavia-worker-mt69p" Oct 14 13:42:27.195552 master-2 kubenswrapper[4762]: I1014 13:42:27.195445 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/083824da-e520-4858-86be-e2ffe26c0610-hm-ports\") pod \"octavia-worker-mt69p\" (UID: \"083824da-e520-4858-86be-e2ffe26c0610\") " pod="openstack/octavia-worker-mt69p" Oct 14 13:42:27.195552 master-2 kubenswrapper[4762]: I1014 13:42:27.195505 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/083824da-e520-4858-86be-e2ffe26c0610-config-data\") pod \"octavia-worker-mt69p\" (UID: \"083824da-e520-4858-86be-e2ffe26c0610\") " pod="openstack/octavia-worker-mt69p" Oct 14 13:42:27.196290 master-2 kubenswrapper[4762]: I1014 13:42:27.196224 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/083824da-e520-4858-86be-e2ffe26c0610-config-data-merged\") pod \"octavia-worker-mt69p\" (UID: \"083824da-e520-4858-86be-e2ffe26c0610\") " pod="openstack/octavia-worker-mt69p" Oct 14 13:42:27.198474 master-2 kubenswrapper[4762]: I1014 13:42:27.198418 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/083824da-e520-4858-86be-e2ffe26c0610-scripts\") pod \"octavia-worker-mt69p\" (UID: \"083824da-e520-4858-86be-e2ffe26c0610\") " pod="openstack/octavia-worker-mt69p" Oct 14 13:42:27.198605 master-2 kubenswrapper[4762]: I1014 13:42:27.198480 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/083824da-e520-4858-86be-e2ffe26c0610-hm-ports\") pod \"octavia-worker-mt69p\" (UID: \"083824da-e520-4858-86be-e2ffe26c0610\") " 
pod="openstack/octavia-worker-mt69p" Oct 14 13:42:27.200491 master-2 kubenswrapper[4762]: I1014 13:42:27.200462 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/083824da-e520-4858-86be-e2ffe26c0610-combined-ca-bundle\") pod \"octavia-worker-mt69p\" (UID: \"083824da-e520-4858-86be-e2ffe26c0610\") " pod="openstack/octavia-worker-mt69p" Oct 14 13:42:27.200966 master-2 kubenswrapper[4762]: I1014 13:42:27.200942 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/083824da-e520-4858-86be-e2ffe26c0610-config-data\") pod \"octavia-worker-mt69p\" (UID: \"083824da-e520-4858-86be-e2ffe26c0610\") " pod="openstack/octavia-worker-mt69p" Oct 14 13:42:27.206701 master-2 kubenswrapper[4762]: I1014 13:42:27.206621 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/083824da-e520-4858-86be-e2ffe26c0610-amphora-certs\") pod \"octavia-worker-mt69p\" (UID: \"083824da-e520-4858-86be-e2ffe26c0610\") " pod="openstack/octavia-worker-mt69p" Oct 14 13:42:27.292113 master-2 kubenswrapper[4762]: I1014 13:42:27.292021 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-worker-mt69p" Oct 14 13:42:27.508825 master-2 kubenswrapper[4762]: I1014 13:42:27.508764 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bac81500-a3af-432d-8664-466b3590f2f6-config-data\") pod \"octavia-housekeeping-n8xhp\" (UID: \"bac81500-a3af-432d-8664-466b3590f2f6\") " pod="openstack/octavia-housekeeping-n8xhp" Oct 14 13:42:27.512723 master-2 kubenswrapper[4762]: I1014 13:42:27.512681 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bac81500-a3af-432d-8664-466b3590f2f6-config-data\") pod \"octavia-housekeeping-n8xhp\" (UID: \"bac81500-a3af-432d-8664-466b3590f2f6\") " pod="openstack/octavia-housekeeping-n8xhp" Oct 14 13:42:27.612044 master-2 kubenswrapper[4762]: I1014 13:42:27.611958 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-housekeeping-n8xhp" Oct 14 13:42:28.043174 master-2 kubenswrapper[4762]: I1014 13:42:28.042690 4762 generic.go:334] "Generic (PLEG): container finished" podID="8e890300-f642-422f-9b89-0d7387f1ea34" containerID="0debe1793f4cbc0f1b78826190d1460be84e71f336bdaf574a23670560ffa0b9" exitCode=0 Oct 14 13:42:28.043174 master-2 kubenswrapper[4762]: I1014 13:42:28.042746 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-d89pd" event={"ID":"8e890300-f642-422f-9b89-0d7387f1ea34","Type":"ContainerDied","Data":"0debe1793f4cbc0f1b78826190d1460be84e71f336bdaf574a23670560ffa0b9"} Oct 14 13:42:28.189137 master-2 kubenswrapper[4762]: I1014 13:42:28.189095 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-n8xhp"] Oct 14 13:42:28.935362 master-2 kubenswrapper[4762]: I1014 13:42:28.935286 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-mt69p"] Oct 14 13:42:28.935507 master-2 kubenswrapper[4762]: W1014 13:42:28.935428 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod083824da_e520_4858_86be_e2ffe26c0610.slice/crio-dd1f0567348e51a184e24e521e67221a6d89060b86692b45e3f66698eece5482 WatchSource:0}: Error finding container dd1f0567348e51a184e24e521e67221a6d89060b86692b45e3f66698eece5482: Status 404 returned error can't find the container with id dd1f0567348e51a184e24e521e67221a6d89060b86692b45e3f66698eece5482 Oct 14 13:42:29.063906 master-2 kubenswrapper[4762]: I1014 13:42:29.063838 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-mt69p" event={"ID":"083824da-e520-4858-86be-e2ffe26c0610","Type":"ContainerStarted","Data":"dd1f0567348e51a184e24e521e67221a6d89060b86692b45e3f66698eece5482"} Oct 14 13:42:29.065876 master-2 kubenswrapper[4762]: I1014 13:42:29.065832 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-n8xhp" event={"ID":"bac81500-a3af-432d-8664-466b3590f2f6","Type":"ContainerStarted","Data":"f68413090949d6edc8b03a2e481b3d67e18cdf53b0e9857d78757a1457e8d316"} Oct 14 13:42:29.069105 master-2 kubenswrapper[4762]: I1014 13:42:29.069034 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-d89pd" event={"ID":"8e890300-f642-422f-9b89-0d7387f1ea34","Type":"ContainerStarted","Data":"9ac9eec2a179be8a81ec03560685c740977dd34f1e22ab9fb0112bac6599828f"} Oct 14 13:42:29.069474 master-2 kubenswrapper[4762]: I1014 13:42:29.069449 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-healthmanager-d89pd" Oct 14 13:42:29.107300 master-2 kubenswrapper[4762]: I1014 13:42:29.107103 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-healthmanager-d89pd" podStartSLOduration=6.107070584 podStartE2EDuration="6.107070584s" podCreationTimestamp="2025-10-14 13:42:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 13:42:29.097771339 +0000 UTC m=+2178.341930508" watchObservedRunningTime="2025-10-14 13:42:29.107070584 +0000 UTC m=+2178.351229763" Oct 14 13:42:31.092435 master-2 kubenswrapper[4762]: I1014 13:42:31.092245 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-n8xhp" 
event={"ID":"bac81500-a3af-432d-8664-466b3590f2f6","Type":"ContainerStarted","Data":"4d510a87b03a2726b57410cd370658af8e949b1ec874710cb9c1d161ca913a64"} Oct 14 13:42:32.114705 master-2 kubenswrapper[4762]: I1014 13:42:32.114610 4762 generic.go:334] "Generic (PLEG): container finished" podID="bac81500-a3af-432d-8664-466b3590f2f6" containerID="4d510a87b03a2726b57410cd370658af8e949b1ec874710cb9c1d161ca913a64" exitCode=0 Oct 14 13:42:32.115549 master-2 kubenswrapper[4762]: I1014 13:42:32.114733 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-n8xhp" event={"ID":"bac81500-a3af-432d-8664-466b3590f2f6","Type":"ContainerDied","Data":"4d510a87b03a2726b57410cd370658af8e949b1ec874710cb9c1d161ca913a64"} Oct 14 13:42:32.119756 master-2 kubenswrapper[4762]: I1014 13:42:32.119700 4762 generic.go:334] "Generic (PLEG): container finished" podID="083824da-e520-4858-86be-e2ffe26c0610" containerID="e03ece7b90016e440473aa396029197f18e1b3de3b7e9fd76e45de5cdd5aae6a" exitCode=0 Oct 14 13:42:32.119871 master-2 kubenswrapper[4762]: I1014 13:42:32.119755 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-mt69p" event={"ID":"083824da-e520-4858-86be-e2ffe26c0610","Type":"ContainerDied","Data":"e03ece7b90016e440473aa396029197f18e1b3de3b7e9fd76e45de5cdd5aae6a"} Oct 14 13:42:33.132486 master-2 kubenswrapper[4762]: I1014 13:42:33.132395 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-mt69p" event={"ID":"083824da-e520-4858-86be-e2ffe26c0610","Type":"ContainerStarted","Data":"5d241a97a4cdf207259e730b19fcee733c58e97bc65a7f4c0f7e3b4a56974bb9"} Oct 14 13:42:33.133000 master-2 kubenswrapper[4762]: I1014 13:42:33.132580 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-worker-mt69p" Oct 14 13:42:33.134004 master-2 kubenswrapper[4762]: I1014 13:42:33.133976 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-n8xhp" event={"ID":"bac81500-a3af-432d-8664-466b3590f2f6","Type":"ContainerStarted","Data":"83ef8a6e755be79d89bfbb51b11647038f01e1cf7bdfa81aa26851b3423749d0"} Oct 14 13:42:33.134193 master-2 kubenswrapper[4762]: I1014 13:42:33.134172 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-housekeeping-n8xhp" Oct 14 13:42:33.165842 master-2 kubenswrapper[4762]: I1014 13:42:33.165751 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-worker-mt69p" podStartSLOduration=5.37296194 podStartE2EDuration="7.165728173s" podCreationTimestamp="2025-10-14 13:42:26 +0000 UTC" firstStartedPulling="2025-10-14 13:42:28.938484118 +0000 UTC m=+2178.182643277" lastFinishedPulling="2025-10-14 13:42:30.731250351 +0000 UTC m=+2179.975409510" observedRunningTime="2025-10-14 13:42:33.158882326 +0000 UTC m=+2182.403041515" watchObservedRunningTime="2025-10-14 13:42:33.165728173 +0000 UTC m=+2182.409887332" Oct 14 13:42:33.209531 master-2 kubenswrapper[4762]: I1014 13:42:33.209427 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-housekeeping-n8xhp" podStartSLOduration=6.497973706 podStartE2EDuration="8.209399158s" podCreationTimestamp="2025-10-14 13:42:25 +0000 UTC" firstStartedPulling="2025-10-14 13:42:28.200396752 +0000 UTC m=+2177.444555911" lastFinishedPulling="2025-10-14 13:42:29.911822194 +0000 UTC m=+2179.155981363" observedRunningTime="2025-10-14 13:42:33.205458764 +0000 UTC m=+2182.449617923" watchObservedRunningTime="2025-10-14 
13:42:33.209399158 +0000 UTC m=+2182.453558317" Oct 14 13:42:39.378019 master-2 kubenswrapper[4762]: I1014 13:42:39.377965 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-healthmanager-d89pd" Oct 14 13:42:42.347397 master-2 kubenswrapper[4762]: I1014 13:42:42.347314 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-worker-mt69p" Oct 14 13:42:42.657101 master-2 kubenswrapper[4762]: I1014 13:42:42.656310 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-housekeeping-n8xhp" Oct 14 13:43:19.410718 master-2 kubenswrapper[4762]: I1014 13:43:19.409727 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-g6dqm"] Oct 14 13:43:19.412911 master-2 kubenswrapper[4762]: I1014 13:43:19.412852 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g6dqm" Oct 14 13:43:19.417580 master-2 kubenswrapper[4762]: I1014 13:43:19.417507 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g6dqm"] Oct 14 13:43:19.560712 master-2 kubenswrapper[4762]: I1014 13:43:19.560654 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50e02999-7959-40a8-883c-4f5aafab78b4-utilities\") pod \"community-operators-g6dqm\" (UID: \"50e02999-7959-40a8-883c-4f5aafab78b4\") " pod="openshift-marketplace/community-operators-g6dqm" Oct 14 13:43:19.561715 master-2 kubenswrapper[4762]: I1014 13:43:19.561665 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wsfl\" (UniqueName: \"kubernetes.io/projected/50e02999-7959-40a8-883c-4f5aafab78b4-kube-api-access-6wsfl\") pod \"community-operators-g6dqm\" (UID: \"50e02999-7959-40a8-883c-4f5aafab78b4\") " pod="openshift-marketplace/community-operators-g6dqm" Oct 14 13:43:19.562588 master-2 kubenswrapper[4762]: I1014 13:43:19.562335 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50e02999-7959-40a8-883c-4f5aafab78b4-catalog-content\") pod \"community-operators-g6dqm\" (UID: \"50e02999-7959-40a8-883c-4f5aafab78b4\") " pod="openshift-marketplace/community-operators-g6dqm" Oct 14 13:43:19.664867 master-2 kubenswrapper[4762]: I1014 13:43:19.664696 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50e02999-7959-40a8-883c-4f5aafab78b4-catalog-content\") pod \"community-operators-g6dqm\" (UID: \"50e02999-7959-40a8-883c-4f5aafab78b4\") " pod="openshift-marketplace/community-operators-g6dqm" Oct 14 13:43:19.664867 master-2 kubenswrapper[4762]: I1014 13:43:19.664754 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50e02999-7959-40a8-883c-4f5aafab78b4-utilities\") pod \"community-operators-g6dqm\" (UID: \"50e02999-7959-40a8-883c-4f5aafab78b4\") " pod="openshift-marketplace/community-operators-g6dqm" Oct 14 13:43:19.665316 master-2 kubenswrapper[4762]: I1014 13:43:19.664887 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wsfl\" (UniqueName: 
\"kubernetes.io/projected/50e02999-7959-40a8-883c-4f5aafab78b4-kube-api-access-6wsfl\") pod \"community-operators-g6dqm\" (UID: \"50e02999-7959-40a8-883c-4f5aafab78b4\") " pod="openshift-marketplace/community-operators-g6dqm" Oct 14 13:43:19.665790 master-2 kubenswrapper[4762]: I1014 13:43:19.665698 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50e02999-7959-40a8-883c-4f5aafab78b4-catalog-content\") pod \"community-operators-g6dqm\" (UID: \"50e02999-7959-40a8-883c-4f5aafab78b4\") " pod="openshift-marketplace/community-operators-g6dqm" Oct 14 13:43:19.665954 master-2 kubenswrapper[4762]: I1014 13:43:19.665918 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50e02999-7959-40a8-883c-4f5aafab78b4-utilities\") pod \"community-operators-g6dqm\" (UID: \"50e02999-7959-40a8-883c-4f5aafab78b4\") " pod="openshift-marketplace/community-operators-g6dqm" Oct 14 13:43:19.698115 master-2 kubenswrapper[4762]: I1014 13:43:19.698051 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wsfl\" (UniqueName: \"kubernetes.io/projected/50e02999-7959-40a8-883c-4f5aafab78b4-kube-api-access-6wsfl\") pod \"community-operators-g6dqm\" (UID: \"50e02999-7959-40a8-883c-4f5aafab78b4\") " pod="openshift-marketplace/community-operators-g6dqm" Oct 14 13:43:19.749743 master-2 kubenswrapper[4762]: I1014 13:43:19.749685 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g6dqm" Oct 14 13:43:20.257463 master-2 kubenswrapper[4762]: I1014 13:43:20.257422 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g6dqm"] Oct 14 13:43:20.265194 master-2 kubenswrapper[4762]: W1014 13:43:20.265131 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50e02999_7959_40a8_883c_4f5aafab78b4.slice/crio-a1797d70e75a1fbd6870531590d90d5432fbb8d001215a22bf8a6aee96802a37 WatchSource:0}: Error finding container a1797d70e75a1fbd6870531590d90d5432fbb8d001215a22bf8a6aee96802a37: Status 404 returned error can't find the container with id a1797d70e75a1fbd6870531590d90d5432fbb8d001215a22bf8a6aee96802a37 Oct 14 13:43:20.683589 master-2 kubenswrapper[4762]: I1014 13:43:20.683456 4762 generic.go:334] "Generic (PLEG): container finished" podID="50e02999-7959-40a8-883c-4f5aafab78b4" containerID="b57b3ec9c87f72b26b69dcc6867f5bb1fe17f26f3c05e1c6cea57f0cf7ca898e" exitCode=0 Oct 14 13:43:20.683589 master-2 kubenswrapper[4762]: I1014 13:43:20.683513 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g6dqm" event={"ID":"50e02999-7959-40a8-883c-4f5aafab78b4","Type":"ContainerDied","Data":"b57b3ec9c87f72b26b69dcc6867f5bb1fe17f26f3c05e1c6cea57f0cf7ca898e"} Oct 14 13:43:20.683589 master-2 kubenswrapper[4762]: I1014 13:43:20.683560 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g6dqm" event={"ID":"50e02999-7959-40a8-883c-4f5aafab78b4","Type":"ContainerStarted","Data":"a1797d70e75a1fbd6870531590d90d5432fbb8d001215a22bf8a6aee96802a37"} Oct 14 13:43:21.587725 master-2 kubenswrapper[4762]: I1014 13:43:21.587523 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9llqb"] Oct 14 13:43:21.589662 master-2 kubenswrapper[4762]: I1014 
13:43:21.589615 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9llqb"] Oct 14 13:43:21.589778 master-2 kubenswrapper[4762]: I1014 13:43:21.589716 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9llqb" Oct 14 13:43:21.612675 master-2 kubenswrapper[4762]: I1014 13:43:21.612602 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77599cfc-c7cc-4510-ba4c-2e320aa60aa3-catalog-content\") pod \"redhat-marketplace-9llqb\" (UID: \"77599cfc-c7cc-4510-ba4c-2e320aa60aa3\") " pod="openshift-marketplace/redhat-marketplace-9llqb" Oct 14 13:43:21.613094 master-2 kubenswrapper[4762]: I1014 13:43:21.612816 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77599cfc-c7cc-4510-ba4c-2e320aa60aa3-utilities\") pod \"redhat-marketplace-9llqb\" (UID: \"77599cfc-c7cc-4510-ba4c-2e320aa60aa3\") " pod="openshift-marketplace/redhat-marketplace-9llqb" Oct 14 13:43:21.613138 master-2 kubenswrapper[4762]: I1014 13:43:21.613096 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vb7lc\" (UniqueName: \"kubernetes.io/projected/77599cfc-c7cc-4510-ba4c-2e320aa60aa3-kube-api-access-vb7lc\") pod \"redhat-marketplace-9llqb\" (UID: \"77599cfc-c7cc-4510-ba4c-2e320aa60aa3\") " pod="openshift-marketplace/redhat-marketplace-9llqb" Oct 14 13:43:21.695651 master-2 kubenswrapper[4762]: I1014 13:43:21.694755 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g6dqm" event={"ID":"50e02999-7959-40a8-883c-4f5aafab78b4","Type":"ContainerStarted","Data":"c39da47420989a12be6f95f6e9c309ba698d21f681a1b5278c194151ca921c3f"} Oct 14 13:43:21.715064 master-2 kubenswrapper[4762]: I1014 13:43:21.714930 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77599cfc-c7cc-4510-ba4c-2e320aa60aa3-catalog-content\") pod \"redhat-marketplace-9llqb\" (UID: \"77599cfc-c7cc-4510-ba4c-2e320aa60aa3\") " pod="openshift-marketplace/redhat-marketplace-9llqb" Oct 14 13:43:21.715064 master-2 kubenswrapper[4762]: I1014 13:43:21.714986 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77599cfc-c7cc-4510-ba4c-2e320aa60aa3-utilities\") pod \"redhat-marketplace-9llqb\" (UID: \"77599cfc-c7cc-4510-ba4c-2e320aa60aa3\") " pod="openshift-marketplace/redhat-marketplace-9llqb" Oct 14 13:43:21.715409 master-2 kubenswrapper[4762]: I1014 13:43:21.715089 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vb7lc\" (UniqueName: \"kubernetes.io/projected/77599cfc-c7cc-4510-ba4c-2e320aa60aa3-kube-api-access-vb7lc\") pod \"redhat-marketplace-9llqb\" (UID: \"77599cfc-c7cc-4510-ba4c-2e320aa60aa3\") " pod="openshift-marketplace/redhat-marketplace-9llqb" Oct 14 13:43:21.715929 master-2 kubenswrapper[4762]: I1014 13:43:21.715871 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77599cfc-c7cc-4510-ba4c-2e320aa60aa3-catalog-content\") pod \"redhat-marketplace-9llqb\" (UID: \"77599cfc-c7cc-4510-ba4c-2e320aa60aa3\") " 
pod="openshift-marketplace/redhat-marketplace-9llqb" Oct 14 13:43:21.715929 master-2 kubenswrapper[4762]: I1014 13:43:21.715897 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77599cfc-c7cc-4510-ba4c-2e320aa60aa3-utilities\") pod \"redhat-marketplace-9llqb\" (UID: \"77599cfc-c7cc-4510-ba4c-2e320aa60aa3\") " pod="openshift-marketplace/redhat-marketplace-9llqb" Oct 14 13:43:21.771820 master-2 kubenswrapper[4762]: I1014 13:43:21.771763 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vb7lc\" (UniqueName: \"kubernetes.io/projected/77599cfc-c7cc-4510-ba4c-2e320aa60aa3-kube-api-access-vb7lc\") pod \"redhat-marketplace-9llqb\" (UID: \"77599cfc-c7cc-4510-ba4c-2e320aa60aa3\") " pod="openshift-marketplace/redhat-marketplace-9llqb" Oct 14 13:43:21.934852 master-2 kubenswrapper[4762]: I1014 13:43:21.934768 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9llqb" Oct 14 13:43:22.361105 master-2 kubenswrapper[4762]: I1014 13:43:22.361033 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9llqb"] Oct 14 13:43:22.368901 master-2 kubenswrapper[4762]: W1014 13:43:22.368827 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77599cfc_c7cc_4510_ba4c_2e320aa60aa3.slice/crio-1f49b3fc06257536bb2ef62af2d55d42620fffd49c8ef6523e0661d3dca7f8b6 WatchSource:0}: Error finding container 1f49b3fc06257536bb2ef62af2d55d42620fffd49c8ef6523e0661d3dca7f8b6: Status 404 returned error can't find the container with id 1f49b3fc06257536bb2ef62af2d55d42620fffd49c8ef6523e0661d3dca7f8b6 Oct 14 13:43:22.712858 master-2 kubenswrapper[4762]: I1014 13:43:22.712775 4762 generic.go:334] "Generic (PLEG): container finished" podID="50e02999-7959-40a8-883c-4f5aafab78b4" containerID="c39da47420989a12be6f95f6e9c309ba698d21f681a1b5278c194151ca921c3f" exitCode=0 Oct 14 13:43:22.713544 master-2 kubenswrapper[4762]: I1014 13:43:22.712927 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g6dqm" event={"ID":"50e02999-7959-40a8-883c-4f5aafab78b4","Type":"ContainerDied","Data":"c39da47420989a12be6f95f6e9c309ba698d21f681a1b5278c194151ca921c3f"} Oct 14 13:43:22.715504 master-2 kubenswrapper[4762]: I1014 13:43:22.715461 4762 generic.go:334] "Generic (PLEG): container finished" podID="77599cfc-c7cc-4510-ba4c-2e320aa60aa3" containerID="3d313c22c152235845abacd304322748636432d08ea29cfb1378bb6aa9b75829" exitCode=0 Oct 14 13:43:22.715564 master-2 kubenswrapper[4762]: I1014 13:43:22.715500 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9llqb" event={"ID":"77599cfc-c7cc-4510-ba4c-2e320aa60aa3","Type":"ContainerDied","Data":"3d313c22c152235845abacd304322748636432d08ea29cfb1378bb6aa9b75829"} Oct 14 13:43:22.715564 master-2 kubenswrapper[4762]: I1014 13:43:22.715532 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9llqb" event={"ID":"77599cfc-c7cc-4510-ba4c-2e320aa60aa3","Type":"ContainerStarted","Data":"1f49b3fc06257536bb2ef62af2d55d42620fffd49c8ef6523e0661d3dca7f8b6"} Oct 14 13:43:23.739737 master-2 kubenswrapper[4762]: I1014 13:43:23.739687 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g6dqm" 
event={"ID":"50e02999-7959-40a8-883c-4f5aafab78b4","Type":"ContainerStarted","Data":"2ad44017b519b6d3c7a6af8ac954c23b45e212c4491374f5cd7ee1c935ac58a5"} Oct 14 13:43:23.775808 master-2 kubenswrapper[4762]: I1014 13:43:23.775730 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-g6dqm" podStartSLOduration=2.283024574 podStartE2EDuration="4.775708184s" podCreationTimestamp="2025-10-14 13:43:19 +0000 UTC" firstStartedPulling="2025-10-14 13:43:20.685736844 +0000 UTC m=+2229.929896043" lastFinishedPulling="2025-10-14 13:43:23.178420494 +0000 UTC m=+2232.422579653" observedRunningTime="2025-10-14 13:43:23.770023814 +0000 UTC m=+2233.014182983" watchObservedRunningTime="2025-10-14 13:43:23.775708184 +0000 UTC m=+2233.019867343" Oct 14 13:43:24.753604 master-2 kubenswrapper[4762]: I1014 13:43:24.753519 4762 generic.go:334] "Generic (PLEG): container finished" podID="77599cfc-c7cc-4510-ba4c-2e320aa60aa3" containerID="f95a827c09a314c4656af48ac5076d4101ac127a639d9da7a975a7ed5585fa11" exitCode=0 Oct 14 13:43:24.754449 master-2 kubenswrapper[4762]: I1014 13:43:24.753607 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9llqb" event={"ID":"77599cfc-c7cc-4510-ba4c-2e320aa60aa3","Type":"ContainerDied","Data":"f95a827c09a314c4656af48ac5076d4101ac127a639d9da7a975a7ed5585fa11"} Oct 14 13:43:25.765391 master-2 kubenswrapper[4762]: I1014 13:43:25.765065 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9llqb" event={"ID":"77599cfc-c7cc-4510-ba4c-2e320aa60aa3","Type":"ContainerStarted","Data":"b8cb35eed9e97620c20a61ba5bd390e62c317b8707b31555a7bf24b358624080"} Oct 14 13:43:25.793274 master-2 kubenswrapper[4762]: I1014 13:43:25.793143 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9llqb" podStartSLOduration=2.290048343 podStartE2EDuration="4.793117632s" podCreationTimestamp="2025-10-14 13:43:21 +0000 UTC" firstStartedPulling="2025-10-14 13:43:22.717411053 +0000 UTC m=+2231.961570252" lastFinishedPulling="2025-10-14 13:43:25.220480382 +0000 UTC m=+2234.464639541" observedRunningTime="2025-10-14 13:43:25.792299726 +0000 UTC m=+2235.036458895" watchObservedRunningTime="2025-10-14 13:43:25.793117632 +0000 UTC m=+2235.037276831" Oct 14 13:43:29.750708 master-2 kubenswrapper[4762]: I1014 13:43:29.750629 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-g6dqm" Oct 14 13:43:29.750708 master-2 kubenswrapper[4762]: I1014 13:43:29.750722 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-g6dqm" Oct 14 13:43:29.799795 master-2 kubenswrapper[4762]: I1014 13:43:29.799732 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-g6dqm" Oct 14 13:43:29.865900 master-2 kubenswrapper[4762]: I1014 13:43:29.865826 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-g6dqm" Oct 14 13:43:30.066962 master-2 kubenswrapper[4762]: I1014 13:43:30.066811 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g6dqm"] Oct 14 13:43:31.838079 master-2 kubenswrapper[4762]: I1014 13:43:31.837996 4762 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/community-operators-g6dqm" podUID="50e02999-7959-40a8-883c-4f5aafab78b4" containerName="registry-server" containerID="cri-o://2ad44017b519b6d3c7a6af8ac954c23b45e212c4491374f5cd7ee1c935ac58a5" gracePeriod=2 Oct 14 13:43:31.936030 master-2 kubenswrapper[4762]: I1014 13:43:31.935936 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9llqb" Oct 14 13:43:31.936030 master-2 kubenswrapper[4762]: I1014 13:43:31.936006 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9llqb" Oct 14 13:43:31.993242 master-2 kubenswrapper[4762]: I1014 13:43:31.992794 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9llqb" Oct 14 13:43:32.850047 master-2 kubenswrapper[4762]: I1014 13:43:32.849893 4762 generic.go:334] "Generic (PLEG): container finished" podID="50e02999-7959-40a8-883c-4f5aafab78b4" containerID="2ad44017b519b6d3c7a6af8ac954c23b45e212c4491374f5cd7ee1c935ac58a5" exitCode=0 Oct 14 13:43:32.850047 master-2 kubenswrapper[4762]: I1014 13:43:32.849994 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g6dqm" event={"ID":"50e02999-7959-40a8-883c-4f5aafab78b4","Type":"ContainerDied","Data":"2ad44017b519b6d3c7a6af8ac954c23b45e212c4491374f5cd7ee1c935ac58a5"} Oct 14 13:43:32.901013 master-2 kubenswrapper[4762]: I1014 13:43:32.900960 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9llqb" Oct 14 13:43:33.036242 master-2 kubenswrapper[4762]: I1014 13:43:33.036129 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g6dqm" Oct 14 13:43:33.192252 master-2 kubenswrapper[4762]: I1014 13:43:33.192093 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50e02999-7959-40a8-883c-4f5aafab78b4-catalog-content\") pod \"50e02999-7959-40a8-883c-4f5aafab78b4\" (UID: \"50e02999-7959-40a8-883c-4f5aafab78b4\") " Oct 14 13:43:33.192252 master-2 kubenswrapper[4762]: I1014 13:43:33.192214 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wsfl\" (UniqueName: \"kubernetes.io/projected/50e02999-7959-40a8-883c-4f5aafab78b4-kube-api-access-6wsfl\") pod \"50e02999-7959-40a8-883c-4f5aafab78b4\" (UID: \"50e02999-7959-40a8-883c-4f5aafab78b4\") " Oct 14 13:43:33.192676 master-2 kubenswrapper[4762]: I1014 13:43:33.192377 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50e02999-7959-40a8-883c-4f5aafab78b4-utilities\") pod \"50e02999-7959-40a8-883c-4f5aafab78b4\" (UID: \"50e02999-7959-40a8-883c-4f5aafab78b4\") " Oct 14 13:43:33.193738 master-2 kubenswrapper[4762]: I1014 13:43:33.193671 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50e02999-7959-40a8-883c-4f5aafab78b4-utilities" (OuterVolumeSpecName: "utilities") pod "50e02999-7959-40a8-883c-4f5aafab78b4" (UID: "50e02999-7959-40a8-883c-4f5aafab78b4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:43:33.197422 master-2 kubenswrapper[4762]: I1014 13:43:33.197342 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50e02999-7959-40a8-883c-4f5aafab78b4-kube-api-access-6wsfl" (OuterVolumeSpecName: "kube-api-access-6wsfl") pod "50e02999-7959-40a8-883c-4f5aafab78b4" (UID: "50e02999-7959-40a8-883c-4f5aafab78b4"). InnerVolumeSpecName "kube-api-access-6wsfl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:43:33.240614 master-2 kubenswrapper[4762]: I1014 13:43:33.239510 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50e02999-7959-40a8-883c-4f5aafab78b4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "50e02999-7959-40a8-883c-4f5aafab78b4" (UID: "50e02999-7959-40a8-883c-4f5aafab78b4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:43:33.294312 master-2 kubenswrapper[4762]: I1014 13:43:33.294241 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50e02999-7959-40a8-883c-4f5aafab78b4-utilities\") on node \"master-2\" DevicePath \"\"" Oct 14 13:43:33.294312 master-2 kubenswrapper[4762]: I1014 13:43:33.294291 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50e02999-7959-40a8-883c-4f5aafab78b4-catalog-content\") on node \"master-2\" DevicePath \"\"" Oct 14 13:43:33.294312 master-2 kubenswrapper[4762]: I1014 13:43:33.294309 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wsfl\" (UniqueName: \"kubernetes.io/projected/50e02999-7959-40a8-883c-4f5aafab78b4-kube-api-access-6wsfl\") on node \"master-2\" DevicePath \"\"" Oct 14 13:43:33.663995 master-2 kubenswrapper[4762]: I1014 13:43:33.663917 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9llqb"] Oct 14 13:43:33.873947 master-2 kubenswrapper[4762]: I1014 13:43:33.873780 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g6dqm" event={"ID":"50e02999-7959-40a8-883c-4f5aafab78b4","Type":"ContainerDied","Data":"a1797d70e75a1fbd6870531590d90d5432fbb8d001215a22bf8a6aee96802a37"} Oct 14 13:43:33.873947 master-2 kubenswrapper[4762]: I1014 13:43:33.873831 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-g6dqm" Oct 14 13:43:33.873947 master-2 kubenswrapper[4762]: I1014 13:43:33.873901 4762 scope.go:117] "RemoveContainer" containerID="2ad44017b519b6d3c7a6af8ac954c23b45e212c4491374f5cd7ee1c935ac58a5" Oct 14 13:43:33.910189 master-2 kubenswrapper[4762]: I1014 13:43:33.910133 4762 scope.go:117] "RemoveContainer" containerID="c39da47420989a12be6f95f6e9c309ba698d21f681a1b5278c194151ca921c3f" Oct 14 13:43:33.912803 master-2 kubenswrapper[4762]: I1014 13:43:33.912699 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g6dqm"] Oct 14 13:43:33.930963 master-2 kubenswrapper[4762]: I1014 13:43:33.930895 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-g6dqm"] Oct 14 13:43:33.946002 master-2 kubenswrapper[4762]: I1014 13:43:33.945873 4762 scope.go:117] "RemoveContainer" containerID="b57b3ec9c87f72b26b69dcc6867f5bb1fe17f26f3c05e1c6cea57f0cf7ca898e" Oct 14 13:43:34.884771 master-2 kubenswrapper[4762]: I1014 13:43:34.884675 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9llqb" podUID="77599cfc-c7cc-4510-ba4c-2e320aa60aa3" containerName="registry-server" containerID="cri-o://b8cb35eed9e97620c20a61ba5bd390e62c317b8707b31555a7bf24b358624080" gracePeriod=2 Oct 14 13:43:35.404880 master-2 kubenswrapper[4762]: I1014 13:43:35.404785 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9llqb" Oct 14 13:43:35.545485 master-2 kubenswrapper[4762]: I1014 13:43:35.545445 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vb7lc\" (UniqueName: \"kubernetes.io/projected/77599cfc-c7cc-4510-ba4c-2e320aa60aa3-kube-api-access-vb7lc\") pod \"77599cfc-c7cc-4510-ba4c-2e320aa60aa3\" (UID: \"77599cfc-c7cc-4510-ba4c-2e320aa60aa3\") " Oct 14 13:43:35.546035 master-2 kubenswrapper[4762]: I1014 13:43:35.546014 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77599cfc-c7cc-4510-ba4c-2e320aa60aa3-utilities\") pod \"77599cfc-c7cc-4510-ba4c-2e320aa60aa3\" (UID: \"77599cfc-c7cc-4510-ba4c-2e320aa60aa3\") " Oct 14 13:43:35.546248 master-2 kubenswrapper[4762]: I1014 13:43:35.546230 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77599cfc-c7cc-4510-ba4c-2e320aa60aa3-catalog-content\") pod \"77599cfc-c7cc-4510-ba4c-2e320aa60aa3\" (UID: \"77599cfc-c7cc-4510-ba4c-2e320aa60aa3\") " Oct 14 13:43:35.547234 master-2 kubenswrapper[4762]: I1014 13:43:35.547183 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77599cfc-c7cc-4510-ba4c-2e320aa60aa3-utilities" (OuterVolumeSpecName: "utilities") pod "77599cfc-c7cc-4510-ba4c-2e320aa60aa3" (UID: "77599cfc-c7cc-4510-ba4c-2e320aa60aa3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:43:35.548668 master-2 kubenswrapper[4762]: I1014 13:43:35.548639 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77599cfc-c7cc-4510-ba4c-2e320aa60aa3-kube-api-access-vb7lc" (OuterVolumeSpecName: "kube-api-access-vb7lc") pod "77599cfc-c7cc-4510-ba4c-2e320aa60aa3" (UID: "77599cfc-c7cc-4510-ba4c-2e320aa60aa3"). 
InnerVolumeSpecName "kube-api-access-vb7lc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:43:35.560473 master-2 kubenswrapper[4762]: I1014 13:43:35.560419 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77599cfc-c7cc-4510-ba4c-2e320aa60aa3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "77599cfc-c7cc-4510-ba4c-2e320aa60aa3" (UID: "77599cfc-c7cc-4510-ba4c-2e320aa60aa3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:43:35.563912 master-2 kubenswrapper[4762]: I1014 13:43:35.563835 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50e02999-7959-40a8-883c-4f5aafab78b4" path="/var/lib/kubelet/pods/50e02999-7959-40a8-883c-4f5aafab78b4/volumes" Oct 14 13:43:35.648821 master-2 kubenswrapper[4762]: I1014 13:43:35.648747 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77599cfc-c7cc-4510-ba4c-2e320aa60aa3-catalog-content\") on node \"master-2\" DevicePath \"\"" Oct 14 13:43:35.648821 master-2 kubenswrapper[4762]: I1014 13:43:35.648793 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vb7lc\" (UniqueName: \"kubernetes.io/projected/77599cfc-c7cc-4510-ba4c-2e320aa60aa3-kube-api-access-vb7lc\") on node \"master-2\" DevicePath \"\"" Oct 14 13:43:35.648821 master-2 kubenswrapper[4762]: I1014 13:43:35.648805 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77599cfc-c7cc-4510-ba4c-2e320aa60aa3-utilities\") on node \"master-2\" DevicePath \"\"" Oct 14 13:43:35.897475 master-2 kubenswrapper[4762]: I1014 13:43:35.897299 4762 generic.go:334] "Generic (PLEG): container finished" podID="77599cfc-c7cc-4510-ba4c-2e320aa60aa3" containerID="b8cb35eed9e97620c20a61ba5bd390e62c317b8707b31555a7bf24b358624080" exitCode=0 Oct 14 13:43:35.897475 master-2 kubenswrapper[4762]: I1014 13:43:35.897355 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9llqb" event={"ID":"77599cfc-c7cc-4510-ba4c-2e320aa60aa3","Type":"ContainerDied","Data":"b8cb35eed9e97620c20a61ba5bd390e62c317b8707b31555a7bf24b358624080"} Oct 14 13:43:35.897475 master-2 kubenswrapper[4762]: I1014 13:43:35.897383 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9llqb" event={"ID":"77599cfc-c7cc-4510-ba4c-2e320aa60aa3","Type":"ContainerDied","Data":"1f49b3fc06257536bb2ef62af2d55d42620fffd49c8ef6523e0661d3dca7f8b6"} Oct 14 13:43:35.897475 master-2 kubenswrapper[4762]: I1014 13:43:35.897400 4762 scope.go:117] "RemoveContainer" containerID="b8cb35eed9e97620c20a61ba5bd390e62c317b8707b31555a7bf24b358624080" Oct 14 13:43:35.898513 master-2 kubenswrapper[4762]: I1014 13:43:35.897546 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9llqb" Oct 14 13:43:35.916884 master-2 kubenswrapper[4762]: I1014 13:43:35.916838 4762 scope.go:117] "RemoveContainer" containerID="f95a827c09a314c4656af48ac5076d4101ac127a639d9da7a975a7ed5585fa11" Oct 14 13:43:35.932852 master-2 kubenswrapper[4762]: I1014 13:43:35.932801 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9llqb"] Oct 14 13:43:35.937714 master-2 kubenswrapper[4762]: I1014 13:43:35.937678 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9llqb"] Oct 14 13:43:35.938799 master-2 kubenswrapper[4762]: I1014 13:43:35.938768 4762 scope.go:117] "RemoveContainer" containerID="3d313c22c152235845abacd304322748636432d08ea29cfb1378bb6aa9b75829" Oct 14 13:43:35.969694 master-2 kubenswrapper[4762]: I1014 13:43:35.969644 4762 scope.go:117] "RemoveContainer" containerID="b8cb35eed9e97620c20a61ba5bd390e62c317b8707b31555a7bf24b358624080" Oct 14 13:43:35.970182 master-2 kubenswrapper[4762]: E1014 13:43:35.970132 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8cb35eed9e97620c20a61ba5bd390e62c317b8707b31555a7bf24b358624080\": container with ID starting with b8cb35eed9e97620c20a61ba5bd390e62c317b8707b31555a7bf24b358624080 not found: ID does not exist" containerID="b8cb35eed9e97620c20a61ba5bd390e62c317b8707b31555a7bf24b358624080" Oct 14 13:43:35.970256 master-2 kubenswrapper[4762]: I1014 13:43:35.970198 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8cb35eed9e97620c20a61ba5bd390e62c317b8707b31555a7bf24b358624080"} err="failed to get container status \"b8cb35eed9e97620c20a61ba5bd390e62c317b8707b31555a7bf24b358624080\": rpc error: code = NotFound desc = could not find container \"b8cb35eed9e97620c20a61ba5bd390e62c317b8707b31555a7bf24b358624080\": container with ID starting with b8cb35eed9e97620c20a61ba5bd390e62c317b8707b31555a7bf24b358624080 not found: ID does not exist" Oct 14 13:43:35.970256 master-2 kubenswrapper[4762]: I1014 13:43:35.970229 4762 scope.go:117] "RemoveContainer" containerID="f95a827c09a314c4656af48ac5076d4101ac127a639d9da7a975a7ed5585fa11" Oct 14 13:43:35.970712 master-2 kubenswrapper[4762]: E1014 13:43:35.970674 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f95a827c09a314c4656af48ac5076d4101ac127a639d9da7a975a7ed5585fa11\": container with ID starting with f95a827c09a314c4656af48ac5076d4101ac127a639d9da7a975a7ed5585fa11 not found: ID does not exist" containerID="f95a827c09a314c4656af48ac5076d4101ac127a639d9da7a975a7ed5585fa11" Oct 14 13:43:35.970775 master-2 kubenswrapper[4762]: I1014 13:43:35.970710 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f95a827c09a314c4656af48ac5076d4101ac127a639d9da7a975a7ed5585fa11"} err="failed to get container status \"f95a827c09a314c4656af48ac5076d4101ac127a639d9da7a975a7ed5585fa11\": rpc error: code = NotFound desc = could not find container \"f95a827c09a314c4656af48ac5076d4101ac127a639d9da7a975a7ed5585fa11\": container with ID starting with f95a827c09a314c4656af48ac5076d4101ac127a639d9da7a975a7ed5585fa11 not found: ID does not exist" Oct 14 13:43:35.970775 master-2 kubenswrapper[4762]: I1014 13:43:35.970735 4762 scope.go:117] "RemoveContainer" 
containerID="3d313c22c152235845abacd304322748636432d08ea29cfb1378bb6aa9b75829" Oct 14 13:43:35.971036 master-2 kubenswrapper[4762]: E1014 13:43:35.971009 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d313c22c152235845abacd304322748636432d08ea29cfb1378bb6aa9b75829\": container with ID starting with 3d313c22c152235845abacd304322748636432d08ea29cfb1378bb6aa9b75829 not found: ID does not exist" containerID="3d313c22c152235845abacd304322748636432d08ea29cfb1378bb6aa9b75829" Oct 14 13:43:35.971090 master-2 kubenswrapper[4762]: I1014 13:43:35.971031 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d313c22c152235845abacd304322748636432d08ea29cfb1378bb6aa9b75829"} err="failed to get container status \"3d313c22c152235845abacd304322748636432d08ea29cfb1378bb6aa9b75829\": rpc error: code = NotFound desc = could not find container \"3d313c22c152235845abacd304322748636432d08ea29cfb1378bb6aa9b75829\": container with ID starting with 3d313c22c152235845abacd304322748636432d08ea29cfb1378bb6aa9b75829 not found: ID does not exist" Oct 14 13:43:37.575145 master-2 kubenswrapper[4762]: I1014 13:43:37.575060 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77599cfc-c7cc-4510-ba4c-2e320aa60aa3" path="/var/lib/kubelet/pods/77599cfc-c7cc-4510-ba4c-2e320aa60aa3/volumes" Oct 14 13:44:27.569435 master-2 kubenswrapper[4762]: I1014 13:44:27.569355 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6kmvg"] Oct 14 13:44:27.570226 master-2 kubenswrapper[4762]: E1014 13:44:27.569808 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77599cfc-c7cc-4510-ba4c-2e320aa60aa3" containerName="registry-server" Oct 14 13:44:27.570226 master-2 kubenswrapper[4762]: I1014 13:44:27.569831 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="77599cfc-c7cc-4510-ba4c-2e320aa60aa3" containerName="registry-server" Oct 14 13:44:27.570226 master-2 kubenswrapper[4762]: E1014 13:44:27.569851 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77599cfc-c7cc-4510-ba4c-2e320aa60aa3" containerName="extract-utilities" Oct 14 13:44:27.570226 master-2 kubenswrapper[4762]: I1014 13:44:27.569863 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="77599cfc-c7cc-4510-ba4c-2e320aa60aa3" containerName="extract-utilities" Oct 14 13:44:27.570226 master-2 kubenswrapper[4762]: E1014 13:44:27.569890 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50e02999-7959-40a8-883c-4f5aafab78b4" containerName="registry-server" Oct 14 13:44:27.570226 master-2 kubenswrapper[4762]: I1014 13:44:27.569901 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="50e02999-7959-40a8-883c-4f5aafab78b4" containerName="registry-server" Oct 14 13:44:27.570226 master-2 kubenswrapper[4762]: E1014 13:44:27.569919 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50e02999-7959-40a8-883c-4f5aafab78b4" containerName="extract-content" Oct 14 13:44:27.570226 master-2 kubenswrapper[4762]: I1014 13:44:27.569929 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="50e02999-7959-40a8-883c-4f5aafab78b4" containerName="extract-content" Oct 14 13:44:27.570226 master-2 kubenswrapper[4762]: E1014 13:44:27.569957 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77599cfc-c7cc-4510-ba4c-2e320aa60aa3" containerName="extract-content" Oct 14 13:44:27.570226 master-2 
kubenswrapper[4762]: I1014 13:44:27.569969 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="77599cfc-c7cc-4510-ba4c-2e320aa60aa3" containerName="extract-content" Oct 14 13:44:27.570226 master-2 kubenswrapper[4762]: E1014 13:44:27.569987 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50e02999-7959-40a8-883c-4f5aafab78b4" containerName="extract-utilities" Oct 14 13:44:27.570226 master-2 kubenswrapper[4762]: I1014 13:44:27.569998 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="50e02999-7959-40a8-883c-4f5aafab78b4" containerName="extract-utilities" Oct 14 13:44:27.570908 master-2 kubenswrapper[4762]: I1014 13:44:27.570284 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="50e02999-7959-40a8-883c-4f5aafab78b4" containerName="registry-server" Oct 14 13:44:27.570908 master-2 kubenswrapper[4762]: I1014 13:44:27.570318 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="77599cfc-c7cc-4510-ba4c-2e320aa60aa3" containerName="registry-server" Oct 14 13:44:27.572353 master-2 kubenswrapper[4762]: I1014 13:44:27.572304 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6kmvg" Oct 14 13:44:27.572525 master-2 kubenswrapper[4762]: I1014 13:44:27.572458 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6kmvg"] Oct 14 13:44:27.685334 master-2 kubenswrapper[4762]: I1014 13:44:27.684438 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41c21720-993e-4921-805b-a70e1f132a8f-catalog-content\") pod \"redhat-operators-6kmvg\" (UID: \"41c21720-993e-4921-805b-a70e1f132a8f\") " pod="openshift-marketplace/redhat-operators-6kmvg" Oct 14 13:44:27.685334 master-2 kubenswrapper[4762]: I1014 13:44:27.684627 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41c21720-993e-4921-805b-a70e1f132a8f-utilities\") pod \"redhat-operators-6kmvg\" (UID: \"41c21720-993e-4921-805b-a70e1f132a8f\") " pod="openshift-marketplace/redhat-operators-6kmvg" Oct 14 13:44:27.685334 master-2 kubenswrapper[4762]: I1014 13:44:27.684673 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpmxh\" (UniqueName: \"kubernetes.io/projected/41c21720-993e-4921-805b-a70e1f132a8f-kube-api-access-lpmxh\") pod \"redhat-operators-6kmvg\" (UID: \"41c21720-993e-4921-805b-a70e1f132a8f\") " pod="openshift-marketplace/redhat-operators-6kmvg" Oct 14 13:44:27.787575 master-2 kubenswrapper[4762]: I1014 13:44:27.786909 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41c21720-993e-4921-805b-a70e1f132a8f-utilities\") pod \"redhat-operators-6kmvg\" (UID: \"41c21720-993e-4921-805b-a70e1f132a8f\") " pod="openshift-marketplace/redhat-operators-6kmvg" Oct 14 13:44:27.787575 master-2 kubenswrapper[4762]: I1014 13:44:27.787056 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpmxh\" (UniqueName: \"kubernetes.io/projected/41c21720-993e-4921-805b-a70e1f132a8f-kube-api-access-lpmxh\") pod \"redhat-operators-6kmvg\" (UID: \"41c21720-993e-4921-805b-a70e1f132a8f\") " pod="openshift-marketplace/redhat-operators-6kmvg" Oct 14 13:44:27.787575 master-2 kubenswrapper[4762]: I1014 
13:44:27.787274 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41c21720-993e-4921-805b-a70e1f132a8f-catalog-content\") pod \"redhat-operators-6kmvg\" (UID: \"41c21720-993e-4921-805b-a70e1f132a8f\") " pod="openshift-marketplace/redhat-operators-6kmvg" Oct 14 13:44:27.787575 master-2 kubenswrapper[4762]: I1014 13:44:27.787502 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41c21720-993e-4921-805b-a70e1f132a8f-utilities\") pod \"redhat-operators-6kmvg\" (UID: \"41c21720-993e-4921-805b-a70e1f132a8f\") " pod="openshift-marketplace/redhat-operators-6kmvg" Oct 14 13:44:27.787945 master-2 kubenswrapper[4762]: I1014 13:44:27.787754 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41c21720-993e-4921-805b-a70e1f132a8f-catalog-content\") pod \"redhat-operators-6kmvg\" (UID: \"41c21720-993e-4921-805b-a70e1f132a8f\") " pod="openshift-marketplace/redhat-operators-6kmvg" Oct 14 13:44:27.812219 master-2 kubenswrapper[4762]: I1014 13:44:27.812132 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpmxh\" (UniqueName: \"kubernetes.io/projected/41c21720-993e-4921-805b-a70e1f132a8f-kube-api-access-lpmxh\") pod \"redhat-operators-6kmvg\" (UID: \"41c21720-993e-4921-805b-a70e1f132a8f\") " pod="openshift-marketplace/redhat-operators-6kmvg" Oct 14 13:44:27.906355 master-2 kubenswrapper[4762]: I1014 13:44:27.906122 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6kmvg" Oct 14 13:44:28.330609 master-2 kubenswrapper[4762]: I1014 13:44:28.330551 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6kmvg"] Oct 14 13:44:28.473741 master-2 kubenswrapper[4762]: I1014 13:44:28.473675 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6kmvg" event={"ID":"41c21720-993e-4921-805b-a70e1f132a8f","Type":"ContainerStarted","Data":"6185f7ed98a68cb02796143d2a1905347927ebf8113df2639a99f5eb90b95ba4"} Oct 14 13:44:29.487002 master-2 kubenswrapper[4762]: I1014 13:44:29.486548 4762 generic.go:334] "Generic (PLEG): container finished" podID="41c21720-993e-4921-805b-a70e1f132a8f" containerID="a566e15be95aa14d79d3e5c078a061e7e38813273f33b5373f83bc3de61cc604" exitCode=0 Oct 14 13:44:29.487002 master-2 kubenswrapper[4762]: I1014 13:44:29.486601 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6kmvg" event={"ID":"41c21720-993e-4921-805b-a70e1f132a8f","Type":"ContainerDied","Data":"a566e15be95aa14d79d3e5c078a061e7e38813273f33b5373f83bc3de61cc604"} Oct 14 13:44:29.488831 master-2 kubenswrapper[4762]: I1014 13:44:29.488130 4762 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 14 13:44:30.496214 master-2 kubenswrapper[4762]: I1014 13:44:30.495954 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6kmvg" event={"ID":"41c21720-993e-4921-805b-a70e1f132a8f","Type":"ContainerStarted","Data":"4ba898230c3d55f60da0daa7bbb01a7847853c8a367177cc051d38ae7ab67ef7"} Oct 14 13:44:31.517514 master-2 kubenswrapper[4762]: I1014 13:44:31.517405 4762 generic.go:334] "Generic (PLEG): container finished" podID="41c21720-993e-4921-805b-a70e1f132a8f" 
containerID="4ba898230c3d55f60da0daa7bbb01a7847853c8a367177cc051d38ae7ab67ef7" exitCode=0 Oct 14 13:44:31.517514 master-2 kubenswrapper[4762]: I1014 13:44:31.517442 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6kmvg" event={"ID":"41c21720-993e-4921-805b-a70e1f132a8f","Type":"ContainerDied","Data":"4ba898230c3d55f60da0daa7bbb01a7847853c8a367177cc051d38ae7ab67ef7"} Oct 14 13:44:32.529596 master-2 kubenswrapper[4762]: I1014 13:44:32.529524 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6kmvg" event={"ID":"41c21720-993e-4921-805b-a70e1f132a8f","Type":"ContainerStarted","Data":"932bb592e93e31ce1756ffb5d9a4773bd978394378ab85af8afc71711c3dfadd"} Oct 14 13:44:32.557095 master-2 kubenswrapper[4762]: I1014 13:44:32.556975 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6kmvg" podStartSLOduration=2.7163627 podStartE2EDuration="5.556931033s" podCreationTimestamp="2025-10-14 13:44:27 +0000 UTC" firstStartedPulling="2025-10-14 13:44:29.488088108 +0000 UTC m=+2298.732247267" lastFinishedPulling="2025-10-14 13:44:32.328656401 +0000 UTC m=+2301.572815600" observedRunningTime="2025-10-14 13:44:32.556888301 +0000 UTC m=+2301.801047470" watchObservedRunningTime="2025-10-14 13:44:32.556931033 +0000 UTC m=+2301.801090192" Oct 14 13:44:37.907474 master-2 kubenswrapper[4762]: I1014 13:44:37.907409 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6kmvg" Oct 14 13:44:37.907474 master-2 kubenswrapper[4762]: I1014 13:44:37.907475 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6kmvg" Oct 14 13:44:37.974913 master-2 kubenswrapper[4762]: I1014 13:44:37.974796 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6kmvg" Oct 14 13:44:38.687677 master-2 kubenswrapper[4762]: I1014 13:44:38.687605 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6kmvg" Oct 14 13:44:38.773781 master-2 kubenswrapper[4762]: I1014 13:44:38.773683 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6kmvg"] Oct 14 13:44:40.626106 master-2 kubenswrapper[4762]: I1014 13:44:40.626017 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6kmvg" podUID="41c21720-993e-4921-805b-a70e1f132a8f" containerName="registry-server" containerID="cri-o://932bb592e93e31ce1756ffb5d9a4773bd978394378ab85af8afc71711c3dfadd" gracePeriod=2 Oct 14 13:44:41.224903 master-2 kubenswrapper[4762]: I1014 13:44:41.224863 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6kmvg" Oct 14 13:44:41.302738 master-2 kubenswrapper[4762]: I1014 13:44:41.302627 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41c21720-993e-4921-805b-a70e1f132a8f-catalog-content\") pod \"41c21720-993e-4921-805b-a70e1f132a8f\" (UID: \"41c21720-993e-4921-805b-a70e1f132a8f\") " Oct 14 13:44:41.302738 master-2 kubenswrapper[4762]: I1014 13:44:41.302731 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpmxh\" (UniqueName: \"kubernetes.io/projected/41c21720-993e-4921-805b-a70e1f132a8f-kube-api-access-lpmxh\") pod \"41c21720-993e-4921-805b-a70e1f132a8f\" (UID: \"41c21720-993e-4921-805b-a70e1f132a8f\") " Oct 14 13:44:41.303381 master-2 kubenswrapper[4762]: I1014 13:44:41.302786 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41c21720-993e-4921-805b-a70e1f132a8f-utilities\") pod \"41c21720-993e-4921-805b-a70e1f132a8f\" (UID: \"41c21720-993e-4921-805b-a70e1f132a8f\") " Oct 14 13:44:41.304809 master-2 kubenswrapper[4762]: I1014 13:44:41.304715 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41c21720-993e-4921-805b-a70e1f132a8f-utilities" (OuterVolumeSpecName: "utilities") pod "41c21720-993e-4921-805b-a70e1f132a8f" (UID: "41c21720-993e-4921-805b-a70e1f132a8f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:44:41.306565 master-2 kubenswrapper[4762]: I1014 13:44:41.306454 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41c21720-993e-4921-805b-a70e1f132a8f-kube-api-access-lpmxh" (OuterVolumeSpecName: "kube-api-access-lpmxh") pod "41c21720-993e-4921-805b-a70e1f132a8f" (UID: "41c21720-993e-4921-805b-a70e1f132a8f"). InnerVolumeSpecName "kube-api-access-lpmxh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:44:41.407380 master-2 kubenswrapper[4762]: I1014 13:44:41.407231 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpmxh\" (UniqueName: \"kubernetes.io/projected/41c21720-993e-4921-805b-a70e1f132a8f-kube-api-access-lpmxh\") on node \"master-2\" DevicePath \"\"" Oct 14 13:44:41.407380 master-2 kubenswrapper[4762]: I1014 13:44:41.407287 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41c21720-993e-4921-805b-a70e1f132a8f-utilities\") on node \"master-2\" DevicePath \"\"" Oct 14 13:44:41.642193 master-2 kubenswrapper[4762]: I1014 13:44:41.641928 4762 generic.go:334] "Generic (PLEG): container finished" podID="41c21720-993e-4921-805b-a70e1f132a8f" containerID="932bb592e93e31ce1756ffb5d9a4773bd978394378ab85af8afc71711c3dfadd" exitCode=0 Oct 14 13:44:41.642193 master-2 kubenswrapper[4762]: I1014 13:44:41.642000 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6kmvg" event={"ID":"41c21720-993e-4921-805b-a70e1f132a8f","Type":"ContainerDied","Data":"932bb592e93e31ce1756ffb5d9a4773bd978394378ab85af8afc71711c3dfadd"} Oct 14 13:44:41.642193 master-2 kubenswrapper[4762]: I1014 13:44:41.642040 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6kmvg" event={"ID":"41c21720-993e-4921-805b-a70e1f132a8f","Type":"ContainerDied","Data":"6185f7ed98a68cb02796143d2a1905347927ebf8113df2639a99f5eb90b95ba4"} Oct 14 13:44:41.642193 master-2 kubenswrapper[4762]: I1014 13:44:41.642070 4762 scope.go:117] "RemoveContainer" containerID="932bb592e93e31ce1756ffb5d9a4773bd978394378ab85af8afc71711c3dfadd" Oct 14 13:44:41.642193 master-2 kubenswrapper[4762]: I1014 13:44:41.642094 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6kmvg" Oct 14 13:44:41.667727 master-2 kubenswrapper[4762]: I1014 13:44:41.667679 4762 scope.go:117] "RemoveContainer" containerID="4ba898230c3d55f60da0daa7bbb01a7847853c8a367177cc051d38ae7ab67ef7" Oct 14 13:44:41.693459 master-2 kubenswrapper[4762]: I1014 13:44:41.693393 4762 scope.go:117] "RemoveContainer" containerID="a566e15be95aa14d79d3e5c078a061e7e38813273f33b5373f83bc3de61cc604" Oct 14 13:44:41.727499 master-2 kubenswrapper[4762]: I1014 13:44:41.727380 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41c21720-993e-4921-805b-a70e1f132a8f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "41c21720-993e-4921-805b-a70e1f132a8f" (UID: "41c21720-993e-4921-805b-a70e1f132a8f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:44:41.756673 master-2 kubenswrapper[4762]: I1014 13:44:41.756613 4762 scope.go:117] "RemoveContainer" containerID="932bb592e93e31ce1756ffb5d9a4773bd978394378ab85af8afc71711c3dfadd" Oct 14 13:44:41.757487 master-2 kubenswrapper[4762]: E1014 13:44:41.757321 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"932bb592e93e31ce1756ffb5d9a4773bd978394378ab85af8afc71711c3dfadd\": container with ID starting with 932bb592e93e31ce1756ffb5d9a4773bd978394378ab85af8afc71711c3dfadd not found: ID does not exist" containerID="932bb592e93e31ce1756ffb5d9a4773bd978394378ab85af8afc71711c3dfadd" Oct 14 13:44:41.757487 master-2 kubenswrapper[4762]: I1014 13:44:41.757357 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"932bb592e93e31ce1756ffb5d9a4773bd978394378ab85af8afc71711c3dfadd"} err="failed to get container status \"932bb592e93e31ce1756ffb5d9a4773bd978394378ab85af8afc71711c3dfadd\": rpc error: code = NotFound desc = could not find container \"932bb592e93e31ce1756ffb5d9a4773bd978394378ab85af8afc71711c3dfadd\": container with ID starting with 932bb592e93e31ce1756ffb5d9a4773bd978394378ab85af8afc71711c3dfadd not found: ID does not exist" Oct 14 13:44:41.757487 master-2 kubenswrapper[4762]: I1014 13:44:41.757380 4762 scope.go:117] "RemoveContainer" containerID="4ba898230c3d55f60da0daa7bbb01a7847853c8a367177cc051d38ae7ab67ef7" Oct 14 13:44:41.757771 master-2 kubenswrapper[4762]: E1014 13:44:41.757743 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ba898230c3d55f60da0daa7bbb01a7847853c8a367177cc051d38ae7ab67ef7\": container with ID starting with 4ba898230c3d55f60da0daa7bbb01a7847853c8a367177cc051d38ae7ab67ef7 not found: ID does not exist" containerID="4ba898230c3d55f60da0daa7bbb01a7847853c8a367177cc051d38ae7ab67ef7" Oct 14 13:44:41.757818 master-2 kubenswrapper[4762]: I1014 13:44:41.757769 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ba898230c3d55f60da0daa7bbb01a7847853c8a367177cc051d38ae7ab67ef7"} err="failed to get container status \"4ba898230c3d55f60da0daa7bbb01a7847853c8a367177cc051d38ae7ab67ef7\": rpc error: code = NotFound desc = could not find container \"4ba898230c3d55f60da0daa7bbb01a7847853c8a367177cc051d38ae7ab67ef7\": container with ID starting with 4ba898230c3d55f60da0daa7bbb01a7847853c8a367177cc051d38ae7ab67ef7 not found: ID does not exist" Oct 14 13:44:41.757818 master-2 kubenswrapper[4762]: I1014 13:44:41.757787 4762 scope.go:117] "RemoveContainer" containerID="a566e15be95aa14d79d3e5c078a061e7e38813273f33b5373f83bc3de61cc604" Oct 14 13:44:41.758208 master-2 kubenswrapper[4762]: E1014 13:44:41.758170 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a566e15be95aa14d79d3e5c078a061e7e38813273f33b5373f83bc3de61cc604\": container with ID starting with a566e15be95aa14d79d3e5c078a061e7e38813273f33b5373f83bc3de61cc604 not found: ID does not exist" containerID="a566e15be95aa14d79d3e5c078a061e7e38813273f33b5373f83bc3de61cc604" Oct 14 13:44:41.758275 master-2 kubenswrapper[4762]: I1014 13:44:41.758219 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a566e15be95aa14d79d3e5c078a061e7e38813273f33b5373f83bc3de61cc604"} err="failed to get container status 
\"a566e15be95aa14d79d3e5c078a061e7e38813273f33b5373f83bc3de61cc604\": rpc error: code = NotFound desc = could not find container \"a566e15be95aa14d79d3e5c078a061e7e38813273f33b5373f83bc3de61cc604\": container with ID starting with a566e15be95aa14d79d3e5c078a061e7e38813273f33b5373f83bc3de61cc604 not found: ID does not exist" Oct 14 13:44:41.815409 master-2 kubenswrapper[4762]: I1014 13:44:41.815375 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41c21720-993e-4921-805b-a70e1f132a8f-catalog-content\") on node \"master-2\" DevicePath \"\"" Oct 14 13:44:41.992416 master-2 kubenswrapper[4762]: I1014 13:44:41.992330 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6kmvg"] Oct 14 13:44:42.000838 master-2 kubenswrapper[4762]: I1014 13:44:42.000767 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6kmvg"] Oct 14 13:44:43.561199 master-2 kubenswrapper[4762]: I1014 13:44:43.561134 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41c21720-993e-4921-805b-a70e1f132a8f" path="/var/lib/kubelet/pods/41c21720-993e-4921-805b-a70e1f132a8f/volumes" Oct 14 13:45:23.646747 master-2 kubenswrapper[4762]: I1014 13:45:23.646643 4762 scope.go:117] "RemoveContainer" containerID="8516f98c675a5c32a65d2bd44f9940e17737ff631c3fde28736f4f99a9b0c785" Oct 14 13:45:23.667719 master-2 kubenswrapper[4762]: I1014 13:45:23.667666 4762 scope.go:117] "RemoveContainer" containerID="8c74b9cb4e81e06c7bc3e87e417dff14d054f3fa9608dd3c961dcd984f6606d4" Oct 14 13:45:23.700537 master-2 kubenswrapper[4762]: I1014 13:45:23.692966 4762 scope.go:117] "RemoveContainer" containerID="dbdb877b21f62762542d2f5e2033d37870ecfba58c2ff369e4a2ffb0757e4cbd" Oct 14 13:45:23.716553 master-2 kubenswrapper[4762]: I1014 13:45:23.716512 4762 scope.go:117] "RemoveContainer" containerID="40c06e905ae4f03fbd29a5fd8e06fbeab9259713185bad5de2e65956d5103423" Oct 14 13:46:15.079050 master-2 kubenswrapper[4762]: I1014 13:46:15.078874 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-bkjjv"] Oct 14 13:46:15.088884 master-2 kubenswrapper[4762]: I1014 13:46:15.088816 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-bkjjv"] Oct 14 13:46:15.564010 master-2 kubenswrapper[4762]: I1014 13:46:15.563906 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf99a3bf-4a2a-4206-9538-24b47ebd5605" path="/var/lib/kubelet/pods/cf99a3bf-4a2a-4206-9538-24b47ebd5605/volumes" Oct 14 13:46:23.789108 master-2 kubenswrapper[4762]: I1014 13:46:23.789026 4762 scope.go:117] "RemoveContainer" containerID="5b4ec1eb2146271a8dfe20d432fe3931ffeaf6fb4f69a9a9bf901388c828b53e" Oct 14 13:46:23.824105 master-2 kubenswrapper[4762]: I1014 13:46:23.823998 4762 scope.go:117] "RemoveContainer" containerID="fb20d2acb4e9717ddb43b5dd802458df143a653b646009ed3e2a84064763486f" Oct 14 13:46:23.877222 master-2 kubenswrapper[4762]: I1014 13:46:23.877179 4762 scope.go:117] "RemoveContainer" containerID="10213f95de1e9c66e9f277911356ed1f4f72178c0f58354fce1f6bd9357573e2" Oct 14 13:46:29.811851 master-2 kubenswrapper[4762]: I1014 13:46:29.811803 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-75z82"] Oct 14 13:46:29.814806 master-2 kubenswrapper[4762]: E1014 13:46:29.812110 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41c21720-993e-4921-805b-a70e1f132a8f" 
containerName="extract-utilities" Oct 14 13:46:29.814806 master-2 kubenswrapper[4762]: I1014 13:46:29.812123 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="41c21720-993e-4921-805b-a70e1f132a8f" containerName="extract-utilities" Oct 14 13:46:29.814806 master-2 kubenswrapper[4762]: E1014 13:46:29.812178 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41c21720-993e-4921-805b-a70e1f132a8f" containerName="registry-server" Oct 14 13:46:29.814806 master-2 kubenswrapper[4762]: I1014 13:46:29.812184 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="41c21720-993e-4921-805b-a70e1f132a8f" containerName="registry-server" Oct 14 13:46:29.814806 master-2 kubenswrapper[4762]: E1014 13:46:29.812202 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41c21720-993e-4921-805b-a70e1f132a8f" containerName="extract-content" Oct 14 13:46:29.814806 master-2 kubenswrapper[4762]: I1014 13:46:29.812208 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="41c21720-993e-4921-805b-a70e1f132a8f" containerName="extract-content" Oct 14 13:46:29.814806 master-2 kubenswrapper[4762]: I1014 13:46:29.812364 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="41c21720-993e-4921-805b-a70e1f132a8f" containerName="registry-server" Oct 14 13:46:29.814806 master-2 kubenswrapper[4762]: I1014 13:46:29.813617 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-75z82" Oct 14 13:46:29.830355 master-2 kubenswrapper[4762]: I1014 13:46:29.830051 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-75z82"] Oct 14 13:46:29.877711 master-2 kubenswrapper[4762]: I1014 13:46:29.877625 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0d2b483-2a00-45b7-aa06-1f884cc152b5-utilities\") pod \"certified-operators-75z82\" (UID: \"d0d2b483-2a00-45b7-aa06-1f884cc152b5\") " pod="openshift-marketplace/certified-operators-75z82" Oct 14 13:46:29.878011 master-2 kubenswrapper[4762]: I1014 13:46:29.877721 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0d2b483-2a00-45b7-aa06-1f884cc152b5-catalog-content\") pod \"certified-operators-75z82\" (UID: \"d0d2b483-2a00-45b7-aa06-1f884cc152b5\") " pod="openshift-marketplace/certified-operators-75z82" Oct 14 13:46:29.878011 master-2 kubenswrapper[4762]: I1014 13:46:29.877755 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhqnk\" (UniqueName: \"kubernetes.io/projected/d0d2b483-2a00-45b7-aa06-1f884cc152b5-kube-api-access-bhqnk\") pod \"certified-operators-75z82\" (UID: \"d0d2b483-2a00-45b7-aa06-1f884cc152b5\") " pod="openshift-marketplace/certified-operators-75z82" Oct 14 13:46:29.979604 master-2 kubenswrapper[4762]: I1014 13:46:29.979462 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0d2b483-2a00-45b7-aa06-1f884cc152b5-utilities\") pod \"certified-operators-75z82\" (UID: \"d0d2b483-2a00-45b7-aa06-1f884cc152b5\") " pod="openshift-marketplace/certified-operators-75z82" Oct 14 13:46:29.979604 master-2 kubenswrapper[4762]: I1014 13:46:29.979535 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/d0d2b483-2a00-45b7-aa06-1f884cc152b5-catalog-content\") pod \"certified-operators-75z82\" (UID: \"d0d2b483-2a00-45b7-aa06-1f884cc152b5\") " pod="openshift-marketplace/certified-operators-75z82" Oct 14 13:46:29.979604 master-2 kubenswrapper[4762]: I1014 13:46:29.979634 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhqnk\" (UniqueName: \"kubernetes.io/projected/d0d2b483-2a00-45b7-aa06-1f884cc152b5-kube-api-access-bhqnk\") pod \"certified-operators-75z82\" (UID: \"d0d2b483-2a00-45b7-aa06-1f884cc152b5\") " pod="openshift-marketplace/certified-operators-75z82" Oct 14 13:46:29.980712 master-2 kubenswrapper[4762]: I1014 13:46:29.980638 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0d2b483-2a00-45b7-aa06-1f884cc152b5-utilities\") pod \"certified-operators-75z82\" (UID: \"d0d2b483-2a00-45b7-aa06-1f884cc152b5\") " pod="openshift-marketplace/certified-operators-75z82" Oct 14 13:46:29.981578 master-2 kubenswrapper[4762]: I1014 13:46:29.981533 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0d2b483-2a00-45b7-aa06-1f884cc152b5-catalog-content\") pod \"certified-operators-75z82\" (UID: \"d0d2b483-2a00-45b7-aa06-1f884cc152b5\") " pod="openshift-marketplace/certified-operators-75z82" Oct 14 13:46:30.010223 master-2 kubenswrapper[4762]: I1014 13:46:30.010168 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhqnk\" (UniqueName: \"kubernetes.io/projected/d0d2b483-2a00-45b7-aa06-1f884cc152b5-kube-api-access-bhqnk\") pod \"certified-operators-75z82\" (UID: \"d0d2b483-2a00-45b7-aa06-1f884cc152b5\") " pod="openshift-marketplace/certified-operators-75z82" Oct 14 13:46:30.141685 master-2 kubenswrapper[4762]: I1014 13:46:30.141505 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-75z82" Oct 14 13:46:30.625412 master-2 kubenswrapper[4762]: I1014 13:46:30.625354 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-75z82"] Oct 14 13:46:30.837211 master-2 kubenswrapper[4762]: I1014 13:46:30.835772 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-75z82" event={"ID":"d0d2b483-2a00-45b7-aa06-1f884cc152b5","Type":"ContainerStarted","Data":"7b73582454aa55350cc449212e1082d5b711f69eb83d666650289d3fe7caf6c8"} Oct 14 13:46:30.837211 master-2 kubenswrapper[4762]: I1014 13:46:30.835846 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-75z82" event={"ID":"d0d2b483-2a00-45b7-aa06-1f884cc152b5","Type":"ContainerStarted","Data":"241243b474071c5017c896b6b20928d2d90982d4b73becf5b49f5090b513814c"} Oct 14 13:46:31.852360 master-2 kubenswrapper[4762]: I1014 13:46:31.852233 4762 generic.go:334] "Generic (PLEG): container finished" podID="d0d2b483-2a00-45b7-aa06-1f884cc152b5" containerID="7b73582454aa55350cc449212e1082d5b711f69eb83d666650289d3fe7caf6c8" exitCode=0 Oct 14 13:46:31.852360 master-2 kubenswrapper[4762]: I1014 13:46:31.852322 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-75z82" event={"ID":"d0d2b483-2a00-45b7-aa06-1f884cc152b5","Type":"ContainerDied","Data":"7b73582454aa55350cc449212e1082d5b711f69eb83d666650289d3fe7caf6c8"} Oct 14 13:46:33.873836 master-2 kubenswrapper[4762]: I1014 13:46:33.873760 4762 generic.go:334] "Generic (PLEG): container finished" podID="d0d2b483-2a00-45b7-aa06-1f884cc152b5" containerID="de6fe4d5632c85223eb68489c47e8d7d2ef1c62c0c1c5c2b2ec1c68d1cbc7316" exitCode=0 Oct 14 13:46:33.874532 master-2 kubenswrapper[4762]: I1014 13:46:33.873849 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-75z82" event={"ID":"d0d2b483-2a00-45b7-aa06-1f884cc152b5","Type":"ContainerDied","Data":"de6fe4d5632c85223eb68489c47e8d7d2ef1c62c0c1c5c2b2ec1c68d1cbc7316"} Oct 14 13:46:34.890892 master-2 kubenswrapper[4762]: I1014 13:46:34.890802 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-75z82" event={"ID":"d0d2b483-2a00-45b7-aa06-1f884cc152b5","Type":"ContainerStarted","Data":"515748e003813cfbc95449f60fad85913ffa3ae8e683316096f6a3472fa13484"} Oct 14 13:46:34.933578 master-2 kubenswrapper[4762]: I1014 13:46:34.933494 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-75z82" podStartSLOduration=3.4766857079999998 podStartE2EDuration="5.933473771s" podCreationTimestamp="2025-10-14 13:46:29 +0000 UTC" firstStartedPulling="2025-10-14 13:46:31.855084898 +0000 UTC m=+2421.099244107" lastFinishedPulling="2025-10-14 13:46:34.311873001 +0000 UTC m=+2423.556032170" observedRunningTime="2025-10-14 13:46:34.92296517 +0000 UTC m=+2424.167124329" watchObservedRunningTime="2025-10-14 13:46:34.933473771 +0000 UTC m=+2424.177632940" Oct 14 13:46:35.077999 master-2 kubenswrapper[4762]: I1014 13:46:35.077887 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-db-sync-g5vgq"] Oct 14 13:46:35.087605 master-2 kubenswrapper[4762]: I1014 13:46:35.087496 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-db-sync-g5vgq"] Oct 14 13:46:35.559799 master-2 kubenswrapper[4762]: I1014 13:46:35.559714 4762 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="799545f6-d5b9-428c-96a7-96aa931ed940" path="/var/lib/kubelet/pods/799545f6-d5b9-428c-96a7-96aa931ed940/volumes" Oct 14 13:46:40.142950 master-2 kubenswrapper[4762]: I1014 13:46:40.142722 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-75z82" Oct 14 13:46:40.142950 master-2 kubenswrapper[4762]: I1014 13:46:40.142811 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-75z82" Oct 14 13:46:40.209142 master-2 kubenswrapper[4762]: I1014 13:46:40.209071 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-75z82" Oct 14 13:46:41.029960 master-2 kubenswrapper[4762]: I1014 13:46:41.029898 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-75z82" Oct 14 13:46:41.123318 master-2 kubenswrapper[4762]: I1014 13:46:41.123251 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-75z82"] Oct 14 13:46:42.983487 master-2 kubenswrapper[4762]: I1014 13:46:42.983395 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-75z82" podUID="d0d2b483-2a00-45b7-aa06-1f884cc152b5" containerName="registry-server" containerID="cri-o://515748e003813cfbc95449f60fad85913ffa3ae8e683316096f6a3472fa13484" gracePeriod=2 Oct 14 13:46:43.634126 master-2 kubenswrapper[4762]: I1014 13:46:43.634057 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-75z82" Oct 14 13:46:43.715458 master-2 kubenswrapper[4762]: I1014 13:46:43.715328 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhqnk\" (UniqueName: \"kubernetes.io/projected/d0d2b483-2a00-45b7-aa06-1f884cc152b5-kube-api-access-bhqnk\") pod \"d0d2b483-2a00-45b7-aa06-1f884cc152b5\" (UID: \"d0d2b483-2a00-45b7-aa06-1f884cc152b5\") " Oct 14 13:46:43.716239 master-2 kubenswrapper[4762]: I1014 13:46:43.715602 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0d2b483-2a00-45b7-aa06-1f884cc152b5-catalog-content\") pod \"d0d2b483-2a00-45b7-aa06-1f884cc152b5\" (UID: \"d0d2b483-2a00-45b7-aa06-1f884cc152b5\") " Oct 14 13:46:43.716239 master-2 kubenswrapper[4762]: I1014 13:46:43.715712 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0d2b483-2a00-45b7-aa06-1f884cc152b5-utilities\") pod \"d0d2b483-2a00-45b7-aa06-1f884cc152b5\" (UID: \"d0d2b483-2a00-45b7-aa06-1f884cc152b5\") " Oct 14 13:46:43.721852 master-2 kubenswrapper[4762]: I1014 13:46:43.719050 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0d2b483-2a00-45b7-aa06-1f884cc152b5-utilities" (OuterVolumeSpecName: "utilities") pod "d0d2b483-2a00-45b7-aa06-1f884cc152b5" (UID: "d0d2b483-2a00-45b7-aa06-1f884cc152b5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:46:43.721852 master-2 kubenswrapper[4762]: I1014 13:46:43.719942 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0d2b483-2a00-45b7-aa06-1f884cc152b5-utilities\") on node \"master-2\" DevicePath \"\"" Oct 14 13:46:43.721852 master-2 kubenswrapper[4762]: I1014 13:46:43.721716 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0d2b483-2a00-45b7-aa06-1f884cc152b5-kube-api-access-bhqnk" (OuterVolumeSpecName: "kube-api-access-bhqnk") pod "d0d2b483-2a00-45b7-aa06-1f884cc152b5" (UID: "d0d2b483-2a00-45b7-aa06-1f884cc152b5"). InnerVolumeSpecName "kube-api-access-bhqnk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:46:43.781469 master-2 kubenswrapper[4762]: I1014 13:46:43.781336 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0d2b483-2a00-45b7-aa06-1f884cc152b5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d0d2b483-2a00-45b7-aa06-1f884cc152b5" (UID: "d0d2b483-2a00-45b7-aa06-1f884cc152b5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:46:43.822522 master-2 kubenswrapper[4762]: I1014 13:46:43.822378 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0d2b483-2a00-45b7-aa06-1f884cc152b5-catalog-content\") on node \"master-2\" DevicePath \"\"" Oct 14 13:46:43.822522 master-2 kubenswrapper[4762]: I1014 13:46:43.822430 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhqnk\" (UniqueName: \"kubernetes.io/projected/d0d2b483-2a00-45b7-aa06-1f884cc152b5-kube-api-access-bhqnk\") on node \"master-2\" DevicePath \"\"" Oct 14 13:46:44.000065 master-2 kubenswrapper[4762]: I1014 13:46:44.000004 4762 generic.go:334] "Generic (PLEG): container finished" podID="d0d2b483-2a00-45b7-aa06-1f884cc152b5" containerID="515748e003813cfbc95449f60fad85913ffa3ae8e683316096f6a3472fa13484" exitCode=0 Oct 14 13:46:44.000065 master-2 kubenswrapper[4762]: I1014 13:46:44.000060 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-75z82" event={"ID":"d0d2b483-2a00-45b7-aa06-1f884cc152b5","Type":"ContainerDied","Data":"515748e003813cfbc95449f60fad85913ffa3ae8e683316096f6a3472fa13484"} Oct 14 13:46:44.000922 master-2 kubenswrapper[4762]: I1014 13:46:44.000092 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-75z82" event={"ID":"d0d2b483-2a00-45b7-aa06-1f884cc152b5","Type":"ContainerDied","Data":"241243b474071c5017c896b6b20928d2d90982d4b73becf5b49f5090b513814c"} Oct 14 13:46:44.000922 master-2 kubenswrapper[4762]: I1014 13:46:44.000113 4762 scope.go:117] "RemoveContainer" containerID="515748e003813cfbc95449f60fad85913ffa3ae8e683316096f6a3472fa13484" Oct 14 13:46:44.001225 master-2 kubenswrapper[4762]: I1014 13:46:44.001184 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-75z82" Oct 14 13:46:44.033727 master-2 kubenswrapper[4762]: I1014 13:46:44.033673 4762 scope.go:117] "RemoveContainer" containerID="de6fe4d5632c85223eb68489c47e8d7d2ef1c62c0c1c5c2b2ec1c68d1cbc7316" Oct 14 13:46:44.061178 master-2 kubenswrapper[4762]: I1014 13:46:44.061099 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-75z82"] Oct 14 13:46:44.069835 master-2 kubenswrapper[4762]: I1014 13:46:44.069759 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-75z82"] Oct 14 13:46:44.070585 master-2 kubenswrapper[4762]: I1014 13:46:44.070534 4762 scope.go:117] "RemoveContainer" containerID="7b73582454aa55350cc449212e1082d5b711f69eb83d666650289d3fe7caf6c8" Oct 14 13:46:44.109704 master-2 kubenswrapper[4762]: I1014 13:46:44.109659 4762 scope.go:117] "RemoveContainer" containerID="515748e003813cfbc95449f60fad85913ffa3ae8e683316096f6a3472fa13484" Oct 14 13:46:44.110490 master-2 kubenswrapper[4762]: E1014 13:46:44.110442 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"515748e003813cfbc95449f60fad85913ffa3ae8e683316096f6a3472fa13484\": container with ID starting with 515748e003813cfbc95449f60fad85913ffa3ae8e683316096f6a3472fa13484 not found: ID does not exist" containerID="515748e003813cfbc95449f60fad85913ffa3ae8e683316096f6a3472fa13484" Oct 14 13:46:44.110644 master-2 kubenswrapper[4762]: I1014 13:46:44.110496 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"515748e003813cfbc95449f60fad85913ffa3ae8e683316096f6a3472fa13484"} err="failed to get container status \"515748e003813cfbc95449f60fad85913ffa3ae8e683316096f6a3472fa13484\": rpc error: code = NotFound desc = could not find container \"515748e003813cfbc95449f60fad85913ffa3ae8e683316096f6a3472fa13484\": container with ID starting with 515748e003813cfbc95449f60fad85913ffa3ae8e683316096f6a3472fa13484 not found: ID does not exist" Oct 14 13:46:44.110644 master-2 kubenswrapper[4762]: I1014 13:46:44.110527 4762 scope.go:117] "RemoveContainer" containerID="de6fe4d5632c85223eb68489c47e8d7d2ef1c62c0c1c5c2b2ec1c68d1cbc7316" Oct 14 13:46:44.110898 master-2 kubenswrapper[4762]: E1014 13:46:44.110856 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de6fe4d5632c85223eb68489c47e8d7d2ef1c62c0c1c5c2b2ec1c68d1cbc7316\": container with ID starting with de6fe4d5632c85223eb68489c47e8d7d2ef1c62c0c1c5c2b2ec1c68d1cbc7316 not found: ID does not exist" containerID="de6fe4d5632c85223eb68489c47e8d7d2ef1c62c0c1c5c2b2ec1c68d1cbc7316" Oct 14 13:46:44.110898 master-2 kubenswrapper[4762]: I1014 13:46:44.110878 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de6fe4d5632c85223eb68489c47e8d7d2ef1c62c0c1c5c2b2ec1c68d1cbc7316"} err="failed to get container status \"de6fe4d5632c85223eb68489c47e8d7d2ef1c62c0c1c5c2b2ec1c68d1cbc7316\": rpc error: code = NotFound desc = could not find container \"de6fe4d5632c85223eb68489c47e8d7d2ef1c62c0c1c5c2b2ec1c68d1cbc7316\": container with ID starting with de6fe4d5632c85223eb68489c47e8d7d2ef1c62c0c1c5c2b2ec1c68d1cbc7316 not found: ID does not exist" Oct 14 13:46:44.110898 master-2 kubenswrapper[4762]: I1014 13:46:44.110891 4762 scope.go:117] "RemoveContainer" 
containerID="7b73582454aa55350cc449212e1082d5b711f69eb83d666650289d3fe7caf6c8" Oct 14 13:46:44.111240 master-2 kubenswrapper[4762]: E1014 13:46:44.111204 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b73582454aa55350cc449212e1082d5b711f69eb83d666650289d3fe7caf6c8\": container with ID starting with 7b73582454aa55350cc449212e1082d5b711f69eb83d666650289d3fe7caf6c8 not found: ID does not exist" containerID="7b73582454aa55350cc449212e1082d5b711f69eb83d666650289d3fe7caf6c8" Oct 14 13:46:44.111344 master-2 kubenswrapper[4762]: I1014 13:46:44.111246 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b73582454aa55350cc449212e1082d5b711f69eb83d666650289d3fe7caf6c8"} err="failed to get container status \"7b73582454aa55350cc449212e1082d5b711f69eb83d666650289d3fe7caf6c8\": rpc error: code = NotFound desc = could not find container \"7b73582454aa55350cc449212e1082d5b711f69eb83d666650289d3fe7caf6c8\": container with ID starting with 7b73582454aa55350cc449212e1082d5b711f69eb83d666650289d3fe7caf6c8 not found: ID does not exist" Oct 14 13:46:45.560053 master-2 kubenswrapper[4762]: I1014 13:46:45.559843 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0d2b483-2a00-45b7-aa06-1f884cc152b5" path="/var/lib/kubelet/pods/d0d2b483-2a00-45b7-aa06-1f884cc152b5/volumes" Oct 14 13:47:23.947449 master-2 kubenswrapper[4762]: I1014 13:47:23.947373 4762 scope.go:117] "RemoveContainer" containerID="a74426201541067f0f49cc6a155bebc5ae88678dbd1e141645381389a47e7563" Oct 14 13:47:23.991340 master-2 kubenswrapper[4762]: I1014 13:47:23.991299 4762 scope.go:117] "RemoveContainer" containerID="280aab5216733f8c50d357878f94bee2a7447933626bc9870ea4aa1c9fac74a5" Oct 14 13:47:35.047826 master-2 kubenswrapper[4762]: I1014 13:47:35.047755 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-6vt26"] Oct 14 13:47:35.058071 master-2 kubenswrapper[4762]: I1014 13:47:35.058020 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-6vt26"] Oct 14 13:47:35.562803 master-2 kubenswrapper[4762]: I1014 13:47:35.562719 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5987f041-7e28-466c-aa62-8901381b8413" path="/var/lib/kubelet/pods/5987f041-7e28-466c-aa62-8901381b8413/volumes" Oct 14 13:47:44.171648 master-2 kubenswrapper[4762]: I1014 13:47:44.171451 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-551b-account-create-2jk4f"] Oct 14 13:47:44.228534 master-2 kubenswrapper[4762]: I1014 13:47:44.228436 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-551b-account-create-2jk4f"] Oct 14 13:47:45.559061 master-2 kubenswrapper[4762]: I1014 13:47:45.558896 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ada32c3-34c6-47a3-858e-7e92754e719b" path="/var/lib/kubelet/pods/6ada32c3-34c6-47a3-858e-7e92754e719b/volumes" Oct 14 13:47:46.064780 master-2 kubenswrapper[4762]: I1014 13:47:46.064598 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-3163-account-create-lprsc"] Oct 14 13:47:46.077034 master-2 kubenswrapper[4762]: I1014 13:47:46.076939 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-3163-account-create-lprsc"] Oct 14 13:47:46.086283 master-2 kubenswrapper[4762]: I1014 13:47:46.086192 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell1-3359-account-create-c7gkw"] Oct 14 13:47:46.106763 master-2 kubenswrapper[4762]: I1014 13:47:46.106643 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-3359-account-create-c7gkw"] Oct 14 13:47:47.560208 master-2 kubenswrapper[4762]: I1014 13:47:47.560108 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17c406ad-d002-46f3-9014-31258d21a113" path="/var/lib/kubelet/pods/17c406ad-d002-46f3-9014-31258d21a113/volumes" Oct 14 13:47:47.561370 master-2 kubenswrapper[4762]: I1014 13:47:47.561322 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b20eee57-43a0-48ae-be88-57727baa5e8d" path="/var/lib/kubelet/pods/b20eee57-43a0-48ae-be88-57727baa5e8d/volumes" Oct 14 13:48:15.076716 master-2 kubenswrapper[4762]: I1014 13:48:15.076605 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-shgdv"] Oct 14 13:48:15.091425 master-2 kubenswrapper[4762]: I1014 13:48:15.089532 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-shgdv"] Oct 14 13:48:15.560135 master-2 kubenswrapper[4762]: I1014 13:48:15.560055 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41ecfc00-c91d-4ca3-93af-8897072115cc" path="/var/lib/kubelet/pods/41ecfc00-c91d-4ca3-93af-8897072115cc/volumes" Oct 14 13:48:24.093185 master-2 kubenswrapper[4762]: I1014 13:48:24.093061 4762 scope.go:117] "RemoveContainer" containerID="9ec45cda7dc1b8ec193a280be686462387e731fd2aca13027de3b37f908dbcb8" Oct 14 13:48:24.178316 master-2 kubenswrapper[4762]: I1014 13:48:24.176116 4762 scope.go:117] "RemoveContainer" containerID="1fb4cc4c07747df4f00ccf2044b34f83fe35dc07577f4c8e9e6deebe643d2cd1" Oct 14 13:48:24.214370 master-2 kubenswrapper[4762]: I1014 13:48:24.214291 4762 scope.go:117] "RemoveContainer" containerID="fe99be248c3c62595be624f8732456079e585271421a6c07660d518d9747d017" Oct 14 13:48:24.252369 master-2 kubenswrapper[4762]: I1014 13:48:24.252314 4762 scope.go:117] "RemoveContainer" containerID="683928058a90ab666720fc7169f961fb3000034765998957ace8e1708bfb207b" Oct 14 13:48:24.285042 master-2 kubenswrapper[4762]: I1014 13:48:24.284978 4762 scope.go:117] "RemoveContainer" containerID="2e81ae49f5c80e146293c83b36f208f31b6ffb3da4ca7fe8b58aa1fb9d5fd979" Oct 14 13:48:27.057584 master-2 kubenswrapper[4762]: I1014 13:48:27.057506 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-pkrrv"] Oct 14 13:48:27.065086 master-2 kubenswrapper[4762]: I1014 13:48:27.065020 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-pkrrv"] Oct 14 13:48:27.559745 master-2 kubenswrapper[4762]: I1014 13:48:27.559686 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88e8aee3-10b8-4420-bacc-83d4d9e9e205" path="/var/lib/kubelet/pods/88e8aee3-10b8-4420-bacc-83d4d9e9e205/volumes" Oct 14 13:48:34.060402 master-2 kubenswrapper[4762]: I1014 13:48:34.060325 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-99b9d"] Oct 14 13:48:34.066015 master-2 kubenswrapper[4762]: I1014 13:48:34.065954 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-99b9d"] Oct 14 13:48:35.565936 master-2 kubenswrapper[4762]: I1014 13:48:35.565863 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ea4e416-c6ee-4940-a9f8-2b2265d16336" path="/var/lib/kubelet/pods/5ea4e416-c6ee-4940-a9f8-2b2265d16336/volumes" 
Oct 14 13:48:39.062230 master-2 kubenswrapper[4762]: I1014 13:48:39.062119 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-x2knq"] Oct 14 13:48:39.073686 master-2 kubenswrapper[4762]: I1014 13:48:39.073609 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-x2knq"] Oct 14 13:48:39.563807 master-2 kubenswrapper[4762]: I1014 13:48:39.563747 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6bc42a1-2444-47ab-8922-267ae995d2cc" path="/var/lib/kubelet/pods/e6bc42a1-2444-47ab-8922-267ae995d2cc/volumes" Oct 14 13:49:24.411384 master-2 kubenswrapper[4762]: I1014 13:49:24.411321 4762 scope.go:117] "RemoveContainer" containerID="641aabad2340dab2ce0051c53eef6dcf2a4b14b69c84d35539c817f74acfb1ac" Oct 14 13:49:24.468764 master-2 kubenswrapper[4762]: I1014 13:49:24.468716 4762 scope.go:117] "RemoveContainer" containerID="e9867cecc44d38717730048ad6937f254f9da5ca955e38dc5fa42ae8488d1044" Oct 14 13:49:24.525528 master-2 kubenswrapper[4762]: I1014 13:49:24.525432 4762 scope.go:117] "RemoveContainer" containerID="dc7715cef141a98b2ad6e2582578b153414d01e201310b6d25f0c662df4f5a0d" Oct 14 13:49:32.084558 master-2 kubenswrapper[4762]: I1014 13:49:32.084483 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-host-discover-v24fq"] Oct 14 13:49:32.092642 master-2 kubenswrapper[4762]: I1014 13:49:32.092570 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-host-discover-v24fq"] Oct 14 13:49:33.560600 master-2 kubenswrapper[4762]: I1014 13:49:33.560518 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25d692e8-e19a-475b-bc4e-f22508073ffa" path="/var/lib/kubelet/pods/25d692e8-e19a-475b-bc4e-f22508073ffa/volumes" Oct 14 13:49:34.043894 master-2 kubenswrapper[4762]: I1014 13:49:34.043795 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-ffpkk"] Oct 14 13:49:34.051564 master-2 kubenswrapper[4762]: I1014 13:49:34.051514 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-ffpkk"] Oct 14 13:49:35.559591 master-2 kubenswrapper[4762]: I1014 13:49:35.559509 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb2b1f8b-432d-4e7d-a538-28068e9e0bb6" path="/var/lib/kubelet/pods/bb2b1f8b-432d-4e7d-a538-28068e9e0bb6/volumes" Oct 14 13:50:24.647000 master-2 kubenswrapper[4762]: I1014 13:50:24.646943 4762 scope.go:117] "RemoveContainer" containerID="cf540aae01ce2e4d6adf94d12e78961660e6e278cfd2d2b091b4dd99a247a131" Oct 14 13:50:24.714405 master-2 kubenswrapper[4762]: I1014 13:50:24.712997 4762 scope.go:117] "RemoveContainer" containerID="827df6fbff6c4750d82e51a1950468ed649de141480999ca39cea24a5908f5a0" Oct 14 13:50:30.111853 master-2 kubenswrapper[4762]: I1014 13:50:30.111734 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-create-nr9l7"] Oct 14 13:50:30.125914 master-2 kubenswrapper[4762]: I1014 13:50:30.125577 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-create-nr9l7"] Oct 14 13:50:31.560691 master-2 kubenswrapper[4762]: I1014 13:50:31.560599 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eed550b4-d873-4c4d-888b-393c8198e192" path="/var/lib/kubelet/pods/eed550b4-d873-4c4d-888b-393c8198e192/volumes" Oct 14 13:50:41.069515 master-2 kubenswrapper[4762]: I1014 13:50:41.069454 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/octavia-c846-account-create-svw8b"] Oct 14 13:50:41.078270 master-2 kubenswrapper[4762]: I1014 13:50:41.077927 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-c846-account-create-svw8b"] Oct 14 13:50:41.574320 master-2 kubenswrapper[4762]: I1014 13:50:41.574102 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="680e6d7c-db69-400d-ae0d-48b947c3f9ce" path="/var/lib/kubelet/pods/680e6d7c-db69-400d-ae0d-48b947c3f9ce/volumes" Oct 14 13:50:47.078720 master-2 kubenswrapper[4762]: I1014 13:50:47.078614 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-persistence-db-create-kk5cv"] Oct 14 13:50:47.090250 master-2 kubenswrapper[4762]: I1014 13:50:47.090184 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-persistence-db-create-kk5cv"] Oct 14 13:50:47.564277 master-2 kubenswrapper[4762]: I1014 13:50:47.564140 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b793f469-b40b-47d2-a91c-5e8d4e9df87e" path="/var/lib/kubelet/pods/b793f469-b40b-47d2-a91c-5e8d4e9df87e/volumes" Oct 14 13:50:57.048336 master-2 kubenswrapper[4762]: I1014 13:50:57.048259 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-094b-account-create-82gt8"] Oct 14 13:50:57.055225 master-2 kubenswrapper[4762]: I1014 13:50:57.055087 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-094b-account-create-82gt8"] Oct 14 13:50:57.559072 master-2 kubenswrapper[4762]: I1014 13:50:57.559001 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a8ac38d-e49d-4b09-a38c-0307bffbf226" path="/var/lib/kubelet/pods/8a8ac38d-e49d-4b09-a38c-0307bffbf226/volumes" Oct 14 13:51:24.818116 master-2 kubenswrapper[4762]: I1014 13:51:24.818044 4762 scope.go:117] "RemoveContainer" containerID="f2ba0ea80781847190c2246217570064f177caf036808698d8ece2f4a8274784" Oct 14 13:51:24.842270 master-2 kubenswrapper[4762]: I1014 13:51:24.842228 4762 scope.go:117] "RemoveContainer" containerID="d48be3acb914a9f94b263a422ff616d7d49a67feef08588b866f49fad73c772c" Oct 14 13:51:24.881983 master-2 kubenswrapper[4762]: I1014 13:51:24.881918 4762 scope.go:117] "RemoveContainer" containerID="d766502aef6374b3a21849b1f44b7342550719c76e537ea977011993eb5d3379" Oct 14 13:51:24.912075 master-2 kubenswrapper[4762]: I1014 13:51:24.912006 4762 scope.go:117] "RemoveContainer" containerID="338d30a71818f78d0b693c8bea7531f50bff904cec21d1a99cf096d8462a25e6" Oct 14 13:51:42.082772 master-2 kubenswrapper[4762]: I1014 13:51:42.082661 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-sync-r24zb"] Oct 14 13:51:42.094414 master-2 kubenswrapper[4762]: I1014 13:51:42.094326 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-sync-r24zb"] Oct 14 13:51:43.563919 master-2 kubenswrapper[4762]: I1014 13:51:43.563827 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55e89c7d-f953-4fa3-95af-2abba3a06439" path="/var/lib/kubelet/pods/55e89c7d-f953-4fa3-95af-2abba3a06439/volumes" Oct 14 13:52:25.023933 master-2 kubenswrapper[4762]: I1014 13:52:25.023858 4762 scope.go:117] "RemoveContainer" containerID="a74c81431913c6f9f5e45c4fb724bdd236fda3473562ecb486a8725ea8ac0e23" Oct 14 13:52:25.061663 master-2 kubenswrapper[4762]: I1014 13:52:25.061585 4762 scope.go:117] "RemoveContainer" containerID="abde3099d7ce6ea462d0be2ae870df7f40c0a6f23f58bc92df4c9d971cc6bd53" Oct 14 13:53:25.768849 master-2 kubenswrapper[4762]: I1014 
13:53:25.768761 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jdhvr"] Oct 14 13:53:25.770120 master-2 kubenswrapper[4762]: E1014 13:53:25.769227 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0d2b483-2a00-45b7-aa06-1f884cc152b5" containerName="registry-server" Oct 14 13:53:25.770120 master-2 kubenswrapper[4762]: I1014 13:53:25.769243 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0d2b483-2a00-45b7-aa06-1f884cc152b5" containerName="registry-server" Oct 14 13:53:25.770120 master-2 kubenswrapper[4762]: E1014 13:53:25.769265 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0d2b483-2a00-45b7-aa06-1f884cc152b5" containerName="extract-utilities" Oct 14 13:53:25.770120 master-2 kubenswrapper[4762]: I1014 13:53:25.769272 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0d2b483-2a00-45b7-aa06-1f884cc152b5" containerName="extract-utilities" Oct 14 13:53:25.770120 master-2 kubenswrapper[4762]: E1014 13:53:25.769312 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0d2b483-2a00-45b7-aa06-1f884cc152b5" containerName="extract-content" Oct 14 13:53:25.770120 master-2 kubenswrapper[4762]: I1014 13:53:25.769319 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0d2b483-2a00-45b7-aa06-1f884cc152b5" containerName="extract-content" Oct 14 13:53:25.770120 master-2 kubenswrapper[4762]: I1014 13:53:25.769522 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0d2b483-2a00-45b7-aa06-1f884cc152b5" containerName="registry-server" Oct 14 13:53:25.771459 master-2 kubenswrapper[4762]: I1014 13:53:25.771421 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jdhvr" Oct 14 13:53:25.806432 master-2 kubenswrapper[4762]: I1014 13:53:25.806349 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jdhvr"] Oct 14 13:53:25.936362 master-2 kubenswrapper[4762]: I1014 13:53:25.936295 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b0dd365-7a39-4923-b716-4d5121c1569f-catalog-content\") pod \"community-operators-jdhvr\" (UID: \"5b0dd365-7a39-4923-b716-4d5121c1569f\") " pod="openshift-marketplace/community-operators-jdhvr" Oct 14 13:53:25.936362 master-2 kubenswrapper[4762]: I1014 13:53:25.936368 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b0dd365-7a39-4923-b716-4d5121c1569f-utilities\") pod \"community-operators-jdhvr\" (UID: \"5b0dd365-7a39-4923-b716-4d5121c1569f\") " pod="openshift-marketplace/community-operators-jdhvr" Oct 14 13:53:25.936720 master-2 kubenswrapper[4762]: I1014 13:53:25.936472 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b449h\" (UniqueName: \"kubernetes.io/projected/5b0dd365-7a39-4923-b716-4d5121c1569f-kube-api-access-b449h\") pod \"community-operators-jdhvr\" (UID: \"5b0dd365-7a39-4923-b716-4d5121c1569f\") " pod="openshift-marketplace/community-operators-jdhvr" Oct 14 13:53:26.039269 master-2 kubenswrapper[4762]: I1014 13:53:26.039078 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b0dd365-7a39-4923-b716-4d5121c1569f-catalog-content\") pod 
\"community-operators-jdhvr\" (UID: \"5b0dd365-7a39-4923-b716-4d5121c1569f\") " pod="openshift-marketplace/community-operators-jdhvr" Oct 14 13:53:26.039269 master-2 kubenswrapper[4762]: I1014 13:53:26.039199 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b0dd365-7a39-4923-b716-4d5121c1569f-utilities\") pod \"community-operators-jdhvr\" (UID: \"5b0dd365-7a39-4923-b716-4d5121c1569f\") " pod="openshift-marketplace/community-operators-jdhvr" Oct 14 13:53:26.039572 master-2 kubenswrapper[4762]: I1014 13:53:26.039354 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b449h\" (UniqueName: \"kubernetes.io/projected/5b0dd365-7a39-4923-b716-4d5121c1569f-kube-api-access-b449h\") pod \"community-operators-jdhvr\" (UID: \"5b0dd365-7a39-4923-b716-4d5121c1569f\") " pod="openshift-marketplace/community-operators-jdhvr" Oct 14 13:53:26.040491 master-2 kubenswrapper[4762]: I1014 13:53:26.040453 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b0dd365-7a39-4923-b716-4d5121c1569f-utilities\") pod \"community-operators-jdhvr\" (UID: \"5b0dd365-7a39-4923-b716-4d5121c1569f\") " pod="openshift-marketplace/community-operators-jdhvr" Oct 14 13:53:26.040566 master-2 kubenswrapper[4762]: I1014 13:53:26.040449 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b0dd365-7a39-4923-b716-4d5121c1569f-catalog-content\") pod \"community-operators-jdhvr\" (UID: \"5b0dd365-7a39-4923-b716-4d5121c1569f\") " pod="openshift-marketplace/community-operators-jdhvr" Oct 14 13:53:26.064430 master-2 kubenswrapper[4762]: I1014 13:53:26.064364 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b449h\" (UniqueName: \"kubernetes.io/projected/5b0dd365-7a39-4923-b716-4d5121c1569f-kube-api-access-b449h\") pod \"community-operators-jdhvr\" (UID: \"5b0dd365-7a39-4923-b716-4d5121c1569f\") " pod="openshift-marketplace/community-operators-jdhvr" Oct 14 13:53:26.126194 master-2 kubenswrapper[4762]: I1014 13:53:26.123824 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jdhvr" Oct 14 13:53:26.632330 master-2 kubenswrapper[4762]: I1014 13:53:26.632271 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jdhvr"] Oct 14 13:53:27.262534 master-2 kubenswrapper[4762]: I1014 13:53:27.262423 4762 generic.go:334] "Generic (PLEG): container finished" podID="5b0dd365-7a39-4923-b716-4d5121c1569f" containerID="c157e327ea98eb2327c4f6399809b9a63e294b03eecfd290903446b020b2fa19" exitCode=0 Oct 14 13:53:27.262534 master-2 kubenswrapper[4762]: I1014 13:53:27.262499 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jdhvr" event={"ID":"5b0dd365-7a39-4923-b716-4d5121c1569f","Type":"ContainerDied","Data":"c157e327ea98eb2327c4f6399809b9a63e294b03eecfd290903446b020b2fa19"} Oct 14 13:53:27.263581 master-2 kubenswrapper[4762]: I1014 13:53:27.262597 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jdhvr" event={"ID":"5b0dd365-7a39-4923-b716-4d5121c1569f","Type":"ContainerStarted","Data":"1b85260a1f9687e07492d3c441ace694c94e5ffde683d809f344e16dba10bfde"} Oct 14 13:53:27.266001 master-2 kubenswrapper[4762]: I1014 13:53:27.265930 4762 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 14 13:53:28.273506 master-2 kubenswrapper[4762]: I1014 13:53:28.273303 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jdhvr" event={"ID":"5b0dd365-7a39-4923-b716-4d5121c1569f","Type":"ContainerStarted","Data":"77da418864defdc1de9ea38582c6c00a73df92f0eb06c627d4d52d77dbe3d84c"} Oct 14 13:53:29.286946 master-2 kubenswrapper[4762]: I1014 13:53:29.286870 4762 generic.go:334] "Generic (PLEG): container finished" podID="5b0dd365-7a39-4923-b716-4d5121c1569f" containerID="77da418864defdc1de9ea38582c6c00a73df92f0eb06c627d4d52d77dbe3d84c" exitCode=0 Oct 14 13:53:29.287663 master-2 kubenswrapper[4762]: I1014 13:53:29.286954 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jdhvr" event={"ID":"5b0dd365-7a39-4923-b716-4d5121c1569f","Type":"ContainerDied","Data":"77da418864defdc1de9ea38582c6c00a73df92f0eb06c627d4d52d77dbe3d84c"} Oct 14 13:53:31.307353 master-2 kubenswrapper[4762]: I1014 13:53:31.307291 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jdhvr" event={"ID":"5b0dd365-7a39-4923-b716-4d5121c1569f","Type":"ContainerStarted","Data":"111042c5d981bc0c883c3941c38794e36756ffc78b3f46b2b105ec368d7cf1b2"} Oct 14 13:53:31.339802 master-2 kubenswrapper[4762]: I1014 13:53:31.339687 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jdhvr" podStartSLOduration=3.534107489 podStartE2EDuration="6.339664758s" podCreationTimestamp="2025-10-14 13:53:25 +0000 UTC" firstStartedPulling="2025-10-14 13:53:27.265815479 +0000 UTC m=+2836.509974678" lastFinishedPulling="2025-10-14 13:53:30.071372748 +0000 UTC m=+2839.315531947" observedRunningTime="2025-10-14 13:53:31.337921442 +0000 UTC m=+2840.582080621" watchObservedRunningTime="2025-10-14 13:53:31.339664758 +0000 UTC m=+2840.583823927" Oct 14 13:53:36.125729 master-2 kubenswrapper[4762]: I1014 13:53:36.125471 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jdhvr" Oct 14 13:53:36.126393 master-2 kubenswrapper[4762]: I1014 
13:53:36.125765 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jdhvr" Oct 14 13:53:36.193746 master-2 kubenswrapper[4762]: I1014 13:53:36.193667 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jdhvr" Oct 14 13:53:36.404900 master-2 kubenswrapper[4762]: I1014 13:53:36.404731 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jdhvr" Oct 14 13:53:36.502074 master-2 kubenswrapper[4762]: I1014 13:53:36.501985 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jdhvr"] Oct 14 13:53:38.380818 master-2 kubenswrapper[4762]: I1014 13:53:38.380742 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jdhvr" podUID="5b0dd365-7a39-4923-b716-4d5121c1569f" containerName="registry-server" containerID="cri-o://111042c5d981bc0c883c3941c38794e36756ffc78b3f46b2b105ec368d7cf1b2" gracePeriod=2 Oct 14 13:53:39.329468 master-2 kubenswrapper[4762]: I1014 13:53:39.329410 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jdhvr" Oct 14 13:53:39.392396 master-2 kubenswrapper[4762]: I1014 13:53:39.392319 4762 generic.go:334] "Generic (PLEG): container finished" podID="5b0dd365-7a39-4923-b716-4d5121c1569f" containerID="111042c5d981bc0c883c3941c38794e36756ffc78b3f46b2b105ec368d7cf1b2" exitCode=0 Oct 14 13:53:39.392396 master-2 kubenswrapper[4762]: I1014 13:53:39.392386 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jdhvr" event={"ID":"5b0dd365-7a39-4923-b716-4d5121c1569f","Type":"ContainerDied","Data":"111042c5d981bc0c883c3941c38794e36756ffc78b3f46b2b105ec368d7cf1b2"} Oct 14 13:53:39.393044 master-2 kubenswrapper[4762]: I1014 13:53:39.392433 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jdhvr" event={"ID":"5b0dd365-7a39-4923-b716-4d5121c1569f","Type":"ContainerDied","Data":"1b85260a1f9687e07492d3c441ace694c94e5ffde683d809f344e16dba10bfde"} Oct 14 13:53:39.393044 master-2 kubenswrapper[4762]: I1014 13:53:39.392458 4762 scope.go:117] "RemoveContainer" containerID="111042c5d981bc0c883c3941c38794e36756ffc78b3f46b2b105ec368d7cf1b2" Oct 14 13:53:39.393044 master-2 kubenswrapper[4762]: I1014 13:53:39.392504 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jdhvr" Oct 14 13:53:39.423496 master-2 kubenswrapper[4762]: I1014 13:53:39.423444 4762 scope.go:117] "RemoveContainer" containerID="77da418864defdc1de9ea38582c6c00a73df92f0eb06c627d4d52d77dbe3d84c" Oct 14 13:53:39.452470 master-2 kubenswrapper[4762]: I1014 13:53:39.452391 4762 scope.go:117] "RemoveContainer" containerID="c157e327ea98eb2327c4f6399809b9a63e294b03eecfd290903446b020b2fa19" Oct 14 13:53:39.489796 master-2 kubenswrapper[4762]: I1014 13:53:39.489753 4762 scope.go:117] "RemoveContainer" containerID="111042c5d981bc0c883c3941c38794e36756ffc78b3f46b2b105ec368d7cf1b2" Oct 14 13:53:39.490669 master-2 kubenswrapper[4762]: E1014 13:53:39.490576 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"111042c5d981bc0c883c3941c38794e36756ffc78b3f46b2b105ec368d7cf1b2\": container with ID starting with 111042c5d981bc0c883c3941c38794e36756ffc78b3f46b2b105ec368d7cf1b2 not found: ID does not exist" containerID="111042c5d981bc0c883c3941c38794e36756ffc78b3f46b2b105ec368d7cf1b2" Oct 14 13:53:39.490669 master-2 kubenswrapper[4762]: I1014 13:53:39.490634 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"111042c5d981bc0c883c3941c38794e36756ffc78b3f46b2b105ec368d7cf1b2"} err="failed to get container status \"111042c5d981bc0c883c3941c38794e36756ffc78b3f46b2b105ec368d7cf1b2\": rpc error: code = NotFound desc = could not find container \"111042c5d981bc0c883c3941c38794e36756ffc78b3f46b2b105ec368d7cf1b2\": container with ID starting with 111042c5d981bc0c883c3941c38794e36756ffc78b3f46b2b105ec368d7cf1b2 not found: ID does not exist" Oct 14 13:53:39.490669 master-2 kubenswrapper[4762]: I1014 13:53:39.490667 4762 scope.go:117] "RemoveContainer" containerID="77da418864defdc1de9ea38582c6c00a73df92f0eb06c627d4d52d77dbe3d84c" Oct 14 13:53:39.492582 master-2 kubenswrapper[4762]: E1014 13:53:39.492541 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77da418864defdc1de9ea38582c6c00a73df92f0eb06c627d4d52d77dbe3d84c\": container with ID starting with 77da418864defdc1de9ea38582c6c00a73df92f0eb06c627d4d52d77dbe3d84c not found: ID does not exist" containerID="77da418864defdc1de9ea38582c6c00a73df92f0eb06c627d4d52d77dbe3d84c" Oct 14 13:53:39.492671 master-2 kubenswrapper[4762]: I1014 13:53:39.492571 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77da418864defdc1de9ea38582c6c00a73df92f0eb06c627d4d52d77dbe3d84c"} err="failed to get container status \"77da418864defdc1de9ea38582c6c00a73df92f0eb06c627d4d52d77dbe3d84c\": rpc error: code = NotFound desc = could not find container \"77da418864defdc1de9ea38582c6c00a73df92f0eb06c627d4d52d77dbe3d84c\": container with ID starting with 77da418864defdc1de9ea38582c6c00a73df92f0eb06c627d4d52d77dbe3d84c not found: ID does not exist" Oct 14 13:53:39.492671 master-2 kubenswrapper[4762]: I1014 13:53:39.492600 4762 scope.go:117] "RemoveContainer" containerID="c157e327ea98eb2327c4f6399809b9a63e294b03eecfd290903446b020b2fa19" Oct 14 13:53:39.493244 master-2 kubenswrapper[4762]: E1014 13:53:39.493182 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c157e327ea98eb2327c4f6399809b9a63e294b03eecfd290903446b020b2fa19\": container with ID starting with 
c157e327ea98eb2327c4f6399809b9a63e294b03eecfd290903446b020b2fa19 not found: ID does not exist" containerID="c157e327ea98eb2327c4f6399809b9a63e294b03eecfd290903446b020b2fa19" Oct 14 13:53:39.493320 master-2 kubenswrapper[4762]: I1014 13:53:39.493259 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c157e327ea98eb2327c4f6399809b9a63e294b03eecfd290903446b020b2fa19"} err="failed to get container status \"c157e327ea98eb2327c4f6399809b9a63e294b03eecfd290903446b020b2fa19\": rpc error: code = NotFound desc = could not find container \"c157e327ea98eb2327c4f6399809b9a63e294b03eecfd290903446b020b2fa19\": container with ID starting with c157e327ea98eb2327c4f6399809b9a63e294b03eecfd290903446b020b2fa19 not found: ID does not exist" Oct 14 13:53:39.496072 master-2 kubenswrapper[4762]: I1014 13:53:39.496047 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b0dd365-7a39-4923-b716-4d5121c1569f-utilities\") pod \"5b0dd365-7a39-4923-b716-4d5121c1569f\" (UID: \"5b0dd365-7a39-4923-b716-4d5121c1569f\") " Oct 14 13:53:39.496231 master-2 kubenswrapper[4762]: I1014 13:53:39.496216 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b449h\" (UniqueName: \"kubernetes.io/projected/5b0dd365-7a39-4923-b716-4d5121c1569f-kube-api-access-b449h\") pod \"5b0dd365-7a39-4923-b716-4d5121c1569f\" (UID: \"5b0dd365-7a39-4923-b716-4d5121c1569f\") " Oct 14 13:53:39.496329 master-2 kubenswrapper[4762]: I1014 13:53:39.496317 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b0dd365-7a39-4923-b716-4d5121c1569f-catalog-content\") pod \"5b0dd365-7a39-4923-b716-4d5121c1569f\" (UID: \"5b0dd365-7a39-4923-b716-4d5121c1569f\") " Oct 14 13:53:39.500461 master-2 kubenswrapper[4762]: I1014 13:53:39.497676 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b0dd365-7a39-4923-b716-4d5121c1569f-utilities" (OuterVolumeSpecName: "utilities") pod "5b0dd365-7a39-4923-b716-4d5121c1569f" (UID: "5b0dd365-7a39-4923-b716-4d5121c1569f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:53:39.500889 master-2 kubenswrapper[4762]: I1014 13:53:39.500836 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b0dd365-7a39-4923-b716-4d5121c1569f-utilities\") on node \"master-2\" DevicePath \"\"" Oct 14 13:53:39.501966 master-2 kubenswrapper[4762]: I1014 13:53:39.501884 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b0dd365-7a39-4923-b716-4d5121c1569f-kube-api-access-b449h" (OuterVolumeSpecName: "kube-api-access-b449h") pod "5b0dd365-7a39-4923-b716-4d5121c1569f" (UID: "5b0dd365-7a39-4923-b716-4d5121c1569f"). InnerVolumeSpecName "kube-api-access-b449h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:53:39.574343 master-2 kubenswrapper[4762]: I1014 13:53:39.574244 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b0dd365-7a39-4923-b716-4d5121c1569f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5b0dd365-7a39-4923-b716-4d5121c1569f" (UID: "5b0dd365-7a39-4923-b716-4d5121c1569f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:53:39.602726 master-2 kubenswrapper[4762]: I1014 13:53:39.602649 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b449h\" (UniqueName: \"kubernetes.io/projected/5b0dd365-7a39-4923-b716-4d5121c1569f-kube-api-access-b449h\") on node \"master-2\" DevicePath \"\"" Oct 14 13:53:39.602726 master-2 kubenswrapper[4762]: I1014 13:53:39.602695 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b0dd365-7a39-4923-b716-4d5121c1569f-catalog-content\") on node \"master-2\" DevicePath \"\"" Oct 14 13:53:39.775126 master-2 kubenswrapper[4762]: I1014 13:53:39.774883 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jdhvr"] Oct 14 13:53:39.790640 master-2 kubenswrapper[4762]: I1014 13:53:39.790554 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jdhvr"] Oct 14 13:53:41.564970 master-2 kubenswrapper[4762]: I1014 13:53:41.564747 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b0dd365-7a39-4923-b716-4d5121c1569f" path="/var/lib/kubelet/pods/5b0dd365-7a39-4923-b716-4d5121c1569f/volumes" Oct 14 13:54:16.149326 master-2 kubenswrapper[4762]: I1014 13:54:16.149145 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ssf75"] Oct 14 13:54:16.150610 master-2 kubenswrapper[4762]: E1014 13:54:16.149840 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b0dd365-7a39-4923-b716-4d5121c1569f" containerName="registry-server" Oct 14 13:54:16.150610 master-2 kubenswrapper[4762]: I1014 13:54:16.149860 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b0dd365-7a39-4923-b716-4d5121c1569f" containerName="registry-server" Oct 14 13:54:16.150610 master-2 kubenswrapper[4762]: E1014 13:54:16.149891 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b0dd365-7a39-4923-b716-4d5121c1569f" containerName="extract-content" Oct 14 13:54:16.150610 master-2 kubenswrapper[4762]: I1014 13:54:16.149897 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b0dd365-7a39-4923-b716-4d5121c1569f" containerName="extract-content" Oct 14 13:54:16.150610 master-2 kubenswrapper[4762]: E1014 13:54:16.149919 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b0dd365-7a39-4923-b716-4d5121c1569f" containerName="extract-utilities" Oct 14 13:54:16.150610 master-2 kubenswrapper[4762]: I1014 13:54:16.149930 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b0dd365-7a39-4923-b716-4d5121c1569f" containerName="extract-utilities" Oct 14 13:54:16.150610 master-2 kubenswrapper[4762]: I1014 13:54:16.150179 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b0dd365-7a39-4923-b716-4d5121c1569f" containerName="registry-server" Oct 14 13:54:16.152044 master-2 kubenswrapper[4762]: I1014 13:54:16.151987 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ssf75" Oct 14 13:54:16.154427 master-2 kubenswrapper[4762]: I1014 13:54:16.154380 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/392fc431-4790-499c-b384-d1d007e45e6a-utilities\") pod \"redhat-marketplace-ssf75\" (UID: \"392fc431-4790-499c-b384-d1d007e45e6a\") " pod="openshift-marketplace/redhat-marketplace-ssf75" Oct 14 13:54:16.154559 master-2 kubenswrapper[4762]: I1014 13:54:16.154485 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25g46\" (UniqueName: \"kubernetes.io/projected/392fc431-4790-499c-b384-d1d007e45e6a-kube-api-access-25g46\") pod \"redhat-marketplace-ssf75\" (UID: \"392fc431-4790-499c-b384-d1d007e45e6a\") " pod="openshift-marketplace/redhat-marketplace-ssf75" Oct 14 13:54:16.154635 master-2 kubenswrapper[4762]: I1014 13:54:16.154620 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/392fc431-4790-499c-b384-d1d007e45e6a-catalog-content\") pod \"redhat-marketplace-ssf75\" (UID: \"392fc431-4790-499c-b384-d1d007e45e6a\") " pod="openshift-marketplace/redhat-marketplace-ssf75" Oct 14 13:54:16.159041 master-2 kubenswrapper[4762]: I1014 13:54:16.158958 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ssf75"] Oct 14 13:54:16.259995 master-2 kubenswrapper[4762]: I1014 13:54:16.259893 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/392fc431-4790-499c-b384-d1d007e45e6a-utilities\") pod \"redhat-marketplace-ssf75\" (UID: \"392fc431-4790-499c-b384-d1d007e45e6a\") " pod="openshift-marketplace/redhat-marketplace-ssf75" Oct 14 13:54:16.260448 master-2 kubenswrapper[4762]: I1014 13:54:16.260042 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25g46\" (UniqueName: \"kubernetes.io/projected/392fc431-4790-499c-b384-d1d007e45e6a-kube-api-access-25g46\") pod \"redhat-marketplace-ssf75\" (UID: \"392fc431-4790-499c-b384-d1d007e45e6a\") " pod="openshift-marketplace/redhat-marketplace-ssf75" Oct 14 13:54:16.260448 master-2 kubenswrapper[4762]: I1014 13:54:16.260383 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/392fc431-4790-499c-b384-d1d007e45e6a-catalog-content\") pod \"redhat-marketplace-ssf75\" (UID: \"392fc431-4790-499c-b384-d1d007e45e6a\") " pod="openshift-marketplace/redhat-marketplace-ssf75" Oct 14 13:54:16.260847 master-2 kubenswrapper[4762]: I1014 13:54:16.260784 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/392fc431-4790-499c-b384-d1d007e45e6a-utilities\") pod \"redhat-marketplace-ssf75\" (UID: \"392fc431-4790-499c-b384-d1d007e45e6a\") " pod="openshift-marketplace/redhat-marketplace-ssf75" Oct 14 13:54:16.261290 master-2 kubenswrapper[4762]: I1014 13:54:16.261189 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/392fc431-4790-499c-b384-d1d007e45e6a-catalog-content\") pod \"redhat-marketplace-ssf75\" (UID: \"392fc431-4790-499c-b384-d1d007e45e6a\") " pod="openshift-marketplace/redhat-marketplace-ssf75" Oct 14 
13:54:16.295338 master-2 kubenswrapper[4762]: I1014 13:54:16.294809 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25g46\" (UniqueName: \"kubernetes.io/projected/392fc431-4790-499c-b384-d1d007e45e6a-kube-api-access-25g46\") pod \"redhat-marketplace-ssf75\" (UID: \"392fc431-4790-499c-b384-d1d007e45e6a\") " pod="openshift-marketplace/redhat-marketplace-ssf75" Oct 14 13:54:16.487483 master-2 kubenswrapper[4762]: I1014 13:54:16.487423 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ssf75" Oct 14 13:54:16.934009 master-2 kubenswrapper[4762]: I1014 13:54:16.933907 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ssf75"] Oct 14 13:54:17.810420 master-2 kubenswrapper[4762]: I1014 13:54:17.810303 4762 generic.go:334] "Generic (PLEG): container finished" podID="392fc431-4790-499c-b384-d1d007e45e6a" containerID="31d399248e3f9d6f98ee94e84b8f2b5a4747f3bc3fcaf6251a90726ace8a27cb" exitCode=0 Oct 14 13:54:17.810420 master-2 kubenswrapper[4762]: I1014 13:54:17.810356 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ssf75" event={"ID":"392fc431-4790-499c-b384-d1d007e45e6a","Type":"ContainerDied","Data":"31d399248e3f9d6f98ee94e84b8f2b5a4747f3bc3fcaf6251a90726ace8a27cb"} Oct 14 13:54:17.810420 master-2 kubenswrapper[4762]: I1014 13:54:17.810388 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ssf75" event={"ID":"392fc431-4790-499c-b384-d1d007e45e6a","Type":"ContainerStarted","Data":"00ab34b0ef20603d3b0341d6ef5a1750a483c87cf0c7fb9af65469d3f9e2997a"} Oct 14 13:54:18.820177 master-2 kubenswrapper[4762]: I1014 13:54:18.819949 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ssf75" event={"ID":"392fc431-4790-499c-b384-d1d007e45e6a","Type":"ContainerStarted","Data":"e7bfce7d4c9f703ee373ad20f6e5383fb72d637531bd4e2ab7544110081ea1b9"} Oct 14 13:54:19.832949 master-2 kubenswrapper[4762]: I1014 13:54:19.832846 4762 generic.go:334] "Generic (PLEG): container finished" podID="392fc431-4790-499c-b384-d1d007e45e6a" containerID="e7bfce7d4c9f703ee373ad20f6e5383fb72d637531bd4e2ab7544110081ea1b9" exitCode=0 Oct 14 13:54:19.833860 master-2 kubenswrapper[4762]: I1014 13:54:19.832956 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ssf75" event={"ID":"392fc431-4790-499c-b384-d1d007e45e6a","Type":"ContainerDied","Data":"e7bfce7d4c9f703ee373ad20f6e5383fb72d637531bd4e2ab7544110081ea1b9"} Oct 14 13:54:20.844256 master-2 kubenswrapper[4762]: I1014 13:54:20.844141 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ssf75" event={"ID":"392fc431-4790-499c-b384-d1d007e45e6a","Type":"ContainerStarted","Data":"2f53bcda7ee62e63590f2ffdb50a0462d737ff6a8802555e4811d186390993fb"} Oct 14 13:54:21.588878 master-2 kubenswrapper[4762]: I1014 13:54:21.588769 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ssf75" podStartSLOduration=3.110694989 podStartE2EDuration="5.58874742s" podCreationTimestamp="2025-10-14 13:54:16 +0000 UTC" firstStartedPulling="2025-10-14 13:54:17.814333622 +0000 UTC m=+2887.058492791" lastFinishedPulling="2025-10-14 13:54:20.292386023 +0000 UTC m=+2889.536545222" observedRunningTime="2025-10-14 13:54:21.581436909 +0000 UTC 
m=+2890.825596088" watchObservedRunningTime="2025-10-14 13:54:21.58874742 +0000 UTC m=+2890.832906579" Oct 14 13:54:26.488349 master-2 kubenswrapper[4762]: I1014 13:54:26.488274 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ssf75" Oct 14 13:54:26.489554 master-2 kubenswrapper[4762]: I1014 13:54:26.489351 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ssf75" Oct 14 13:54:26.557139 master-2 kubenswrapper[4762]: I1014 13:54:26.557044 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ssf75" Oct 14 13:54:26.967982 master-2 kubenswrapper[4762]: I1014 13:54:26.967902 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ssf75" Oct 14 13:54:27.046721 master-2 kubenswrapper[4762]: I1014 13:54:27.046608 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ssf75"] Oct 14 13:54:28.924043 master-2 kubenswrapper[4762]: I1014 13:54:28.923920 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ssf75" podUID="392fc431-4790-499c-b384-d1d007e45e6a" containerName="registry-server" containerID="cri-o://2f53bcda7ee62e63590f2ffdb50a0462d737ff6a8802555e4811d186390993fb" gracePeriod=2 Oct 14 13:54:29.530584 master-2 kubenswrapper[4762]: I1014 13:54:29.530506 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ssf75" Oct 14 13:54:29.675669 master-2 kubenswrapper[4762]: I1014 13:54:29.675296 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/392fc431-4790-499c-b384-d1d007e45e6a-catalog-content\") pod \"392fc431-4790-499c-b384-d1d007e45e6a\" (UID: \"392fc431-4790-499c-b384-d1d007e45e6a\") " Oct 14 13:54:29.675669 master-2 kubenswrapper[4762]: I1014 13:54:29.675450 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/392fc431-4790-499c-b384-d1d007e45e6a-utilities\") pod \"392fc431-4790-499c-b384-d1d007e45e6a\" (UID: \"392fc431-4790-499c-b384-d1d007e45e6a\") " Oct 14 13:54:29.675669 master-2 kubenswrapper[4762]: I1014 13:54:29.675482 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25g46\" (UniqueName: \"kubernetes.io/projected/392fc431-4790-499c-b384-d1d007e45e6a-kube-api-access-25g46\") pod \"392fc431-4790-499c-b384-d1d007e45e6a\" (UID: \"392fc431-4790-499c-b384-d1d007e45e6a\") " Oct 14 13:54:29.678181 master-2 kubenswrapper[4762]: I1014 13:54:29.676552 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/392fc431-4790-499c-b384-d1d007e45e6a-utilities" (OuterVolumeSpecName: "utilities") pod "392fc431-4790-499c-b384-d1d007e45e6a" (UID: "392fc431-4790-499c-b384-d1d007e45e6a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:54:29.678181 master-2 kubenswrapper[4762]: I1014 13:54:29.677432 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/392fc431-4790-499c-b384-d1d007e45e6a-utilities\") on node \"master-2\" DevicePath \"\"" Oct 14 13:54:29.679262 master-2 kubenswrapper[4762]: I1014 13:54:29.678814 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/392fc431-4790-499c-b384-d1d007e45e6a-kube-api-access-25g46" (OuterVolumeSpecName: "kube-api-access-25g46") pod "392fc431-4790-499c-b384-d1d007e45e6a" (UID: "392fc431-4790-499c-b384-d1d007e45e6a"). InnerVolumeSpecName "kube-api-access-25g46". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:54:29.689345 master-2 kubenswrapper[4762]: I1014 13:54:29.689281 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/392fc431-4790-499c-b384-d1d007e45e6a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "392fc431-4790-499c-b384-d1d007e45e6a" (UID: "392fc431-4790-499c-b384-d1d007e45e6a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:54:29.779187 master-2 kubenswrapper[4762]: I1014 13:54:29.779101 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25g46\" (UniqueName: \"kubernetes.io/projected/392fc431-4790-499c-b384-d1d007e45e6a-kube-api-access-25g46\") on node \"master-2\" DevicePath \"\"" Oct 14 13:54:29.779187 master-2 kubenswrapper[4762]: I1014 13:54:29.779180 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/392fc431-4790-499c-b384-d1d007e45e6a-catalog-content\") on node \"master-2\" DevicePath \"\"" Oct 14 13:54:29.934958 master-2 kubenswrapper[4762]: I1014 13:54:29.934877 4762 generic.go:334] "Generic (PLEG): container finished" podID="392fc431-4790-499c-b384-d1d007e45e6a" containerID="2f53bcda7ee62e63590f2ffdb50a0462d737ff6a8802555e4811d186390993fb" exitCode=0 Oct 14 13:54:29.934958 master-2 kubenswrapper[4762]: I1014 13:54:29.934929 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ssf75" event={"ID":"392fc431-4790-499c-b384-d1d007e45e6a","Type":"ContainerDied","Data":"2f53bcda7ee62e63590f2ffdb50a0462d737ff6a8802555e4811d186390993fb"} Oct 14 13:54:29.935897 master-2 kubenswrapper[4762]: I1014 13:54:29.934986 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ssf75" Oct 14 13:54:29.935897 master-2 kubenswrapper[4762]: I1014 13:54:29.935039 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ssf75" event={"ID":"392fc431-4790-499c-b384-d1d007e45e6a","Type":"ContainerDied","Data":"00ab34b0ef20603d3b0341d6ef5a1750a483c87cf0c7fb9af65469d3f9e2997a"} Oct 14 13:54:29.935897 master-2 kubenswrapper[4762]: I1014 13:54:29.935069 4762 scope.go:117] "RemoveContainer" containerID="2f53bcda7ee62e63590f2ffdb50a0462d737ff6a8802555e4811d186390993fb" Oct 14 13:54:29.962036 master-2 kubenswrapper[4762]: I1014 13:54:29.961979 4762 scope.go:117] "RemoveContainer" containerID="e7bfce7d4c9f703ee373ad20f6e5383fb72d637531bd4e2ab7544110081ea1b9" Oct 14 13:54:29.987883 master-2 kubenswrapper[4762]: I1014 13:54:29.987808 4762 scope.go:117] "RemoveContainer" containerID="31d399248e3f9d6f98ee94e84b8f2b5a4747f3bc3fcaf6251a90726ace8a27cb" Oct 14 13:54:29.998525 master-2 kubenswrapper[4762]: I1014 13:54:29.998414 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ssf75"] Oct 14 13:54:30.006877 master-2 kubenswrapper[4762]: I1014 13:54:30.006838 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ssf75"] Oct 14 13:54:30.025547 master-2 kubenswrapper[4762]: I1014 13:54:30.025520 4762 scope.go:117] "RemoveContainer" containerID="2f53bcda7ee62e63590f2ffdb50a0462d737ff6a8802555e4811d186390993fb" Oct 14 13:54:30.026491 master-2 kubenswrapper[4762]: E1014 13:54:30.026426 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f53bcda7ee62e63590f2ffdb50a0462d737ff6a8802555e4811d186390993fb\": container with ID starting with 2f53bcda7ee62e63590f2ffdb50a0462d737ff6a8802555e4811d186390993fb not found: ID does not exist" containerID="2f53bcda7ee62e63590f2ffdb50a0462d737ff6a8802555e4811d186390993fb" Oct 14 13:54:30.026562 master-2 kubenswrapper[4762]: I1014 13:54:30.026515 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f53bcda7ee62e63590f2ffdb50a0462d737ff6a8802555e4811d186390993fb"} err="failed to get container status \"2f53bcda7ee62e63590f2ffdb50a0462d737ff6a8802555e4811d186390993fb\": rpc error: code = NotFound desc = could not find container \"2f53bcda7ee62e63590f2ffdb50a0462d737ff6a8802555e4811d186390993fb\": container with ID starting with 2f53bcda7ee62e63590f2ffdb50a0462d737ff6a8802555e4811d186390993fb not found: ID does not exist" Oct 14 13:54:30.026619 master-2 kubenswrapper[4762]: I1014 13:54:30.026561 4762 scope.go:117] "RemoveContainer" containerID="e7bfce7d4c9f703ee373ad20f6e5383fb72d637531bd4e2ab7544110081ea1b9" Oct 14 13:54:30.027139 master-2 kubenswrapper[4762]: E1014 13:54:30.027104 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7bfce7d4c9f703ee373ad20f6e5383fb72d637531bd4e2ab7544110081ea1b9\": container with ID starting with e7bfce7d4c9f703ee373ad20f6e5383fb72d637531bd4e2ab7544110081ea1b9 not found: ID does not exist" containerID="e7bfce7d4c9f703ee373ad20f6e5383fb72d637531bd4e2ab7544110081ea1b9" Oct 14 13:54:30.027363 master-2 kubenswrapper[4762]: I1014 13:54:30.027146 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7bfce7d4c9f703ee373ad20f6e5383fb72d637531bd4e2ab7544110081ea1b9"} err="failed to get 
container status \"e7bfce7d4c9f703ee373ad20f6e5383fb72d637531bd4e2ab7544110081ea1b9\": rpc error: code = NotFound desc = could not find container \"e7bfce7d4c9f703ee373ad20f6e5383fb72d637531bd4e2ab7544110081ea1b9\": container with ID starting with e7bfce7d4c9f703ee373ad20f6e5383fb72d637531bd4e2ab7544110081ea1b9 not found: ID does not exist" Oct 14 13:54:30.027363 master-2 kubenswrapper[4762]: I1014 13:54:30.027358 4762 scope.go:117] "RemoveContainer" containerID="31d399248e3f9d6f98ee94e84b8f2b5a4747f3bc3fcaf6251a90726ace8a27cb" Oct 14 13:54:30.027922 master-2 kubenswrapper[4762]: E1014 13:54:30.027894 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31d399248e3f9d6f98ee94e84b8f2b5a4747f3bc3fcaf6251a90726ace8a27cb\": container with ID starting with 31d399248e3f9d6f98ee94e84b8f2b5a4747f3bc3fcaf6251a90726ace8a27cb not found: ID does not exist" containerID="31d399248e3f9d6f98ee94e84b8f2b5a4747f3bc3fcaf6251a90726ace8a27cb" Oct 14 13:54:30.028045 master-2 kubenswrapper[4762]: I1014 13:54:30.028020 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31d399248e3f9d6f98ee94e84b8f2b5a4747f3bc3fcaf6251a90726ace8a27cb"} err="failed to get container status \"31d399248e3f9d6f98ee94e84b8f2b5a4747f3bc3fcaf6251a90726ace8a27cb\": rpc error: code = NotFound desc = could not find container \"31d399248e3f9d6f98ee94e84b8f2b5a4747f3bc3fcaf6251a90726ace8a27cb\": container with ID starting with 31d399248e3f9d6f98ee94e84b8f2b5a4747f3bc3fcaf6251a90726ace8a27cb not found: ID does not exist" Oct 14 13:54:31.557293 master-2 kubenswrapper[4762]: I1014 13:54:31.557251 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="392fc431-4790-499c-b384-d1d007e45e6a" path="/var/lib/kubelet/pods/392fc431-4790-499c-b384-d1d007e45e6a/volumes" Oct 14 13:54:54.597258 master-2 kubenswrapper[4762]: I1014 13:54:54.597173 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vqdj2"] Oct 14 13:54:54.598093 master-2 kubenswrapper[4762]: E1014 13:54:54.597596 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="392fc431-4790-499c-b384-d1d007e45e6a" containerName="extract-content" Oct 14 13:54:54.598093 master-2 kubenswrapper[4762]: I1014 13:54:54.597614 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="392fc431-4790-499c-b384-d1d007e45e6a" containerName="extract-content" Oct 14 13:54:54.598093 master-2 kubenswrapper[4762]: E1014 13:54:54.597646 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="392fc431-4790-499c-b384-d1d007e45e6a" containerName="registry-server" Oct 14 13:54:54.598093 master-2 kubenswrapper[4762]: I1014 13:54:54.597653 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="392fc431-4790-499c-b384-d1d007e45e6a" containerName="registry-server" Oct 14 13:54:54.598093 master-2 kubenswrapper[4762]: E1014 13:54:54.597665 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="392fc431-4790-499c-b384-d1d007e45e6a" containerName="extract-utilities" Oct 14 13:54:54.598093 master-2 kubenswrapper[4762]: I1014 13:54:54.597676 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="392fc431-4790-499c-b384-d1d007e45e6a" containerName="extract-utilities" Oct 14 13:54:54.598093 master-2 kubenswrapper[4762]: I1014 13:54:54.597824 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="392fc431-4790-499c-b384-d1d007e45e6a" containerName="registry-server" Oct 14 
13:54:54.599327 master-2 kubenswrapper[4762]: I1014 13:54:54.599301 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vqdj2" Oct 14 13:54:54.618759 master-2 kubenswrapper[4762]: I1014 13:54:54.618699 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vqdj2"] Oct 14 13:54:54.652506 master-2 kubenswrapper[4762]: I1014 13:54:54.651887 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96a04f84-57dc-44ea-bfd1-ccdf7142a983-utilities\") pod \"redhat-operators-vqdj2\" (UID: \"96a04f84-57dc-44ea-bfd1-ccdf7142a983\") " pod="openshift-marketplace/redhat-operators-vqdj2" Oct 14 13:54:54.652506 master-2 kubenswrapper[4762]: I1014 13:54:54.652134 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzrgb\" (UniqueName: \"kubernetes.io/projected/96a04f84-57dc-44ea-bfd1-ccdf7142a983-kube-api-access-vzrgb\") pod \"redhat-operators-vqdj2\" (UID: \"96a04f84-57dc-44ea-bfd1-ccdf7142a983\") " pod="openshift-marketplace/redhat-operators-vqdj2" Oct 14 13:54:54.652962 master-2 kubenswrapper[4762]: I1014 13:54:54.652541 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96a04f84-57dc-44ea-bfd1-ccdf7142a983-catalog-content\") pod \"redhat-operators-vqdj2\" (UID: \"96a04f84-57dc-44ea-bfd1-ccdf7142a983\") " pod="openshift-marketplace/redhat-operators-vqdj2" Oct 14 13:54:54.755543 master-2 kubenswrapper[4762]: I1014 13:54:54.755396 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96a04f84-57dc-44ea-bfd1-ccdf7142a983-utilities\") pod \"redhat-operators-vqdj2\" (UID: \"96a04f84-57dc-44ea-bfd1-ccdf7142a983\") " pod="openshift-marketplace/redhat-operators-vqdj2" Oct 14 13:54:54.755543 master-2 kubenswrapper[4762]: I1014 13:54:54.755518 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzrgb\" (UniqueName: \"kubernetes.io/projected/96a04f84-57dc-44ea-bfd1-ccdf7142a983-kube-api-access-vzrgb\") pod \"redhat-operators-vqdj2\" (UID: \"96a04f84-57dc-44ea-bfd1-ccdf7142a983\") " pod="openshift-marketplace/redhat-operators-vqdj2" Oct 14 13:54:54.755940 master-2 kubenswrapper[4762]: I1014 13:54:54.755575 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96a04f84-57dc-44ea-bfd1-ccdf7142a983-catalog-content\") pod \"redhat-operators-vqdj2\" (UID: \"96a04f84-57dc-44ea-bfd1-ccdf7142a983\") " pod="openshift-marketplace/redhat-operators-vqdj2" Oct 14 13:54:54.756272 master-2 kubenswrapper[4762]: I1014 13:54:54.756218 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96a04f84-57dc-44ea-bfd1-ccdf7142a983-utilities\") pod \"redhat-operators-vqdj2\" (UID: \"96a04f84-57dc-44ea-bfd1-ccdf7142a983\") " pod="openshift-marketplace/redhat-operators-vqdj2" Oct 14 13:54:54.756272 master-2 kubenswrapper[4762]: I1014 13:54:54.756258 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96a04f84-57dc-44ea-bfd1-ccdf7142a983-catalog-content\") pod \"redhat-operators-vqdj2\" (UID: 
\"96a04f84-57dc-44ea-bfd1-ccdf7142a983\") " pod="openshift-marketplace/redhat-operators-vqdj2" Oct 14 13:54:54.780067 master-2 kubenswrapper[4762]: I1014 13:54:54.779973 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzrgb\" (UniqueName: \"kubernetes.io/projected/96a04f84-57dc-44ea-bfd1-ccdf7142a983-kube-api-access-vzrgb\") pod \"redhat-operators-vqdj2\" (UID: \"96a04f84-57dc-44ea-bfd1-ccdf7142a983\") " pod="openshift-marketplace/redhat-operators-vqdj2" Oct 14 13:54:54.929839 master-2 kubenswrapper[4762]: I1014 13:54:54.929776 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vqdj2" Oct 14 13:54:55.361692 master-2 kubenswrapper[4762]: I1014 13:54:55.361638 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vqdj2"] Oct 14 13:54:55.366257 master-2 kubenswrapper[4762]: W1014 13:54:55.365915 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96a04f84_57dc_44ea_bfd1_ccdf7142a983.slice/crio-f32b87e1347864c3094acd9c17b4009d89deefee70ba3996c87bb33418e3ba75 WatchSource:0}: Error finding container f32b87e1347864c3094acd9c17b4009d89deefee70ba3996c87bb33418e3ba75: Status 404 returned error can't find the container with id f32b87e1347864c3094acd9c17b4009d89deefee70ba3996c87bb33418e3ba75 Oct 14 13:54:56.189829 master-2 kubenswrapper[4762]: I1014 13:54:56.189740 4762 generic.go:334] "Generic (PLEG): container finished" podID="96a04f84-57dc-44ea-bfd1-ccdf7142a983" containerID="0e11716b92bccdac720ace6c0aa04d0db0e5bee2c435d1504c03cfea3b3e4c82" exitCode=0 Oct 14 13:54:56.190655 master-2 kubenswrapper[4762]: I1014 13:54:56.189801 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vqdj2" event={"ID":"96a04f84-57dc-44ea-bfd1-ccdf7142a983","Type":"ContainerDied","Data":"0e11716b92bccdac720ace6c0aa04d0db0e5bee2c435d1504c03cfea3b3e4c82"} Oct 14 13:54:56.190655 master-2 kubenswrapper[4762]: I1014 13:54:56.189883 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vqdj2" event={"ID":"96a04f84-57dc-44ea-bfd1-ccdf7142a983","Type":"ContainerStarted","Data":"f32b87e1347864c3094acd9c17b4009d89deefee70ba3996c87bb33418e3ba75"} Oct 14 13:54:58.223060 master-2 kubenswrapper[4762]: I1014 13:54:58.222991 4762 generic.go:334] "Generic (PLEG): container finished" podID="96a04f84-57dc-44ea-bfd1-ccdf7142a983" containerID="db59f153034b0c50481c40f8f72498b91005ea0c2e0d084e507e60ffccc636da" exitCode=0 Oct 14 13:54:58.223060 master-2 kubenswrapper[4762]: I1014 13:54:58.223046 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vqdj2" event={"ID":"96a04f84-57dc-44ea-bfd1-ccdf7142a983","Type":"ContainerDied","Data":"db59f153034b0c50481c40f8f72498b91005ea0c2e0d084e507e60ffccc636da"} Oct 14 13:55:03.283253 master-2 kubenswrapper[4762]: I1014 13:55:03.283186 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vqdj2" event={"ID":"96a04f84-57dc-44ea-bfd1-ccdf7142a983","Type":"ContainerStarted","Data":"a759ed51f82785bff56dcc45e5f98a4bf51a1f2fd916b7783f1aa7d26cd7750d"} Oct 14 13:55:03.330548 master-2 kubenswrapper[4762]: I1014 13:55:03.330466 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vqdj2" podStartSLOduration=2.680402598 
podStartE2EDuration="9.33043309s" podCreationTimestamp="2025-10-14 13:54:54 +0000 UTC" firstStartedPulling="2025-10-14 13:54:56.191728039 +0000 UTC m=+2925.435887198" lastFinishedPulling="2025-10-14 13:55:02.841758531 +0000 UTC m=+2932.085917690" observedRunningTime="2025-10-14 13:55:03.317607747 +0000 UTC m=+2932.561766956" watchObservedRunningTime="2025-10-14 13:55:03.33043309 +0000 UTC m=+2932.574592249" Oct 14 13:55:04.930132 master-2 kubenswrapper[4762]: I1014 13:55:04.930062 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vqdj2" Oct 14 13:55:04.931270 master-2 kubenswrapper[4762]: I1014 13:55:04.930318 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vqdj2" Oct 14 13:55:05.986729 master-2 kubenswrapper[4762]: I1014 13:55:05.986626 4762 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vqdj2" podUID="96a04f84-57dc-44ea-bfd1-ccdf7142a983" containerName="registry-server" probeResult="failure" output=< Oct 14 13:55:05.986729 master-2 kubenswrapper[4762]: timeout: failed to connect service ":50051" within 1s Oct 14 13:55:05.986729 master-2 kubenswrapper[4762]: > Oct 14 13:55:15.015215 master-2 kubenswrapper[4762]: I1014 13:55:15.015029 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vqdj2" Oct 14 13:55:15.082942 master-2 kubenswrapper[4762]: I1014 13:55:15.082855 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vqdj2" Oct 14 13:55:15.572235 master-2 kubenswrapper[4762]: I1014 13:55:15.572018 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vqdj2"] Oct 14 13:55:16.422421 master-2 kubenswrapper[4762]: I1014 13:55:16.422306 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vqdj2" podUID="96a04f84-57dc-44ea-bfd1-ccdf7142a983" containerName="registry-server" containerID="cri-o://a759ed51f82785bff56dcc45e5f98a4bf51a1f2fd916b7783f1aa7d26cd7750d" gracePeriod=2 Oct 14 13:55:17.048746 master-2 kubenswrapper[4762]: I1014 13:55:17.048664 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vqdj2" Oct 14 13:55:17.127709 master-2 kubenswrapper[4762]: I1014 13:55:17.127623 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzrgb\" (UniqueName: \"kubernetes.io/projected/96a04f84-57dc-44ea-bfd1-ccdf7142a983-kube-api-access-vzrgb\") pod \"96a04f84-57dc-44ea-bfd1-ccdf7142a983\" (UID: \"96a04f84-57dc-44ea-bfd1-ccdf7142a983\") " Oct 14 13:55:17.127949 master-2 kubenswrapper[4762]: I1014 13:55:17.127901 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96a04f84-57dc-44ea-bfd1-ccdf7142a983-utilities\") pod \"96a04f84-57dc-44ea-bfd1-ccdf7142a983\" (UID: \"96a04f84-57dc-44ea-bfd1-ccdf7142a983\") " Oct 14 13:55:17.128116 master-2 kubenswrapper[4762]: I1014 13:55:17.128063 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96a04f84-57dc-44ea-bfd1-ccdf7142a983-catalog-content\") pod \"96a04f84-57dc-44ea-bfd1-ccdf7142a983\" (UID: \"96a04f84-57dc-44ea-bfd1-ccdf7142a983\") " Oct 14 13:55:17.129977 master-2 kubenswrapper[4762]: I1014 13:55:17.129896 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96a04f84-57dc-44ea-bfd1-ccdf7142a983-utilities" (OuterVolumeSpecName: "utilities") pod "96a04f84-57dc-44ea-bfd1-ccdf7142a983" (UID: "96a04f84-57dc-44ea-bfd1-ccdf7142a983"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:55:17.132603 master-2 kubenswrapper[4762]: I1014 13:55:17.132524 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96a04f84-57dc-44ea-bfd1-ccdf7142a983-kube-api-access-vzrgb" (OuterVolumeSpecName: "kube-api-access-vzrgb") pod "96a04f84-57dc-44ea-bfd1-ccdf7142a983" (UID: "96a04f84-57dc-44ea-bfd1-ccdf7142a983"). InnerVolumeSpecName "kube-api-access-vzrgb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:55:17.231724 master-2 kubenswrapper[4762]: I1014 13:55:17.231655 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzrgb\" (UniqueName: \"kubernetes.io/projected/96a04f84-57dc-44ea-bfd1-ccdf7142a983-kube-api-access-vzrgb\") on node \"master-2\" DevicePath \"\"" Oct 14 13:55:17.231724 master-2 kubenswrapper[4762]: I1014 13:55:17.231718 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96a04f84-57dc-44ea-bfd1-ccdf7142a983-utilities\") on node \"master-2\" DevicePath \"\"" Oct 14 13:55:17.292057 master-2 kubenswrapper[4762]: I1014 13:55:17.291975 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96a04f84-57dc-44ea-bfd1-ccdf7142a983-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "96a04f84-57dc-44ea-bfd1-ccdf7142a983" (UID: "96a04f84-57dc-44ea-bfd1-ccdf7142a983"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:55:17.334657 master-2 kubenswrapper[4762]: I1014 13:55:17.334572 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96a04f84-57dc-44ea-bfd1-ccdf7142a983-catalog-content\") on node \"master-2\" DevicePath \"\"" Oct 14 13:55:17.438532 master-2 kubenswrapper[4762]: I1014 13:55:17.438486 4762 generic.go:334] "Generic (PLEG): container finished" podID="96a04f84-57dc-44ea-bfd1-ccdf7142a983" containerID="a759ed51f82785bff56dcc45e5f98a4bf51a1f2fd916b7783f1aa7d26cd7750d" exitCode=0 Oct 14 13:55:17.439115 master-2 kubenswrapper[4762]: I1014 13:55:17.438572 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vqdj2" Oct 14 13:55:17.439356 master-2 kubenswrapper[4762]: I1014 13:55:17.438599 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vqdj2" event={"ID":"96a04f84-57dc-44ea-bfd1-ccdf7142a983","Type":"ContainerDied","Data":"a759ed51f82785bff56dcc45e5f98a4bf51a1f2fd916b7783f1aa7d26cd7750d"} Oct 14 13:55:17.439420 master-2 kubenswrapper[4762]: I1014 13:55:17.439386 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vqdj2" event={"ID":"96a04f84-57dc-44ea-bfd1-ccdf7142a983","Type":"ContainerDied","Data":"f32b87e1347864c3094acd9c17b4009d89deefee70ba3996c87bb33418e3ba75"} Oct 14 13:55:17.439456 master-2 kubenswrapper[4762]: I1014 13:55:17.439427 4762 scope.go:117] "RemoveContainer" containerID="a759ed51f82785bff56dcc45e5f98a4bf51a1f2fd916b7783f1aa7d26cd7750d" Oct 14 13:55:17.464026 master-2 kubenswrapper[4762]: I1014 13:55:17.463999 4762 scope.go:117] "RemoveContainer" containerID="db59f153034b0c50481c40f8f72498b91005ea0c2e0d084e507e60ffccc636da" Oct 14 13:55:17.487826 master-2 kubenswrapper[4762]: I1014 13:55:17.487781 4762 scope.go:117] "RemoveContainer" containerID="0e11716b92bccdac720ace6c0aa04d0db0e5bee2c435d1504c03cfea3b3e4c82" Oct 14 13:55:17.529119 master-2 kubenswrapper[4762]: I1014 13:55:17.529085 4762 scope.go:117] "RemoveContainer" containerID="a759ed51f82785bff56dcc45e5f98a4bf51a1f2fd916b7783f1aa7d26cd7750d" Oct 14 13:55:17.529774 master-2 kubenswrapper[4762]: E1014 13:55:17.529749 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a759ed51f82785bff56dcc45e5f98a4bf51a1f2fd916b7783f1aa7d26cd7750d\": container with ID starting with a759ed51f82785bff56dcc45e5f98a4bf51a1f2fd916b7783f1aa7d26cd7750d not found: ID does not exist" containerID="a759ed51f82785bff56dcc45e5f98a4bf51a1f2fd916b7783f1aa7d26cd7750d" Oct 14 13:55:17.529878 master-2 kubenswrapper[4762]: I1014 13:55:17.529848 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a759ed51f82785bff56dcc45e5f98a4bf51a1f2fd916b7783f1aa7d26cd7750d"} err="failed to get container status \"a759ed51f82785bff56dcc45e5f98a4bf51a1f2fd916b7783f1aa7d26cd7750d\": rpc error: code = NotFound desc = could not find container \"a759ed51f82785bff56dcc45e5f98a4bf51a1f2fd916b7783f1aa7d26cd7750d\": container with ID starting with a759ed51f82785bff56dcc45e5f98a4bf51a1f2fd916b7783f1aa7d26cd7750d not found: ID does not exist" Oct 14 13:55:17.529957 master-2 kubenswrapper[4762]: I1014 13:55:17.529941 4762 scope.go:117] "RemoveContainer" containerID="db59f153034b0c50481c40f8f72498b91005ea0c2e0d084e507e60ffccc636da" Oct 14 13:55:17.530555 master-2 
kubenswrapper[4762]: E1014 13:55:17.530537 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db59f153034b0c50481c40f8f72498b91005ea0c2e0d084e507e60ffccc636da\": container with ID starting with db59f153034b0c50481c40f8f72498b91005ea0c2e0d084e507e60ffccc636da not found: ID does not exist" containerID="db59f153034b0c50481c40f8f72498b91005ea0c2e0d084e507e60ffccc636da" Oct 14 13:55:17.530663 master-2 kubenswrapper[4762]: I1014 13:55:17.530643 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db59f153034b0c50481c40f8f72498b91005ea0c2e0d084e507e60ffccc636da"} err="failed to get container status \"db59f153034b0c50481c40f8f72498b91005ea0c2e0d084e507e60ffccc636da\": rpc error: code = NotFound desc = could not find container \"db59f153034b0c50481c40f8f72498b91005ea0c2e0d084e507e60ffccc636da\": container with ID starting with db59f153034b0c50481c40f8f72498b91005ea0c2e0d084e507e60ffccc636da not found: ID does not exist" Oct 14 13:55:17.530735 master-2 kubenswrapper[4762]: I1014 13:55:17.530724 4762 scope.go:117] "RemoveContainer" containerID="0e11716b92bccdac720ace6c0aa04d0db0e5bee2c435d1504c03cfea3b3e4c82" Oct 14 13:55:17.531424 master-2 kubenswrapper[4762]: E1014 13:55:17.531408 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e11716b92bccdac720ace6c0aa04d0db0e5bee2c435d1504c03cfea3b3e4c82\": container with ID starting with 0e11716b92bccdac720ace6c0aa04d0db0e5bee2c435d1504c03cfea3b3e4c82 not found: ID does not exist" containerID="0e11716b92bccdac720ace6c0aa04d0db0e5bee2c435d1504c03cfea3b3e4c82" Oct 14 13:55:17.531510 master-2 kubenswrapper[4762]: I1014 13:55:17.531495 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e11716b92bccdac720ace6c0aa04d0db0e5bee2c435d1504c03cfea3b3e4c82"} err="failed to get container status \"0e11716b92bccdac720ace6c0aa04d0db0e5bee2c435d1504c03cfea3b3e4c82\": rpc error: code = NotFound desc = could not find container \"0e11716b92bccdac720ace6c0aa04d0db0e5bee2c435d1504c03cfea3b3e4c82\": container with ID starting with 0e11716b92bccdac720ace6c0aa04d0db0e5bee2c435d1504c03cfea3b3e4c82 not found: ID does not exist" Oct 14 13:55:17.568002 master-2 kubenswrapper[4762]: I1014 13:55:17.567891 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vqdj2"] Oct 14 13:55:17.709096 master-2 kubenswrapper[4762]: I1014 13:55:17.708908 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vqdj2"] Oct 14 13:55:19.565298 master-2 kubenswrapper[4762]: I1014 13:55:19.565212 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96a04f84-57dc-44ea-bfd1-ccdf7142a983" path="/var/lib/kubelet/pods/96a04f84-57dc-44ea-bfd1-ccdf7142a983/volumes" Oct 14 13:57:11.353950 master-2 kubenswrapper[4762]: I1014 13:57:11.353837 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qljc7"] Oct 14 13:57:11.355552 master-2 kubenswrapper[4762]: E1014 13:57:11.354447 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96a04f84-57dc-44ea-bfd1-ccdf7142a983" containerName="extract-content" Oct 14 13:57:11.355552 master-2 kubenswrapper[4762]: I1014 13:57:11.354467 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="96a04f84-57dc-44ea-bfd1-ccdf7142a983" containerName="extract-content" 
Oct 14 13:57:11.355552 master-2 kubenswrapper[4762]: E1014 13:57:11.354487 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96a04f84-57dc-44ea-bfd1-ccdf7142a983" containerName="registry-server" Oct 14 13:57:11.355552 master-2 kubenswrapper[4762]: I1014 13:57:11.354495 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="96a04f84-57dc-44ea-bfd1-ccdf7142a983" containerName="registry-server" Oct 14 13:57:11.355552 master-2 kubenswrapper[4762]: E1014 13:57:11.354526 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96a04f84-57dc-44ea-bfd1-ccdf7142a983" containerName="extract-utilities" Oct 14 13:57:11.355552 master-2 kubenswrapper[4762]: I1014 13:57:11.354537 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="96a04f84-57dc-44ea-bfd1-ccdf7142a983" containerName="extract-utilities" Oct 14 13:57:11.355552 master-2 kubenswrapper[4762]: I1014 13:57:11.354746 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="96a04f84-57dc-44ea-bfd1-ccdf7142a983" containerName="registry-server" Oct 14 13:57:11.356774 master-2 kubenswrapper[4762]: I1014 13:57:11.356666 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qljc7" Oct 14 13:57:11.412471 master-2 kubenswrapper[4762]: I1014 13:57:11.412249 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m46j\" (UniqueName: \"kubernetes.io/projected/49727a24-f6dc-4f08-a192-a126571e397d-kube-api-access-2m46j\") pod \"certified-operators-qljc7\" (UID: \"49727a24-f6dc-4f08-a192-a126571e397d\") " pod="openshift-marketplace/certified-operators-qljc7" Oct 14 13:57:11.412471 master-2 kubenswrapper[4762]: I1014 13:57:11.412402 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49727a24-f6dc-4f08-a192-a126571e397d-utilities\") pod \"certified-operators-qljc7\" (UID: \"49727a24-f6dc-4f08-a192-a126571e397d\") " pod="openshift-marketplace/certified-operators-qljc7" Oct 14 13:57:11.413004 master-2 kubenswrapper[4762]: I1014 13:57:11.412493 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49727a24-f6dc-4f08-a192-a126571e397d-catalog-content\") pod \"certified-operators-qljc7\" (UID: \"49727a24-f6dc-4f08-a192-a126571e397d\") " pod="openshift-marketplace/certified-operators-qljc7" Oct 14 13:57:11.420348 master-2 kubenswrapper[4762]: I1014 13:57:11.419619 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qljc7"] Oct 14 13:57:11.514057 master-2 kubenswrapper[4762]: I1014 13:57:11.513978 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49727a24-f6dc-4f08-a192-a126571e397d-catalog-content\") pod \"certified-operators-qljc7\" (UID: \"49727a24-f6dc-4f08-a192-a126571e397d\") " pod="openshift-marketplace/certified-operators-qljc7" Oct 14 13:57:11.514379 master-2 kubenswrapper[4762]: I1014 13:57:11.514099 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2m46j\" (UniqueName: \"kubernetes.io/projected/49727a24-f6dc-4f08-a192-a126571e397d-kube-api-access-2m46j\") pod \"certified-operators-qljc7\" (UID: \"49727a24-f6dc-4f08-a192-a126571e397d\") " 
pod="openshift-marketplace/certified-operators-qljc7" Oct 14 13:57:11.514379 master-2 kubenswrapper[4762]: I1014 13:57:11.514210 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49727a24-f6dc-4f08-a192-a126571e397d-utilities\") pod \"certified-operators-qljc7\" (UID: \"49727a24-f6dc-4f08-a192-a126571e397d\") " pod="openshift-marketplace/certified-operators-qljc7" Oct 14 13:57:11.514768 master-2 kubenswrapper[4762]: I1014 13:57:11.514731 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49727a24-f6dc-4f08-a192-a126571e397d-utilities\") pod \"certified-operators-qljc7\" (UID: \"49727a24-f6dc-4f08-a192-a126571e397d\") " pod="openshift-marketplace/certified-operators-qljc7" Oct 14 13:57:11.515055 master-2 kubenswrapper[4762]: I1014 13:57:11.515024 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49727a24-f6dc-4f08-a192-a126571e397d-catalog-content\") pod \"certified-operators-qljc7\" (UID: \"49727a24-f6dc-4f08-a192-a126571e397d\") " pod="openshift-marketplace/certified-operators-qljc7" Oct 14 13:57:11.540194 master-2 kubenswrapper[4762]: I1014 13:57:11.534054 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m46j\" (UniqueName: \"kubernetes.io/projected/49727a24-f6dc-4f08-a192-a126571e397d-kube-api-access-2m46j\") pod \"certified-operators-qljc7\" (UID: \"49727a24-f6dc-4f08-a192-a126571e397d\") " pod="openshift-marketplace/certified-operators-qljc7" Oct 14 13:57:11.728181 master-2 kubenswrapper[4762]: I1014 13:57:11.727531 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qljc7" Oct 14 13:57:12.184212 master-2 kubenswrapper[4762]: I1014 13:57:12.182967 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qljc7"] Oct 14 13:57:12.186365 master-2 kubenswrapper[4762]: W1014 13:57:12.185924 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49727a24_f6dc_4f08_a192_a126571e397d.slice/crio-18e96808953fd418859e477ff3f90f496b7b5c9cd9f23243ed390f9dc08de8de WatchSource:0}: Error finding container 18e96808953fd418859e477ff3f90f496b7b5c9cd9f23243ed390f9dc08de8de: Status 404 returned error can't find the container with id 18e96808953fd418859e477ff3f90f496b7b5c9cd9f23243ed390f9dc08de8de Oct 14 13:57:12.654563 master-2 kubenswrapper[4762]: I1014 13:57:12.654382 4762 generic.go:334] "Generic (PLEG): container finished" podID="49727a24-f6dc-4f08-a192-a126571e397d" containerID="d75dddb2f4d51a43654587394894e054663ce245145ee85cc72edaadeaa2e924" exitCode=0 Oct 14 13:57:12.654563 master-2 kubenswrapper[4762]: I1014 13:57:12.654448 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qljc7" event={"ID":"49727a24-f6dc-4f08-a192-a126571e397d","Type":"ContainerDied","Data":"d75dddb2f4d51a43654587394894e054663ce245145ee85cc72edaadeaa2e924"} Oct 14 13:57:12.654563 master-2 kubenswrapper[4762]: I1014 13:57:12.654513 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qljc7" event={"ID":"49727a24-f6dc-4f08-a192-a126571e397d","Type":"ContainerStarted","Data":"18e96808953fd418859e477ff3f90f496b7b5c9cd9f23243ed390f9dc08de8de"} Oct 14 13:57:14.673471 master-2 kubenswrapper[4762]: I1014 13:57:14.673418 4762 generic.go:334] "Generic (PLEG): container finished" podID="49727a24-f6dc-4f08-a192-a126571e397d" containerID="45f102c753e3d3c8820c00084e959f3ff28c755e659a7b175b0a1044979f4e67" exitCode=0 Oct 14 13:57:14.673983 master-2 kubenswrapper[4762]: I1014 13:57:14.673481 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qljc7" event={"ID":"49727a24-f6dc-4f08-a192-a126571e397d","Type":"ContainerDied","Data":"45f102c753e3d3c8820c00084e959f3ff28c755e659a7b175b0a1044979f4e67"} Oct 14 13:57:15.686807 master-2 kubenswrapper[4762]: I1014 13:57:15.686706 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qljc7" event={"ID":"49727a24-f6dc-4f08-a192-a126571e397d","Type":"ContainerStarted","Data":"fcaad291def7a504c2b0db40cac3e9036bf2c854e5facc21f571a53506a0bc5c"} Oct 14 13:57:15.716330 master-2 kubenswrapper[4762]: I1014 13:57:15.716231 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qljc7" podStartSLOduration=2.254938806 podStartE2EDuration="4.716196058s" podCreationTimestamp="2025-10-14 13:57:11 +0000 UTC" firstStartedPulling="2025-10-14 13:57:12.656813255 +0000 UTC m=+3061.900972424" lastFinishedPulling="2025-10-14 13:57:15.118070517 +0000 UTC m=+3064.362229676" observedRunningTime="2025-10-14 13:57:15.714798652 +0000 UTC m=+3064.958957821" watchObservedRunningTime="2025-10-14 13:57:15.716196058 +0000 UTC m=+3064.960355257" Oct 14 13:57:21.729583 master-2 kubenswrapper[4762]: I1014 13:57:21.729438 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qljc7" Oct 14 
13:57:21.729583 master-2 kubenswrapper[4762]: I1014 13:57:21.729505 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qljc7" Oct 14 13:57:21.773664 master-2 kubenswrapper[4762]: I1014 13:57:21.773584 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qljc7" Oct 14 13:57:21.830041 master-2 kubenswrapper[4762]: I1014 13:57:21.829957 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qljc7" Oct 14 13:57:22.032000 master-2 kubenswrapper[4762]: I1014 13:57:22.031832 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qljc7"] Oct 14 13:57:23.762290 master-2 kubenswrapper[4762]: I1014 13:57:23.762145 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qljc7" podUID="49727a24-f6dc-4f08-a192-a126571e397d" containerName="registry-server" containerID="cri-o://fcaad291def7a504c2b0db40cac3e9036bf2c854e5facc21f571a53506a0bc5c" gracePeriod=2 Oct 14 13:57:24.369911 master-2 kubenswrapper[4762]: I1014 13:57:24.369843 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qljc7" Oct 14 13:57:24.513819 master-2 kubenswrapper[4762]: I1014 13:57:24.513755 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49727a24-f6dc-4f08-a192-a126571e397d-catalog-content\") pod \"49727a24-f6dc-4f08-a192-a126571e397d\" (UID: \"49727a24-f6dc-4f08-a192-a126571e397d\") " Oct 14 13:57:24.514095 master-2 kubenswrapper[4762]: I1014 13:57:24.513970 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49727a24-f6dc-4f08-a192-a126571e397d-utilities\") pod \"49727a24-f6dc-4f08-a192-a126571e397d\" (UID: \"49727a24-f6dc-4f08-a192-a126571e397d\") " Oct 14 13:57:24.514227 master-2 kubenswrapper[4762]: I1014 13:57:24.514153 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2m46j\" (UniqueName: \"kubernetes.io/projected/49727a24-f6dc-4f08-a192-a126571e397d-kube-api-access-2m46j\") pod \"49727a24-f6dc-4f08-a192-a126571e397d\" (UID: \"49727a24-f6dc-4f08-a192-a126571e397d\") " Oct 14 13:57:24.515465 master-2 kubenswrapper[4762]: I1014 13:57:24.515390 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49727a24-f6dc-4f08-a192-a126571e397d-utilities" (OuterVolumeSpecName: "utilities") pod "49727a24-f6dc-4f08-a192-a126571e397d" (UID: "49727a24-f6dc-4f08-a192-a126571e397d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:57:24.519461 master-2 kubenswrapper[4762]: I1014 13:57:24.519405 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49727a24-f6dc-4f08-a192-a126571e397d-kube-api-access-2m46j" (OuterVolumeSpecName: "kube-api-access-2m46j") pod "49727a24-f6dc-4f08-a192-a126571e397d" (UID: "49727a24-f6dc-4f08-a192-a126571e397d"). InnerVolumeSpecName "kube-api-access-2m46j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 13:57:24.572457 master-2 kubenswrapper[4762]: I1014 13:57:24.572353 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49727a24-f6dc-4f08-a192-a126571e397d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "49727a24-f6dc-4f08-a192-a126571e397d" (UID: "49727a24-f6dc-4f08-a192-a126571e397d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 13:57:24.617676 master-2 kubenswrapper[4762]: I1014 13:57:24.617580 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49727a24-f6dc-4f08-a192-a126571e397d-utilities\") on node \"master-2\" DevicePath \"\"" Oct 14 13:57:24.617676 master-2 kubenswrapper[4762]: I1014 13:57:24.617663 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2m46j\" (UniqueName: \"kubernetes.io/projected/49727a24-f6dc-4f08-a192-a126571e397d-kube-api-access-2m46j\") on node \"master-2\" DevicePath \"\"" Oct 14 13:57:24.617676 master-2 kubenswrapper[4762]: I1014 13:57:24.617711 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49727a24-f6dc-4f08-a192-a126571e397d-catalog-content\") on node \"master-2\" DevicePath \"\"" Oct 14 13:57:24.774317 master-2 kubenswrapper[4762]: I1014 13:57:24.774245 4762 generic.go:334] "Generic (PLEG): container finished" podID="49727a24-f6dc-4f08-a192-a126571e397d" containerID="fcaad291def7a504c2b0db40cac3e9036bf2c854e5facc21f571a53506a0bc5c" exitCode=0 Oct 14 13:57:24.774317 master-2 kubenswrapper[4762]: I1014 13:57:24.774296 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qljc7" event={"ID":"49727a24-f6dc-4f08-a192-a126571e397d","Type":"ContainerDied","Data":"fcaad291def7a504c2b0db40cac3e9036bf2c854e5facc21f571a53506a0bc5c"} Oct 14 13:57:24.774925 master-2 kubenswrapper[4762]: I1014 13:57:24.774374 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qljc7" event={"ID":"49727a24-f6dc-4f08-a192-a126571e397d","Type":"ContainerDied","Data":"18e96808953fd418859e477ff3f90f496b7b5c9cd9f23243ed390f9dc08de8de"} Oct 14 13:57:24.774925 master-2 kubenswrapper[4762]: I1014 13:57:24.774391 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qljc7" Oct 14 13:57:24.774925 master-2 kubenswrapper[4762]: I1014 13:57:24.774440 4762 scope.go:117] "RemoveContainer" containerID="fcaad291def7a504c2b0db40cac3e9036bf2c854e5facc21f571a53506a0bc5c" Oct 14 13:57:24.795120 master-2 kubenswrapper[4762]: I1014 13:57:24.795069 4762 scope.go:117] "RemoveContainer" containerID="45f102c753e3d3c8820c00084e959f3ff28c755e659a7b175b0a1044979f4e67" Oct 14 13:57:24.823804 master-2 kubenswrapper[4762]: I1014 13:57:24.823766 4762 scope.go:117] "RemoveContainer" containerID="d75dddb2f4d51a43654587394894e054663ce245145ee85cc72edaadeaa2e924" Oct 14 13:57:24.831964 master-2 kubenswrapper[4762]: I1014 13:57:24.831896 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qljc7"] Oct 14 13:57:24.837202 master-2 kubenswrapper[4762]: I1014 13:57:24.837103 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qljc7"] Oct 14 13:57:24.855999 master-2 kubenswrapper[4762]: I1014 13:57:24.855947 4762 scope.go:117] "RemoveContainer" containerID="fcaad291def7a504c2b0db40cac3e9036bf2c854e5facc21f571a53506a0bc5c" Oct 14 13:57:24.856555 master-2 kubenswrapper[4762]: E1014 13:57:24.856486 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcaad291def7a504c2b0db40cac3e9036bf2c854e5facc21f571a53506a0bc5c\": container with ID starting with fcaad291def7a504c2b0db40cac3e9036bf2c854e5facc21f571a53506a0bc5c not found: ID does not exist" containerID="fcaad291def7a504c2b0db40cac3e9036bf2c854e5facc21f571a53506a0bc5c" Oct 14 13:57:24.856555 master-2 kubenswrapper[4762]: I1014 13:57:24.856543 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcaad291def7a504c2b0db40cac3e9036bf2c854e5facc21f571a53506a0bc5c"} err="failed to get container status \"fcaad291def7a504c2b0db40cac3e9036bf2c854e5facc21f571a53506a0bc5c\": rpc error: code = NotFound desc = could not find container \"fcaad291def7a504c2b0db40cac3e9036bf2c854e5facc21f571a53506a0bc5c\": container with ID starting with fcaad291def7a504c2b0db40cac3e9036bf2c854e5facc21f571a53506a0bc5c not found: ID does not exist" Oct 14 13:57:24.856814 master-2 kubenswrapper[4762]: I1014 13:57:24.856571 4762 scope.go:117] "RemoveContainer" containerID="45f102c753e3d3c8820c00084e959f3ff28c755e659a7b175b0a1044979f4e67" Oct 14 13:57:24.857023 master-2 kubenswrapper[4762]: E1014 13:57:24.856970 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45f102c753e3d3c8820c00084e959f3ff28c755e659a7b175b0a1044979f4e67\": container with ID starting with 45f102c753e3d3c8820c00084e959f3ff28c755e659a7b175b0a1044979f4e67 not found: ID does not exist" containerID="45f102c753e3d3c8820c00084e959f3ff28c755e659a7b175b0a1044979f4e67" Oct 14 13:57:24.857023 master-2 kubenswrapper[4762]: I1014 13:57:24.857007 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45f102c753e3d3c8820c00084e959f3ff28c755e659a7b175b0a1044979f4e67"} err="failed to get container status \"45f102c753e3d3c8820c00084e959f3ff28c755e659a7b175b0a1044979f4e67\": rpc error: code = NotFound desc = could not find container \"45f102c753e3d3c8820c00084e959f3ff28c755e659a7b175b0a1044979f4e67\": container with ID starting with 45f102c753e3d3c8820c00084e959f3ff28c755e659a7b175b0a1044979f4e67 not found: ID 
does not exist" Oct 14 13:57:24.857305 master-2 kubenswrapper[4762]: I1014 13:57:24.857034 4762 scope.go:117] "RemoveContainer" containerID="d75dddb2f4d51a43654587394894e054663ce245145ee85cc72edaadeaa2e924" Oct 14 13:57:24.857403 master-2 kubenswrapper[4762]: E1014 13:57:24.857307 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d75dddb2f4d51a43654587394894e054663ce245145ee85cc72edaadeaa2e924\": container with ID starting with d75dddb2f4d51a43654587394894e054663ce245145ee85cc72edaadeaa2e924 not found: ID does not exist" containerID="d75dddb2f4d51a43654587394894e054663ce245145ee85cc72edaadeaa2e924" Oct 14 13:57:24.857403 master-2 kubenswrapper[4762]: I1014 13:57:24.857329 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d75dddb2f4d51a43654587394894e054663ce245145ee85cc72edaadeaa2e924"} err="failed to get container status \"d75dddb2f4d51a43654587394894e054663ce245145ee85cc72edaadeaa2e924\": rpc error: code = NotFound desc = could not find container \"d75dddb2f4d51a43654587394894e054663ce245145ee85cc72edaadeaa2e924\": container with ID starting with d75dddb2f4d51a43654587394894e054663ce245145ee85cc72edaadeaa2e924 not found: ID does not exist" Oct 14 13:57:25.559102 master-2 kubenswrapper[4762]: I1014 13:57:25.558947 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49727a24-f6dc-4f08-a192-a126571e397d" path="/var/lib/kubelet/pods/49727a24-f6dc-4f08-a192-a126571e397d/volumes" Oct 14 14:00:04.248222 master-2 kubenswrapper[4762]: I1014 14:00:04.248099 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29340795-t5kx5"] Oct 14 14:00:04.259344 master-2 kubenswrapper[4762]: I1014 14:00:04.259258 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29340795-t5kx5"] Oct 14 14:00:05.562822 master-2 kubenswrapper[4762]: I1014 14:00:05.562754 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29941561-0dd2-4fbc-a503-f42b2527e405" path="/var/lib/kubelet/pods/29941561-0dd2-4fbc-a503-f42b2527e405/volumes" Oct 14 14:00:25.354274 master-2 kubenswrapper[4762]: I1014 14:00:25.354142 4762 scope.go:117] "RemoveContainer" containerID="fe41a9518ad517a60f5ed8aa178d9fa7552cfd0101e2f6a2321d58ab03ac48c4" Oct 14 14:04:13.160615 master-2 kubenswrapper[4762]: I1014 14:04:13.160410 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zqxwl/must-gather-hp6tz"] Oct 14 14:04:13.161770 master-2 kubenswrapper[4762]: E1014 14:04:13.161129 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49727a24-f6dc-4f08-a192-a126571e397d" containerName="extract-utilities" Oct 14 14:04:13.161770 master-2 kubenswrapper[4762]: I1014 14:04:13.161182 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="49727a24-f6dc-4f08-a192-a126571e397d" containerName="extract-utilities" Oct 14 14:04:13.161770 master-2 kubenswrapper[4762]: E1014 14:04:13.161234 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49727a24-f6dc-4f08-a192-a126571e397d" containerName="registry-server" Oct 14 14:04:13.161770 master-2 kubenswrapper[4762]: I1014 14:04:13.161252 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="49727a24-f6dc-4f08-a192-a126571e397d" containerName="registry-server" Oct 14 14:04:13.161770 master-2 kubenswrapper[4762]: E1014 14:04:13.161287 4762 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="49727a24-f6dc-4f08-a192-a126571e397d" containerName="extract-content" Oct 14 14:04:13.161770 master-2 kubenswrapper[4762]: I1014 14:04:13.161305 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="49727a24-f6dc-4f08-a192-a126571e397d" containerName="extract-content" Oct 14 14:04:13.161770 master-2 kubenswrapper[4762]: I1014 14:04:13.161678 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="49727a24-f6dc-4f08-a192-a126571e397d" containerName="registry-server" Oct 14 14:04:13.164038 master-2 kubenswrapper[4762]: I1014 14:04:13.163871 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zqxwl/must-gather-hp6tz" Oct 14 14:04:13.171971 master-2 kubenswrapper[4762]: I1014 14:04:13.167862 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-zqxwl"/"kube-root-ca.crt" Oct 14 14:04:13.171971 master-2 kubenswrapper[4762]: I1014 14:04:13.167892 4762 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-zqxwl"/"openshift-service-ca.crt" Oct 14 14:04:13.193624 master-2 kubenswrapper[4762]: I1014 14:04:13.193533 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zqxwl/must-gather-hp6tz"] Oct 14 14:04:13.201619 master-2 kubenswrapper[4762]: I1014 14:04:13.201530 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvtvm\" (UniqueName: \"kubernetes.io/projected/4ecac67b-ee20-4532-9f9e-6aca39c7ed9c-kube-api-access-kvtvm\") pod \"must-gather-hp6tz\" (UID: \"4ecac67b-ee20-4532-9f9e-6aca39c7ed9c\") " pod="openshift-must-gather-zqxwl/must-gather-hp6tz" Oct 14 14:04:13.201803 master-2 kubenswrapper[4762]: I1014 14:04:13.201618 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4ecac67b-ee20-4532-9f9e-6aca39c7ed9c-must-gather-output\") pod \"must-gather-hp6tz\" (UID: \"4ecac67b-ee20-4532-9f9e-6aca39c7ed9c\") " pod="openshift-must-gather-zqxwl/must-gather-hp6tz" Oct 14 14:04:13.303310 master-2 kubenswrapper[4762]: I1014 14:04:13.303224 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvtvm\" (UniqueName: \"kubernetes.io/projected/4ecac67b-ee20-4532-9f9e-6aca39c7ed9c-kube-api-access-kvtvm\") pod \"must-gather-hp6tz\" (UID: \"4ecac67b-ee20-4532-9f9e-6aca39c7ed9c\") " pod="openshift-must-gather-zqxwl/must-gather-hp6tz" Oct 14 14:04:13.303310 master-2 kubenswrapper[4762]: I1014 14:04:13.303285 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4ecac67b-ee20-4532-9f9e-6aca39c7ed9c-must-gather-output\") pod \"must-gather-hp6tz\" (UID: \"4ecac67b-ee20-4532-9f9e-6aca39c7ed9c\") " pod="openshift-must-gather-zqxwl/must-gather-hp6tz" Oct 14 14:04:13.303976 master-2 kubenswrapper[4762]: I1014 14:04:13.303933 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4ecac67b-ee20-4532-9f9e-6aca39c7ed9c-must-gather-output\") pod \"must-gather-hp6tz\" (UID: \"4ecac67b-ee20-4532-9f9e-6aca39c7ed9c\") " pod="openshift-must-gather-zqxwl/must-gather-hp6tz" Oct 14 14:04:13.330471 master-2 kubenswrapper[4762]: I1014 14:04:13.330409 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-kvtvm\" (UniqueName: \"kubernetes.io/projected/4ecac67b-ee20-4532-9f9e-6aca39c7ed9c-kube-api-access-kvtvm\") pod \"must-gather-hp6tz\" (UID: \"4ecac67b-ee20-4532-9f9e-6aca39c7ed9c\") " pod="openshift-must-gather-zqxwl/must-gather-hp6tz" Oct 14 14:04:13.490901 master-2 kubenswrapper[4762]: I1014 14:04:13.490847 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zqxwl/must-gather-hp6tz" Oct 14 14:04:13.984225 master-2 kubenswrapper[4762]: I1014 14:04:13.984181 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zqxwl/must-gather-hp6tz"] Oct 14 14:04:14.000111 master-2 kubenswrapper[4762]: W1014 14:04:14.000055 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ecac67b_ee20_4532_9f9e_6aca39c7ed9c.slice/crio-5beb0cb390bc27827b247a4c684d2d9c062c1ebf8217f70c8fdf4cd7e8804cbe WatchSource:0}: Error finding container 5beb0cb390bc27827b247a4c684d2d9c062c1ebf8217f70c8fdf4cd7e8804cbe: Status 404 returned error can't find the container with id 5beb0cb390bc27827b247a4c684d2d9c062c1ebf8217f70c8fdf4cd7e8804cbe Oct 14 14:04:14.002990 master-2 kubenswrapper[4762]: I1014 14:04:14.002736 4762 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 14 14:04:14.879588 master-2 kubenswrapper[4762]: I1014 14:04:14.879495 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zqxwl/must-gather-hp6tz" event={"ID":"4ecac67b-ee20-4532-9f9e-6aca39c7ed9c","Type":"ContainerStarted","Data":"5beb0cb390bc27827b247a4c684d2d9c062c1ebf8217f70c8fdf4cd7e8804cbe"} Oct 14 14:04:18.226286 master-2 kubenswrapper[4762]: I1014 14:04:18.226206 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-version_cluster-version-operator-55bd67947c-872k9_d0bf2b14-2719-4b1b-a661-fbf4d27c05dc/cluster-version-operator/0.log" Oct 14 14:04:19.957190 master-2 kubenswrapper[4762]: I1014 14:04:19.944308 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zqxwl/must-gather-hp6tz" event={"ID":"4ecac67b-ee20-4532-9f9e-6aca39c7ed9c","Type":"ContainerStarted","Data":"3e279d054e48ca606c7273f2cedfc29769ee6578acee00e4b54702f4fef024a4"} Oct 14 14:04:19.957190 master-2 kubenswrapper[4762]: I1014 14:04:19.944375 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zqxwl/must-gather-hp6tz" event={"ID":"4ecac67b-ee20-4532-9f9e-6aca39c7ed9c","Type":"ContainerStarted","Data":"8b65b4b767e4e4f0012c6c38e5686d6fc56c5ef3037b46bec5f6a291522e643e"} Oct 14 14:04:19.984185 master-2 kubenswrapper[4762]: I1014 14:04:19.984068 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-zqxwl/must-gather-hp6tz" podStartSLOduration=2.031880784 podStartE2EDuration="6.984046599s" podCreationTimestamp="2025-10-14 14:04:13 +0000 UTC" firstStartedPulling="2025-10-14 14:04:14.00270473 +0000 UTC m=+3483.246863889" lastFinishedPulling="2025-10-14 14:04:18.954870535 +0000 UTC m=+3488.199029704" observedRunningTime="2025-10-14 14:04:19.979749872 +0000 UTC m=+3489.223909031" watchObservedRunningTime="2025-10-14 14:04:19.984046599 +0000 UTC m=+3489.228205758" Oct 14 14:04:20.794342 master-2 kubenswrapper[4762]: I1014 14:04:20.794267 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-g87gn_b4c2af67-dbd9-4cd1-8214-2579c1851f1e/nmstate-handler/0.log" Oct 14 14:04:20.934179 
master-2 kubenswrapper[4762]: I1014 14:04:20.934085 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2pxml_3f514207-4fde-4312-bc50-75fda0edcdfe/controller/0.log" Oct 14 14:04:22.039938 master-2 kubenswrapper[4762]: I1014 14:04:22.039888 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2pxml_3f514207-4fde-4312-bc50-75fda0edcdfe/frr/0.log" Oct 14 14:04:22.053741 master-2 kubenswrapper[4762]: I1014 14:04:22.053576 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2pxml_3f514207-4fde-4312-bc50-75fda0edcdfe/reloader/0.log" Oct 14 14:04:22.073506 master-2 kubenswrapper[4762]: I1014 14:04:22.073455 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2pxml_3f514207-4fde-4312-bc50-75fda0edcdfe/frr-metrics/0.log" Oct 14 14:04:22.089088 master-2 kubenswrapper[4762]: I1014 14:04:22.089034 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2pxml_3f514207-4fde-4312-bc50-75fda0edcdfe/kube-rbac-proxy/0.log" Oct 14 14:04:22.106074 master-2 kubenswrapper[4762]: I1014 14:04:22.106027 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2pxml_3f514207-4fde-4312-bc50-75fda0edcdfe/kube-rbac-proxy-frr/0.log" Oct 14 14:04:22.119814 master-2 kubenswrapper[4762]: I1014 14:04:22.119767 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2pxml_3f514207-4fde-4312-bc50-75fda0edcdfe/cp-frr-files/0.log" Oct 14 14:04:22.133704 master-2 kubenswrapper[4762]: I1014 14:04:22.133619 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2pxml_3f514207-4fde-4312-bc50-75fda0edcdfe/cp-reloader/0.log" Oct 14 14:04:22.152865 master-2 kubenswrapper[4762]: I1014 14:04:22.152801 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2pxml_3f514207-4fde-4312-bc50-75fda0edcdfe/cp-metrics/0.log" Oct 14 14:04:23.183725 master-2 kubenswrapper[4762]: I1014 14:04:23.183587 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-guard-master-2_1b6a1dbe-f753-4c92-8b36-47517010f2f3/guard/0.log" Oct 14 14:04:24.193229 master-2 kubenswrapper[4762]: I1014 14:04:24.193178 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-65687bc9c8-twgxt_6a9723af-f63a-45cd-9456-b4d67c4d778a/oauth-openshift/0.log" Oct 14 14:04:24.298821 master-2 kubenswrapper[4762]: I1014 14:04:24.298772 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-2_cd7826f9db5842f000a071fd58a1ae79/etcdctl/0.log" Oct 14 14:04:24.561505 master-2 kubenswrapper[4762]: I1014 14:04:24.561411 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-2_cd7826f9db5842f000a071fd58a1ae79/etcd/0.log" Oct 14 14:04:24.587662 master-2 kubenswrapper[4762]: I1014 14:04:24.587596 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-2_cd7826f9db5842f000a071fd58a1ae79/etcd-metrics/0.log" Oct 14 14:04:24.626008 master-2 kubenswrapper[4762]: I1014 14:04:24.625964 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-2_cd7826f9db5842f000a071fd58a1ae79/etcd-readyz/0.log" Oct 14 14:04:24.658076 master-2 kubenswrapper[4762]: I1014 14:04:24.658032 4762 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-etcd_etcd-master-2_cd7826f9db5842f000a071fd58a1ae79/etcd-rev/0.log" Oct 14 14:04:24.691635 master-2 kubenswrapper[4762]: I1014 14:04:24.691276 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-2_cd7826f9db5842f000a071fd58a1ae79/setup/0.log" Oct 14 14:04:24.720216 master-2 kubenswrapper[4762]: I1014 14:04:24.720171 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-2_cd7826f9db5842f000a071fd58a1ae79/etcd-ensure-env-vars/0.log" Oct 14 14:04:24.748878 master-2 kubenswrapper[4762]: I1014 14:04:24.748833 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-2_cd7826f9db5842f000a071fd58a1ae79/etcd-resources-copy/0.log" Oct 14 14:04:24.968628 master-2 kubenswrapper[4762]: I1014 14:04:24.968552 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-10-master-2_e8558a2f-5ea7-42a3-b00d-2ffbb553f642/installer/0.log" Oct 14 14:04:25.121246 master-2 kubenswrapper[4762]: I1014 14:04:25.121201 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_revision-pruner-10-master-2_a0b3b3d9-4cd5-4cf5-93b6-9480f7636efe/pruner/0.log" Oct 14 14:04:25.396447 master-2 kubenswrapper[4762]: I1014 14:04:25.394665 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zqxwl/perf-node-gather-daemonset-qpb7r"] Oct 14 14:04:25.396447 master-2 kubenswrapper[4762]: I1014 14:04:25.396020 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zqxwl/perf-node-gather-daemonset-qpb7r" Oct 14 14:04:25.404239 master-2 kubenswrapper[4762]: I1014 14:04:25.402673 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zqxwl/perf-node-gather-daemonset-qpb7r"] Oct 14 14:04:25.467090 master-2 kubenswrapper[4762]: I1014 14:04:25.467041 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5df313d0-9a53-4e2c-8418-9619ff513ddd-lib-modules\") pod \"perf-node-gather-daemonset-qpb7r\" (UID: \"5df313d0-9a53-4e2c-8418-9619ff513ddd\") " pod="openshift-must-gather-zqxwl/perf-node-gather-daemonset-qpb7r" Oct 14 14:04:25.467090 master-2 kubenswrapper[4762]: I1014 14:04:25.467097 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5df313d0-9a53-4e2c-8418-9619ff513ddd-podres\") pod \"perf-node-gather-daemonset-qpb7r\" (UID: \"5df313d0-9a53-4e2c-8418-9619ff513ddd\") " pod="openshift-must-gather-zqxwl/perf-node-gather-daemonset-qpb7r" Oct 14 14:04:25.467384 master-2 kubenswrapper[4762]: I1014 14:04:25.467162 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5df313d0-9a53-4e2c-8418-9619ff513ddd-sys\") pod \"perf-node-gather-daemonset-qpb7r\" (UID: \"5df313d0-9a53-4e2c-8418-9619ff513ddd\") " pod="openshift-must-gather-zqxwl/perf-node-gather-daemonset-qpb7r" Oct 14 14:04:25.467384 master-2 kubenswrapper[4762]: I1014 14:04:25.467228 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcrgt\" (UniqueName: \"kubernetes.io/projected/5df313d0-9a53-4e2c-8418-9619ff513ddd-kube-api-access-jcrgt\") pod \"perf-node-gather-daemonset-qpb7r\" (UID: \"5df313d0-9a53-4e2c-8418-9619ff513ddd\") " 
pod="openshift-must-gather-zqxwl/perf-node-gather-daemonset-qpb7r" Oct 14 14:04:25.467384 master-2 kubenswrapper[4762]: I1014 14:04:25.467265 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5df313d0-9a53-4e2c-8418-9619ff513ddd-proc\") pod \"perf-node-gather-daemonset-qpb7r\" (UID: \"5df313d0-9a53-4e2c-8418-9619ff513ddd\") " pod="openshift-must-gather-zqxwl/perf-node-gather-daemonset-qpb7r" Oct 14 14:04:25.568451 master-2 kubenswrapper[4762]: I1014 14:04:25.568390 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5df313d0-9a53-4e2c-8418-9619ff513ddd-sys\") pod \"perf-node-gather-daemonset-qpb7r\" (UID: \"5df313d0-9a53-4e2c-8418-9619ff513ddd\") " pod="openshift-must-gather-zqxwl/perf-node-gather-daemonset-qpb7r" Oct 14 14:04:25.568685 master-2 kubenswrapper[4762]: I1014 14:04:25.568472 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcrgt\" (UniqueName: \"kubernetes.io/projected/5df313d0-9a53-4e2c-8418-9619ff513ddd-kube-api-access-jcrgt\") pod \"perf-node-gather-daemonset-qpb7r\" (UID: \"5df313d0-9a53-4e2c-8418-9619ff513ddd\") " pod="openshift-must-gather-zqxwl/perf-node-gather-daemonset-qpb7r" Oct 14 14:04:25.568685 master-2 kubenswrapper[4762]: I1014 14:04:25.568510 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5df313d0-9a53-4e2c-8418-9619ff513ddd-proc\") pod \"perf-node-gather-daemonset-qpb7r\" (UID: \"5df313d0-9a53-4e2c-8418-9619ff513ddd\") " pod="openshift-must-gather-zqxwl/perf-node-gather-daemonset-qpb7r" Oct 14 14:04:25.568685 master-2 kubenswrapper[4762]: I1014 14:04:25.568557 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5df313d0-9a53-4e2c-8418-9619ff513ddd-lib-modules\") pod \"perf-node-gather-daemonset-qpb7r\" (UID: \"5df313d0-9a53-4e2c-8418-9619ff513ddd\") " pod="openshift-must-gather-zqxwl/perf-node-gather-daemonset-qpb7r" Oct 14 14:04:25.568685 master-2 kubenswrapper[4762]: I1014 14:04:25.568585 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5df313d0-9a53-4e2c-8418-9619ff513ddd-podres\") pod \"perf-node-gather-daemonset-qpb7r\" (UID: \"5df313d0-9a53-4e2c-8418-9619ff513ddd\") " pod="openshift-must-gather-zqxwl/perf-node-gather-daemonset-qpb7r" Oct 14 14:04:25.568924 master-2 kubenswrapper[4762]: I1014 14:04:25.568746 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5df313d0-9a53-4e2c-8418-9619ff513ddd-podres\") pod \"perf-node-gather-daemonset-qpb7r\" (UID: \"5df313d0-9a53-4e2c-8418-9619ff513ddd\") " pod="openshift-must-gather-zqxwl/perf-node-gather-daemonset-qpb7r" Oct 14 14:04:25.568924 master-2 kubenswrapper[4762]: I1014 14:04:25.568786 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5df313d0-9a53-4e2c-8418-9619ff513ddd-sys\") pod \"perf-node-gather-daemonset-qpb7r\" (UID: \"5df313d0-9a53-4e2c-8418-9619ff513ddd\") " pod="openshift-must-gather-zqxwl/perf-node-gather-daemonset-qpb7r" Oct 14 14:04:25.569230 master-2 kubenswrapper[4762]: I1014 14:04:25.569047 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: 
\"kubernetes.io/host-path/5df313d0-9a53-4e2c-8418-9619ff513ddd-proc\") pod \"perf-node-gather-daemonset-qpb7r\" (UID: \"5df313d0-9a53-4e2c-8418-9619ff513ddd\") " pod="openshift-must-gather-zqxwl/perf-node-gather-daemonset-qpb7r" Oct 14 14:04:25.569230 master-2 kubenswrapper[4762]: I1014 14:04:25.569169 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5df313d0-9a53-4e2c-8418-9619ff513ddd-lib-modules\") pod \"perf-node-gather-daemonset-qpb7r\" (UID: \"5df313d0-9a53-4e2c-8418-9619ff513ddd\") " pod="openshift-must-gather-zqxwl/perf-node-gather-daemonset-qpb7r" Oct 14 14:04:25.588756 master-2 kubenswrapper[4762]: I1014 14:04:25.588711 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcrgt\" (UniqueName: \"kubernetes.io/projected/5df313d0-9a53-4e2c-8418-9619ff513ddd-kube-api-access-jcrgt\") pod \"perf-node-gather-daemonset-qpb7r\" (UID: \"5df313d0-9a53-4e2c-8418-9619ff513ddd\") " pod="openshift-must-gather-zqxwl/perf-node-gather-daemonset-qpb7r" Oct 14 14:04:25.717492 master-2 kubenswrapper[4762]: I1014 14:04:25.717419 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zqxwl/perf-node-gather-daemonset-qpb7r" Oct 14 14:04:26.156886 master-2 kubenswrapper[4762]: I1014 14:04:26.156839 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-kp26f_49013d18-7b86-4d86-ac2a-54a004e15932/speaker/0.log" Oct 14 14:04:26.354681 master-2 kubenswrapper[4762]: I1014 14:04:26.354376 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-kp26f_49013d18-7b86-4d86-ac2a-54a004e15932/kube-rbac-proxy/0.log" Oct 14 14:04:26.364550 master-2 kubenswrapper[4762]: W1014 14:04:26.364497 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5df313d0_9a53_4e2c_8418_9619ff513ddd.slice/crio-a9857e99b0efdbd674dbbb60d7f640e8f94bf1e6f347bbc604838bdcf477301b WatchSource:0}: Error finding container a9857e99b0efdbd674dbbb60d7f640e8f94bf1e6f347bbc604838bdcf477301b: Status 404 returned error can't find the container with id a9857e99b0efdbd674dbbb60d7f640e8f94bf1e6f347bbc604838bdcf477301b Oct 14 14:04:26.366613 master-2 kubenswrapper[4762]: I1014 14:04:26.366302 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zqxwl/perf-node-gather-daemonset-qpb7r"] Oct 14 14:04:26.622934 master-2 kubenswrapper[4762]: I1014 14:04:26.622592 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5ddb89f76-887cs_f82e0c58-e2a3-491a-bf03-ad47b38c5833/router/3.log" Oct 14 14:04:26.638738 master-2 kubenswrapper[4762]: I1014 14:04:26.638644 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5ddb89f76-887cs_f82e0c58-e2a3-491a-bf03-ad47b38c5833/router/2.log" Oct 14 14:04:27.004627 master-2 kubenswrapper[4762]: I1014 14:04:27.004549 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zqxwl/perf-node-gather-daemonset-qpb7r" event={"ID":"5df313d0-9a53-4e2c-8418-9619ff513ddd","Type":"ContainerStarted","Data":"a9857e99b0efdbd674dbbb60d7f640e8f94bf1e6f347bbc604838bdcf477301b"} Oct 14 14:04:27.187737 master-2 kubenswrapper[4762]: I1014 14:04:27.187659 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zqxwl/master-2-debug-rdrrq"] Oct 14 14:04:27.189219 master-2 kubenswrapper[4762]: I1014 14:04:27.189194 4762 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zqxwl/master-2-debug-rdrrq" Oct 14 14:04:27.304052 master-2 kubenswrapper[4762]: I1014 14:04:27.303871 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/65f0a829-149f-41ce-bb2a-f9e229882752-host\") pod \"master-2-debug-rdrrq\" (UID: \"65f0a829-149f-41ce-bb2a-f9e229882752\") " pod="openshift-must-gather-zqxwl/master-2-debug-rdrrq" Oct 14 14:04:27.304358 master-2 kubenswrapper[4762]: I1014 14:04:27.304077 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-728lm\" (UniqueName: \"kubernetes.io/projected/65f0a829-149f-41ce-bb2a-f9e229882752-kube-api-access-728lm\") pod \"master-2-debug-rdrrq\" (UID: \"65f0a829-149f-41ce-bb2a-f9e229882752\") " pod="openshift-must-gather-zqxwl/master-2-debug-rdrrq" Oct 14 14:04:27.406772 master-2 kubenswrapper[4762]: I1014 14:04:27.406695 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/65f0a829-149f-41ce-bb2a-f9e229882752-host\") pod \"master-2-debug-rdrrq\" (UID: \"65f0a829-149f-41ce-bb2a-f9e229882752\") " pod="openshift-must-gather-zqxwl/master-2-debug-rdrrq" Oct 14 14:04:27.407037 master-2 kubenswrapper[4762]: I1014 14:04:27.406787 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-728lm\" (UniqueName: \"kubernetes.io/projected/65f0a829-149f-41ce-bb2a-f9e229882752-kube-api-access-728lm\") pod \"master-2-debug-rdrrq\" (UID: \"65f0a829-149f-41ce-bb2a-f9e229882752\") " pod="openshift-must-gather-zqxwl/master-2-debug-rdrrq" Oct 14 14:04:27.407037 master-2 kubenswrapper[4762]: I1014 14:04:27.406866 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/65f0a829-149f-41ce-bb2a-f9e229882752-host\") pod \"master-2-debug-rdrrq\" (UID: \"65f0a829-149f-41ce-bb2a-f9e229882752\") " pod="openshift-must-gather-zqxwl/master-2-debug-rdrrq" Oct 14 14:04:27.427505 master-2 kubenswrapper[4762]: I1014 14:04:27.427442 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-728lm\" (UniqueName: \"kubernetes.io/projected/65f0a829-149f-41ce-bb2a-f9e229882752-kube-api-access-728lm\") pod \"master-2-debug-rdrrq\" (UID: \"65f0a829-149f-41ce-bb2a-f9e229882752\") " pod="openshift-must-gather-zqxwl/master-2-debug-rdrrq" Oct 14 14:04:27.514747 master-2 kubenswrapper[4762]: I1014 14:04:27.514689 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zqxwl/master-2-debug-rdrrq" Oct 14 14:04:27.538552 master-2 kubenswrapper[4762]: W1014 14:04:27.538499 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65f0a829_149f_41ce_bb2a_f9e229882752.slice/crio-25395f255f0273538f13b3d6c92434515f251601e573ccab9c3db7345b292b37 WatchSource:0}: Error finding container 25395f255f0273538f13b3d6c92434515f251601e573ccab9c3db7345b292b37: Status 404 returned error can't find the container with id 25395f255f0273538f13b3d6c92434515f251601e573ccab9c3db7345b292b37 Oct 14 14:04:27.869524 master-2 kubenswrapper[4762]: I1014 14:04:27.869456 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-oauth-apiserver_apiserver-7b6784d654-8vpmp_7f841a64-d2fd-44ed-b3e6-acdc127cacfc/oauth-apiserver/0.log" Oct 14 14:04:27.899005 master-2 kubenswrapper[4762]: I1014 14:04:27.898939 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-oauth-apiserver_apiserver-7b6784d654-8vpmp_7f841a64-d2fd-44ed-b3e6-acdc127cacfc/fix-audit-permissions/0.log" Oct 14 14:04:28.015222 master-2 kubenswrapper[4762]: I1014 14:04:28.015130 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zqxwl/master-2-debug-rdrrq" event={"ID":"65f0a829-149f-41ce-bb2a-f9e229882752","Type":"ContainerStarted","Data":"25395f255f0273538f13b3d6c92434515f251601e573ccab9c3db7345b292b37"} Oct 14 14:04:28.018500 master-2 kubenswrapper[4762]: I1014 14:04:28.018450 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zqxwl/perf-node-gather-daemonset-qpb7r" event={"ID":"5df313d0-9a53-4e2c-8418-9619ff513ddd","Type":"ContainerStarted","Data":"c24369ce63c8ff537ce08bbd80b922f51f831ca61a450dde506c71297a42235d"} Oct 14 14:04:28.019660 master-2 kubenswrapper[4762]: I1014 14:04:28.019616 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-must-gather-zqxwl/perf-node-gather-daemonset-qpb7r" Oct 14 14:04:28.046949 master-2 kubenswrapper[4762]: I1014 14:04:28.046859 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-zqxwl/perf-node-gather-daemonset-qpb7r" podStartSLOduration=3.046835764 podStartE2EDuration="3.046835764s" podCreationTimestamp="2025-10-14 14:04:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-14 14:04:28.042111933 +0000 UTC m=+3497.286271092" watchObservedRunningTime="2025-10-14 14:04:28.046835764 +0000 UTC m=+3497.290994913" Oct 14 14:04:30.037203 master-2 kubenswrapper[4762]: I1014 14:04:30.033099 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-779749f859-bscv5_18346e46-a062-4e0d-b90a-c05646a46c7e/cluster-cloud-controller-manager/0.log" Oct 14 14:04:30.061958 master-2 kubenswrapper[4762]: I1014 14:04:30.061851 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-779749f859-bscv5_18346e46-a062-4e0d-b90a-c05646a46c7e/config-sync-controllers/0.log" Oct 14 14:04:30.063229 master-2 kubenswrapper[4762]: I1014 14:04:30.063187 4762 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-779749f859-bscv5_18346e46-a062-4e0d-b90a-c05646a46c7e/config-sync-controllers/1.log" Oct 14 14:04:30.082541 master-2 kubenswrapper[4762]: I1014 14:04:30.082483 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-779749f859-bscv5_18346e46-a062-4e0d-b90a-c05646a46c7e/kube-rbac-proxy/4.log" Oct 14 14:04:30.086066 master-2 kubenswrapper[4762]: I1014 14:04:30.086033 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-779749f859-bscv5_18346e46-a062-4e0d-b90a-c05646a46c7e/kube-rbac-proxy/5.log" Oct 14 14:04:35.760470 master-2 kubenswrapper[4762]: I1014 14:04:35.760300 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-zqxwl/perf-node-gather-daemonset-qpb7r" Oct 14 14:04:36.145225 master-2 kubenswrapper[4762]: I1014 14:04:36.144980 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-6768b5f5f9-6l8p6_7d29d094-ce27-46ec-a556-0129526c1103/console-operator/0.log" Oct 14 14:04:37.112176 master-2 kubenswrapper[4762]: I1014 14:04:37.112059 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zqxwl/master-2-debug-rdrrq" event={"ID":"65f0a829-149f-41ce-bb2a-f9e229882752","Type":"ContainerStarted","Data":"f3b74175c94434a005bf0d94c29f3a8b5851caf1851af3ae1ca4a48a8e015cbb"} Oct 14 14:04:37.692754 master-2 kubenswrapper[4762]: I1014 14:04:37.692650 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-zqxwl/master-2-debug-rdrrq" podStartSLOduration=1.573070173 podStartE2EDuration="10.692630282s" podCreationTimestamp="2025-10-14 14:04:27 +0000 UTC" firstStartedPulling="2025-10-14 14:04:27.540284426 +0000 UTC m=+3496.784443585" lastFinishedPulling="2025-10-14 14:04:36.659844535 +0000 UTC m=+3505.904003694" observedRunningTime="2025-10-14 14:04:37.691384463 +0000 UTC m=+3506.935543622" watchObservedRunningTime="2025-10-14 14:04:37.692630282 +0000 UTC m=+3506.936789441" Oct 14 14:04:38.518903 master-2 kubenswrapper[4762]: I1014 14:04:38.518831 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-65bb9777fc-sd822_09d92233-a8b3-458a-8c27-f62e982a9d90/download-server/0.log" Oct 14 14:04:39.590769 master-2 kubenswrapper[4762]: I1014 14:04:39.590700 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-ddd7d64cd-hph6v_0500c75a-3460-4279-a8d8-cebf242e6089/snapshot-controller/1.log" Oct 14 14:04:39.591878 master-2 kubenswrapper[4762]: I1014 14:04:39.591833 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-ddd7d64cd-hph6v_0500c75a-3460-4279-a8d8-cebf242e6089/snapshot-controller/0.log" Oct 14 14:04:41.161710 master-2 kubenswrapper[4762]: I1014 14:04:41.161665 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-pbtld_1a953a2b-9bc4-485a-9daf-f6e9b84d493a/dns/0.log" Oct 14 14:04:41.189882 master-2 kubenswrapper[4762]: I1014 14:04:41.189820 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-pbtld_1a953a2b-9bc4-485a-9daf-f6e9b84d493a/kube-rbac-proxy/0.log" Oct 14 14:04:41.327913 master-2 
kubenswrapper[4762]: I1014 14:04:41.327872 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-6rrjr_d4509bb5-afbe-43d9-bfe9-32cd7f55257d/dns-node-resolver/0.log" Oct 14 14:04:42.910588 master-2 kubenswrapper[4762]: I1014 14:04:42.910531 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-guard-master-2_1b6a1dbe-f753-4c92-8b36-47517010f2f3/guard/0.log" Oct 14 14:04:43.861770 master-2 kubenswrapper[4762]: I1014 14:04:43.861691 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-2_cd7826f9db5842f000a071fd58a1ae79/etcdctl/0.log" Oct 14 14:04:44.079256 master-2 kubenswrapper[4762]: I1014 14:04:44.079065 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-2_cd7826f9db5842f000a071fd58a1ae79/etcd/0.log" Oct 14 14:04:44.101928 master-2 kubenswrapper[4762]: I1014 14:04:44.101882 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-2_cd7826f9db5842f000a071fd58a1ae79/etcd-metrics/0.log" Oct 14 14:04:44.128582 master-2 kubenswrapper[4762]: I1014 14:04:44.128459 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-2_cd7826f9db5842f000a071fd58a1ae79/etcd-readyz/0.log" Oct 14 14:04:44.174454 master-2 kubenswrapper[4762]: I1014 14:04:44.174398 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-2_cd7826f9db5842f000a071fd58a1ae79/etcd-rev/0.log" Oct 14 14:04:44.215208 master-2 kubenswrapper[4762]: I1014 14:04:44.215104 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-2_cd7826f9db5842f000a071fd58a1ae79/setup/0.log" Oct 14 14:04:44.248968 master-2 kubenswrapper[4762]: I1014 14:04:44.248904 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-2_cd7826f9db5842f000a071fd58a1ae79/etcd-ensure-env-vars/0.log" Oct 14 14:04:44.285001 master-2 kubenswrapper[4762]: I1014 14:04:44.284945 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-2_cd7826f9db5842f000a071fd58a1ae79/etcd-resources-copy/0.log" Oct 14 14:04:44.442549 master-2 kubenswrapper[4762]: I1014 14:04:44.442401 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-10-master-2_e8558a2f-5ea7-42a3-b00d-2ffbb553f642/installer/0.log" Oct 14 14:04:44.595248 master-2 kubenswrapper[4762]: I1014 14:04:44.595175 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_revision-pruner-10-master-2_a0b3b3d9-4cd5-4cf5-93b6-9480f7636efe/pruner/0.log" Oct 14 14:04:44.835757 master-2 kubenswrapper[4762]: I1014 14:04:44.835603 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hnrf8"] Oct 14 14:04:44.843089 master-2 kubenswrapper[4762]: I1014 14:04:44.843040 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hnrf8" Oct 14 14:04:44.852686 master-2 kubenswrapper[4762]: I1014 14:04:44.852615 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hnrf8"] Oct 14 14:04:44.983691 master-2 kubenswrapper[4762]: I1014 14:04:44.983633 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dd6b2c7-be85-4ba1-a907-6cfdabb77856-catalog-content\") pod \"community-operators-hnrf8\" (UID: \"6dd6b2c7-be85-4ba1-a907-6cfdabb77856\") " pod="openshift-marketplace/community-operators-hnrf8" Oct 14 14:04:44.983982 master-2 kubenswrapper[4762]: I1014 14:04:44.983748 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwv6j\" (UniqueName: \"kubernetes.io/projected/6dd6b2c7-be85-4ba1-a907-6cfdabb77856-kube-api-access-gwv6j\") pod \"community-operators-hnrf8\" (UID: \"6dd6b2c7-be85-4ba1-a907-6cfdabb77856\") " pod="openshift-marketplace/community-operators-hnrf8" Oct 14 14:04:44.983982 master-2 kubenswrapper[4762]: I1014 14:04:44.983802 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dd6b2c7-be85-4ba1-a907-6cfdabb77856-utilities\") pod \"community-operators-hnrf8\" (UID: \"6dd6b2c7-be85-4ba1-a907-6cfdabb77856\") " pod="openshift-marketplace/community-operators-hnrf8" Oct 14 14:04:45.086646 master-2 kubenswrapper[4762]: I1014 14:04:45.086496 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dd6b2c7-be85-4ba1-a907-6cfdabb77856-catalog-content\") pod \"community-operators-hnrf8\" (UID: \"6dd6b2c7-be85-4ba1-a907-6cfdabb77856\") " pod="openshift-marketplace/community-operators-hnrf8" Oct 14 14:04:45.086646 master-2 kubenswrapper[4762]: I1014 14:04:45.086638 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwv6j\" (UniqueName: \"kubernetes.io/projected/6dd6b2c7-be85-4ba1-a907-6cfdabb77856-kube-api-access-gwv6j\") pod \"community-operators-hnrf8\" (UID: \"6dd6b2c7-be85-4ba1-a907-6cfdabb77856\") " pod="openshift-marketplace/community-operators-hnrf8" Oct 14 14:04:45.087305 master-2 kubenswrapper[4762]: I1014 14:04:45.086677 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dd6b2c7-be85-4ba1-a907-6cfdabb77856-utilities\") pod \"community-operators-hnrf8\" (UID: \"6dd6b2c7-be85-4ba1-a907-6cfdabb77856\") " pod="openshift-marketplace/community-operators-hnrf8" Oct 14 14:04:45.087866 master-2 kubenswrapper[4762]: I1014 14:04:45.087839 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dd6b2c7-be85-4ba1-a907-6cfdabb77856-utilities\") pod \"community-operators-hnrf8\" (UID: \"6dd6b2c7-be85-4ba1-a907-6cfdabb77856\") " pod="openshift-marketplace/community-operators-hnrf8" Oct 14 14:04:45.088146 master-2 kubenswrapper[4762]: I1014 14:04:45.088121 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dd6b2c7-be85-4ba1-a907-6cfdabb77856-catalog-content\") pod \"community-operators-hnrf8\" (UID: \"6dd6b2c7-be85-4ba1-a907-6cfdabb77856\") " 
pod="openshift-marketplace/community-operators-hnrf8" Oct 14 14:04:45.108235 master-2 kubenswrapper[4762]: I1014 14:04:45.108140 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwv6j\" (UniqueName: \"kubernetes.io/projected/6dd6b2c7-be85-4ba1-a907-6cfdabb77856-kube-api-access-gwv6j\") pod \"community-operators-hnrf8\" (UID: \"6dd6b2c7-be85-4ba1-a907-6cfdabb77856\") " pod="openshift-marketplace/community-operators-hnrf8" Oct 14 14:04:45.345901 master-2 kubenswrapper[4762]: I1014 14:04:45.345770 4762 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hnrf8" Oct 14 14:04:45.950611 master-2 kubenswrapper[4762]: I1014 14:04:45.950578 4762 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hnrf8"] Oct 14 14:04:45.956248 master-2 kubenswrapper[4762]: W1014 14:04:45.956175 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6dd6b2c7_be85_4ba1_a907_6cfdabb77856.slice/crio-b335da92a74e3569da6163e411bf230a4e025830e0f7681fe458ec4e681c4677 WatchSource:0}: Error finding container b335da92a74e3569da6163e411bf230a4e025830e0f7681fe458ec4e681c4677: Status 404 returned error can't find the container with id b335da92a74e3569da6163e411bf230a4e025830e0f7681fe458ec4e681c4677 Oct 14 14:04:46.200949 master-2 kubenswrapper[4762]: I1014 14:04:46.200818 4762 generic.go:334] "Generic (PLEG): container finished" podID="6dd6b2c7-be85-4ba1-a907-6cfdabb77856" containerID="4ba24b879e6c64032f6475012d6fb2a3f2115353e8f5fd3ef4232a48deb40c65" exitCode=0 Oct 14 14:04:46.200949 master-2 kubenswrapper[4762]: I1014 14:04:46.200871 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hnrf8" event={"ID":"6dd6b2c7-be85-4ba1-a907-6cfdabb77856","Type":"ContainerDied","Data":"4ba24b879e6c64032f6475012d6fb2a3f2115353e8f5fd3ef4232a48deb40c65"} Oct 14 14:04:46.200949 master-2 kubenswrapper[4762]: I1014 14:04:46.200902 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hnrf8" event={"ID":"6dd6b2c7-be85-4ba1-a907-6cfdabb77856","Type":"ContainerStarted","Data":"b335da92a74e3569da6163e411bf230a4e025830e0f7681fe458ec4e681c4677"} Oct 14 14:04:46.403943 master-2 kubenswrapper[4762]: I1014 14:04:46.403908 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-8fg56_b14dba7b-829d-48e2-a0bb-9eef2303a088/node-ca/0.log" Oct 14 14:04:47.215114 master-2 kubenswrapper[4762]: I1014 14:04:47.214971 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hnrf8" event={"ID":"6dd6b2c7-be85-4ba1-a907-6cfdabb77856","Type":"ContainerStarted","Data":"172321c0ac50bdceab2ec6c77595d5adc44834c2c1098b07c01ddb3b20cd6dbd"} Oct 14 14:04:47.217297 master-2 kubenswrapper[4762]: I1014 14:04:47.217259 4762 generic.go:334] "Generic (PLEG): container finished" podID="65f0a829-149f-41ce-bb2a-f9e229882752" containerID="f3b74175c94434a005bf0d94c29f3a8b5851caf1851af3ae1ca4a48a8e015cbb" exitCode=0 Oct 14 14:04:47.217421 master-2 kubenswrapper[4762]: I1014 14:04:47.217360 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zqxwl/master-2-debug-rdrrq" event={"ID":"65f0a829-149f-41ce-bb2a-f9e229882752","Type":"ContainerDied","Data":"f3b74175c94434a005bf0d94c29f3a8b5851caf1851af3ae1ca4a48a8e015cbb"} Oct 14 14:04:48.041931 master-2 
kubenswrapper[4762]: I1014 14:04:48.041880 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-2c8tn_bacf3bb6-fd05-4a71-943c-522e7e8ce76e/serve-healthcheck-canary/0.log" Oct 14 14:04:48.226841 master-2 kubenswrapper[4762]: I1014 14:04:48.226774 4762 generic.go:334] "Generic (PLEG): container finished" podID="6dd6b2c7-be85-4ba1-a907-6cfdabb77856" containerID="172321c0ac50bdceab2ec6c77595d5adc44834c2c1098b07c01ddb3b20cd6dbd" exitCode=0 Oct 14 14:04:48.228414 master-2 kubenswrapper[4762]: I1014 14:04:48.228364 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hnrf8" event={"ID":"6dd6b2c7-be85-4ba1-a907-6cfdabb77856","Type":"ContainerDied","Data":"172321c0ac50bdceab2ec6c77595d5adc44834c2c1098b07c01ddb3b20cd6dbd"} Oct 14 14:04:48.312309 master-2 kubenswrapper[4762]: I1014 14:04:48.312259 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zqxwl/master-2-debug-rdrrq" Oct 14 14:04:48.459621 master-2 kubenswrapper[4762]: I1014 14:04:48.459570 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/65f0a829-149f-41ce-bb2a-f9e229882752-host\") pod \"65f0a829-149f-41ce-bb2a-f9e229882752\" (UID: \"65f0a829-149f-41ce-bb2a-f9e229882752\") " Oct 14 14:04:48.459882 master-2 kubenswrapper[4762]: I1014 14:04:48.459665 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-728lm\" (UniqueName: \"kubernetes.io/projected/65f0a829-149f-41ce-bb2a-f9e229882752-kube-api-access-728lm\") pod \"65f0a829-149f-41ce-bb2a-f9e229882752\" (UID: \"65f0a829-149f-41ce-bb2a-f9e229882752\") " Oct 14 14:04:48.459882 master-2 kubenswrapper[4762]: I1014 14:04:48.459820 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/65f0a829-149f-41ce-bb2a-f9e229882752-host" (OuterVolumeSpecName: "host") pod "65f0a829-149f-41ce-bb2a-f9e229882752" (UID: "65f0a829-149f-41ce-bb2a-f9e229882752"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 14:04:48.460140 master-2 kubenswrapper[4762]: I1014 14:04:48.460108 4762 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/65f0a829-149f-41ce-bb2a-f9e229882752-host\") on node \"master-2\" DevicePath \"\"" Oct 14 14:04:48.463440 master-2 kubenswrapper[4762]: I1014 14:04:48.463378 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65f0a829-149f-41ce-bb2a-f9e229882752-kube-api-access-728lm" (OuterVolumeSpecName: "kube-api-access-728lm") pod "65f0a829-149f-41ce-bb2a-f9e229882752" (UID: "65f0a829-149f-41ce-bb2a-f9e229882752"). InnerVolumeSpecName "kube-api-access-728lm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 14:04:48.562148 master-2 kubenswrapper[4762]: I1014 14:04:48.562096 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-728lm\" (UniqueName: \"kubernetes.io/projected/65f0a829-149f-41ce-bb2a-f9e229882752-kube-api-access-728lm\") on node \"master-2\" DevicePath \"\"" Oct 14 14:04:48.655527 master-2 kubenswrapper[4762]: I1014 14:04:48.655458 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-zqxwl/master-2-debug-rdrrq"] Oct 14 14:04:48.662493 master-2 kubenswrapper[4762]: I1014 14:04:48.662437 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-zqxwl/master-2-debug-rdrrq"] Oct 14 14:04:49.238425 master-2 kubenswrapper[4762]: I1014 14:04:49.238379 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25395f255f0273538f13b3d6c92434515f251601e573ccab9c3db7345b292b37" Oct 14 14:04:49.239493 master-2 kubenswrapper[4762]: I1014 14:04:49.238433 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zqxwl/master-2-debug-rdrrq" Oct 14 14:04:49.560920 master-2 kubenswrapper[4762]: I1014 14:04:49.560804 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65f0a829-149f-41ce-bb2a-f9e229882752" path="/var/lib/kubelet/pods/65f0a829-149f-41ce-bb2a-f9e229882752/volumes" Oct 14 14:04:50.248014 master-2 kubenswrapper[4762]: I1014 14:04:50.247954 4762 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zqxwl/master-2-debug-4pjkb"] Oct 14 14:04:50.248749 master-2 kubenswrapper[4762]: E1014 14:04:50.248387 4762 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65f0a829-149f-41ce-bb2a-f9e229882752" containerName="container-00" Oct 14 14:04:50.248749 master-2 kubenswrapper[4762]: I1014 14:04:50.248405 4762 state_mem.go:107] "Deleted CPUSet assignment" podUID="65f0a829-149f-41ce-bb2a-f9e229882752" containerName="container-00" Oct 14 14:04:50.248749 master-2 kubenswrapper[4762]: I1014 14:04:50.248715 4762 memory_manager.go:354] "RemoveStaleState removing state" podUID="65f0a829-149f-41ce-bb2a-f9e229882752" containerName="container-00" Oct 14 14:04:50.249584 master-2 kubenswrapper[4762]: I1014 14:04:50.249542 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hnrf8" event={"ID":"6dd6b2c7-be85-4ba1-a907-6cfdabb77856","Type":"ContainerStarted","Data":"9ba7c49d5ea2858f8f270f6ad1ef96bab2da9e21efbe13a0743b185d446ed635"} Oct 14 14:04:50.249699 master-2 kubenswrapper[4762]: I1014 14:04:50.249655 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zqxwl/master-2-debug-4pjkb" Oct 14 14:04:50.288786 master-2 kubenswrapper[4762]: I1014 14:04:50.288678 4762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hnrf8" podStartSLOduration=2.797056614 podStartE2EDuration="6.288649273s" podCreationTimestamp="2025-10-14 14:04:44 +0000 UTC" firstStartedPulling="2025-10-14 14:04:46.203236038 +0000 UTC m=+3515.447395197" lastFinishedPulling="2025-10-14 14:04:49.694828697 +0000 UTC m=+3518.938987856" observedRunningTime="2025-10-14 14:04:50.282097115 +0000 UTC m=+3519.526256294" watchObservedRunningTime="2025-10-14 14:04:50.288649273 +0000 UTC m=+3519.532808452" Oct 14 14:04:50.405516 master-2 kubenswrapper[4762]: I1014 14:04:50.405424 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d605067e-476b-448d-a650-597790282807-host\") pod \"master-2-debug-4pjkb\" (UID: \"d605067e-476b-448d-a650-597790282807\") " pod="openshift-must-gather-zqxwl/master-2-debug-4pjkb" Oct 14 14:04:50.406236 master-2 kubenswrapper[4762]: I1014 14:04:50.406172 4762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tp5gp\" (UniqueName: \"kubernetes.io/projected/d605067e-476b-448d-a650-597790282807-kube-api-access-tp5gp\") pod \"master-2-debug-4pjkb\" (UID: \"d605067e-476b-448d-a650-597790282807\") " pod="openshift-must-gather-zqxwl/master-2-debug-4pjkb" Oct 14 14:04:50.511988 master-2 kubenswrapper[4762]: I1014 14:04:50.508618 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d605067e-476b-448d-a650-597790282807-host\") pod \"master-2-debug-4pjkb\" (UID: \"d605067e-476b-448d-a650-597790282807\") " pod="openshift-must-gather-zqxwl/master-2-debug-4pjkb" Oct 14 14:04:50.511988 master-2 kubenswrapper[4762]: I1014 14:04:50.508806 4762 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tp5gp\" (UniqueName: \"kubernetes.io/projected/d605067e-476b-448d-a650-597790282807-kube-api-access-tp5gp\") pod \"master-2-debug-4pjkb\" (UID: \"d605067e-476b-448d-a650-597790282807\") " pod="openshift-must-gather-zqxwl/master-2-debug-4pjkb" Oct 14 14:04:50.511988 master-2 kubenswrapper[4762]: I1014 14:04:50.509330 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d605067e-476b-448d-a650-597790282807-host\") pod \"master-2-debug-4pjkb\" (UID: \"d605067e-476b-448d-a650-597790282807\") " pod="openshift-must-gather-zqxwl/master-2-debug-4pjkb" Oct 14 14:04:50.540267 master-2 kubenswrapper[4762]: I1014 14:04:50.536769 4762 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tp5gp\" (UniqueName: \"kubernetes.io/projected/d605067e-476b-448d-a650-597790282807-kube-api-access-tp5gp\") pod \"master-2-debug-4pjkb\" (UID: \"d605067e-476b-448d-a650-597790282807\") " pod="openshift-must-gather-zqxwl/master-2-debug-4pjkb" Oct 14 14:04:50.570514 master-2 kubenswrapper[4762]: I1014 14:04:50.569900 4762 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zqxwl/master-2-debug-4pjkb" Oct 14 14:04:50.606788 master-2 kubenswrapper[4762]: W1014 14:04:50.606733 4762 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd605067e_476b_448d_a650_597790282807.slice/crio-ae9a1016396f36f989447c2f62c7af1f9bb9340b492ed5db6878c44788ad32c5 WatchSource:0}: Error finding container ae9a1016396f36f989447c2f62c7af1f9bb9340b492ed5db6878c44788ad32c5: Status 404 returned error can't find the container with id ae9a1016396f36f989447c2f62c7af1f9bb9340b492ed5db6878c44788ad32c5 Oct 14 14:04:51.259952 master-2 kubenswrapper[4762]: I1014 14:04:51.259877 4762 generic.go:334] "Generic (PLEG): container finished" podID="d605067e-476b-448d-a650-597790282807" containerID="5d66d67ab9b7c8089a67a1ae6a9b6cb2bcb35ce1b294ecc3d3a708780ebd89f5" exitCode=1 Oct 14 14:04:51.261292 master-2 kubenswrapper[4762]: I1014 14:04:51.261248 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zqxwl/master-2-debug-4pjkb" event={"ID":"d605067e-476b-448d-a650-597790282807","Type":"ContainerDied","Data":"5d66d67ab9b7c8089a67a1ae6a9b6cb2bcb35ce1b294ecc3d3a708780ebd89f5"} Oct 14 14:04:51.261370 master-2 kubenswrapper[4762]: I1014 14:04:51.261300 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zqxwl/master-2-debug-4pjkb" event={"ID":"d605067e-476b-448d-a650-597790282807","Type":"ContainerStarted","Data":"ae9a1016396f36f989447c2f62c7af1f9bb9340b492ed5db6878c44788ad32c5"} Oct 14 14:04:52.359140 master-2 kubenswrapper[4762]: I1014 14:04:52.359070 4762 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zqxwl/master-2-debug-4pjkb" Oct 14 14:04:52.374290 master-2 kubenswrapper[4762]: I1014 14:04:52.374217 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-zqxwl/master-2-debug-4pjkb"] Oct 14 14:04:52.383699 master-2 kubenswrapper[4762]: I1014 14:04:52.383644 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-zqxwl/master-2-debug-4pjkb"] Oct 14 14:04:52.422704 master-2 kubenswrapper[4762]: I1014 14:04:52.422635 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_294de25b-260e-4ac0-89f0-a08608e56eca/alertmanager/0.log" Oct 14 14:04:52.447720 master-2 kubenswrapper[4762]: I1014 14:04:52.447655 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d605067e-476b-448d-a650-597790282807-host\") pod \"d605067e-476b-448d-a650-597790282807\" (UID: \"d605067e-476b-448d-a650-597790282807\") " Oct 14 14:04:52.448045 master-2 kubenswrapper[4762]: I1014 14:04:52.447882 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d605067e-476b-448d-a650-597790282807-host" (OuterVolumeSpecName: "host") pod "d605067e-476b-448d-a650-597790282807" (UID: "d605067e-476b-448d-a650-597790282807"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 14 14:04:52.448308 master-2 kubenswrapper[4762]: I1014 14:04:52.448271 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tp5gp\" (UniqueName: \"kubernetes.io/projected/d605067e-476b-448d-a650-597790282807-kube-api-access-tp5gp\") pod \"d605067e-476b-448d-a650-597790282807\" (UID: \"d605067e-476b-448d-a650-597790282807\") " Oct 14 14:04:52.449358 master-2 kubenswrapper[4762]: I1014 14:04:52.449317 4762 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d605067e-476b-448d-a650-597790282807-host\") on node \"master-2\" DevicePath \"\"" Oct 14 14:04:52.451094 master-2 kubenswrapper[4762]: I1014 14:04:52.451041 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d605067e-476b-448d-a650-597790282807-kube-api-access-tp5gp" (OuterVolumeSpecName: "kube-api-access-tp5gp") pod "d605067e-476b-448d-a650-597790282807" (UID: "d605067e-476b-448d-a650-597790282807"). InnerVolumeSpecName "kube-api-access-tp5gp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 14:04:52.551922 master-2 kubenswrapper[4762]: I1014 14:04:52.551757 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tp5gp\" (UniqueName: \"kubernetes.io/projected/d605067e-476b-448d-a650-597790282807-kube-api-access-tp5gp\") on node \"master-2\" DevicePath \"\"" Oct 14 14:04:52.752699 master-2 kubenswrapper[4762]: I1014 14:04:52.751195 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_294de25b-260e-4ac0-89f0-a08608e56eca/config-reloader/0.log" Oct 14 14:04:52.825870 master-2 kubenswrapper[4762]: I1014 14:04:52.825724 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_294de25b-260e-4ac0-89f0-a08608e56eca/kube-rbac-proxy-web/0.log" Oct 14 14:04:52.874283 master-2 kubenswrapper[4762]: I1014 14:04:52.874216 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_294de25b-260e-4ac0-89f0-a08608e56eca/kube-rbac-proxy/0.log" Oct 14 14:04:52.901925 master-2 kubenswrapper[4762]: I1014 14:04:52.901868 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_294de25b-260e-4ac0-89f0-a08608e56eca/kube-rbac-proxy-metric/0.log" Oct 14 14:04:52.937254 master-2 kubenswrapper[4762]: I1014 14:04:52.937193 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_294de25b-260e-4ac0-89f0-a08608e56eca/prom-label-proxy/0.log" Oct 14 14:04:52.967820 master-2 kubenswrapper[4762]: I1014 14:04:52.967779 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_294de25b-260e-4ac0-89f0-a08608e56eca/init-config-reloader/0.log" Oct 14 14:04:53.286182 master-2 kubenswrapper[4762]: I1014 14:04:53.286120 4762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae9a1016396f36f989447c2f62c7af1f9bb9340b492ed5db6878c44788ad32c5" Oct 14 14:04:53.286509 master-2 kubenswrapper[4762]: I1014 14:04:53.286225 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zqxwl/master-2-debug-4pjkb" Oct 14 14:04:53.306264 master-2 kubenswrapper[4762]: I1014 14:04:53.306212 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-57fbd47578-96mh2_02c9abb4-532d-4831-b69c-7445bfe51494/kube-state-metrics/0.log" Oct 14 14:04:53.340930 master-2 kubenswrapper[4762]: I1014 14:04:53.340880 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-57fbd47578-96mh2_02c9abb4-532d-4831-b69c-7445bfe51494/kube-rbac-proxy-main/0.log" Oct 14 14:04:53.369105 master-2 kubenswrapper[4762]: I1014 14:04:53.369054 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-57fbd47578-96mh2_02c9abb4-532d-4831-b69c-7445bfe51494/kube-rbac-proxy-self/0.log" Oct 14 14:04:53.398813 master-2 kubenswrapper[4762]: I1014 14:04:53.398756 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-76c4979bdc-gds6w_632a0df2-e17d-483d-8a41-914ac73e0782/metrics-server/0.log" Oct 14 14:04:53.456534 master-2 kubenswrapper[4762]: I1014 14:04:53.456472 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-75bcf9f5fd-5f2qh_269169cd-d1e1-47b2-926e-ef8c684424bb/monitoring-plugin/0.log" Oct 14 14:04:53.560672 master-2 kubenswrapper[4762]: I1014 14:04:53.560500 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d605067e-476b-448d-a650-597790282807" path="/var/lib/kubelet/pods/d605067e-476b-448d-a650-597790282807/volumes" Oct 14 14:04:53.608833 master-2 kubenswrapper[4762]: I1014 14:04:53.608766 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jc698_c7f87a74-3d2e-4e1e-a564-4957c50f5b20/node-exporter/0.log" Oct 14 14:04:53.651894 master-2 kubenswrapper[4762]: I1014 14:04:53.651842 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jc698_c7f87a74-3d2e-4e1e-a564-4957c50f5b20/kube-rbac-proxy/0.log" Oct 14 14:04:53.688438 master-2 kubenswrapper[4762]: I1014 14:04:53.688354 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jc698_c7f87a74-3d2e-4e1e-a564-4957c50f5b20/init-textfile/0.log" Oct 14 14:04:53.782831 master-2 kubenswrapper[4762]: I1014 14:04:53.782768 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-56d8dcb55c-h25c4_da6c08c6-be65-4110-a7e7-b7d5477ae716/kube-rbac-proxy-main/0.log" Oct 14 14:04:53.807923 master-2 kubenswrapper[4762]: I1014 14:04:53.807838 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-56d8dcb55c-h25c4_da6c08c6-be65-4110-a7e7-b7d5477ae716/kube-rbac-proxy-self/0.log" Oct 14 14:04:53.836531 master-2 kubenswrapper[4762]: I1014 14:04:53.836387 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-56d8dcb55c-h25c4_da6c08c6-be65-4110-a7e7-b7d5477ae716/openshift-state-metrics/0.log" Oct 14 14:04:53.879232 master-2 kubenswrapper[4762]: I1014 14:04:53.879144 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80/prometheus/0.log" Oct 14 14:04:53.971444 master-2 kubenswrapper[4762]: I1014 14:04:53.969260 4762 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80/config-reloader/0.log" Oct 14 14:04:53.996909 master-2 kubenswrapper[4762]: I1014 14:04:53.996846 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80/thanos-sidecar/0.log" Oct 14 14:04:54.112274 master-2 kubenswrapper[4762]: I1014 14:04:54.112117 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80/kube-rbac-proxy-web/0.log" Oct 14 14:04:54.159805 master-2 kubenswrapper[4762]: I1014 14:04:54.159755 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80/kube-rbac-proxy/0.log" Oct 14 14:04:54.303302 master-2 kubenswrapper[4762]: I1014 14:04:54.299225 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80/kube-rbac-proxy-thanos/0.log" Oct 14 14:04:54.403605 master-2 kubenswrapper[4762]: I1014 14:04:54.403458 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_9c6adb4c-bde4-41b5-a5e7-225aa1d7ef80/init-config-reloader/0.log" Oct 14 14:04:55.346366 master-2 kubenswrapper[4762]: I1014 14:04:55.346257 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hnrf8" Oct 14 14:04:55.346366 master-2 kubenswrapper[4762]: I1014 14:04:55.346359 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hnrf8" Oct 14 14:04:55.400676 master-2 kubenswrapper[4762]: I1014 14:04:55.400568 4762 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hnrf8" Oct 14 14:04:55.867729 master-2 kubenswrapper[4762]: I1014 14:04:55.867667 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-574d7f8db8-gbr5b_34749ef7-edb0-466b-a317-bb788dc5b851/prometheus-operator/0.log" Oct 14 14:04:55.885166 master-2 kubenswrapper[4762]: I1014 14:04:55.885108 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-574d7f8db8-gbr5b_34749ef7-edb0-466b-a317-bb788dc5b851/kube-rbac-proxy/0.log" Oct 14 14:04:55.947657 master-2 kubenswrapper[4762]: I1014 14:04:55.947596 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-79d5f95f5c-btmxj_abcbfb46-51e9-40ed-8f92-415d25d25b53/prometheus-operator-admission-webhook/0.log" Oct 14 14:04:55.982118 master-2 kubenswrapper[4762]: I1014 14:04:55.982069 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-56c4f9c4b6-s6gwn_97348241-e1d9-4dd5-bcaa-762088570022/telemeter-client/0.log" Oct 14 14:04:56.004327 master-2 kubenswrapper[4762]: I1014 14:04:56.004283 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-56c4f9c4b6-s6gwn_97348241-e1d9-4dd5-bcaa-762088570022/reload/0.log" Oct 14 14:04:56.028628 master-2 kubenswrapper[4762]: I1014 14:04:56.028563 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-56c4f9c4b6-s6gwn_97348241-e1d9-4dd5-bcaa-762088570022/kube-rbac-proxy/0.log" Oct 14 14:04:56.260526 master-2 kubenswrapper[4762]: I1014 
14:04:56.260467 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-cc99494f6-kmmxc_12625659-53e4-4ae4-837b-f5178cfe2681/thanos-query/0.log" Oct 14 14:04:56.313767 master-2 kubenswrapper[4762]: I1014 14:04:56.313707 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-cc99494f6-kmmxc_12625659-53e4-4ae4-837b-f5178cfe2681/kube-rbac-proxy-web/0.log" Oct 14 14:04:56.333661 master-2 kubenswrapper[4762]: I1014 14:04:56.333588 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-cc99494f6-kmmxc_12625659-53e4-4ae4-837b-f5178cfe2681/kube-rbac-proxy/0.log" Oct 14 14:04:56.364251 master-2 kubenswrapper[4762]: I1014 14:04:56.364066 4762 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hnrf8" Oct 14 14:04:56.366946 master-2 kubenswrapper[4762]: I1014 14:04:56.366090 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-cc99494f6-kmmxc_12625659-53e4-4ae4-837b-f5178cfe2681/prom-label-proxy/0.log" Oct 14 14:04:56.417861 master-2 kubenswrapper[4762]: I1014 14:04:56.417811 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-cc99494f6-kmmxc_12625659-53e4-4ae4-837b-f5178cfe2681/kube-rbac-proxy-rules/0.log" Oct 14 14:04:56.447228 master-2 kubenswrapper[4762]: I1014 14:04:56.447186 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-cc99494f6-kmmxc_12625659-53e4-4ae4-837b-f5178cfe2681/kube-rbac-proxy-metrics/0.log" Oct 14 14:04:56.462434 master-2 kubenswrapper[4762]: I1014 14:04:56.462366 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hnrf8"] Oct 14 14:04:58.337525 master-2 kubenswrapper[4762]: I1014 14:04:58.337474 4762 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hnrf8" podUID="6dd6b2c7-be85-4ba1-a907-6cfdabb77856" containerName="registry-server" containerID="cri-o://9ba7c49d5ea2858f8f270f6ad1ef96bab2da9e21efbe13a0743b185d446ed635" gracePeriod=2 Oct 14 14:04:58.648273 master-2 kubenswrapper[4762]: I1014 14:04:58.646630 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2pxml_3f514207-4fde-4312-bc50-75fda0edcdfe/controller/0.log" Oct 14 14:04:58.898520 master-2 kubenswrapper[4762]: I1014 14:04:58.898397 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hnrf8" Oct 14 14:04:58.990116 master-2 kubenswrapper[4762]: I1014 14:04:58.990073 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dd6b2c7-be85-4ba1-a907-6cfdabb77856-utilities\") pod \"6dd6b2c7-be85-4ba1-a907-6cfdabb77856\" (UID: \"6dd6b2c7-be85-4ba1-a907-6cfdabb77856\") " Oct 14 14:04:58.990461 master-2 kubenswrapper[4762]: I1014 14:04:58.990444 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dd6b2c7-be85-4ba1-a907-6cfdabb77856-catalog-content\") pod \"6dd6b2c7-be85-4ba1-a907-6cfdabb77856\" (UID: \"6dd6b2c7-be85-4ba1-a907-6cfdabb77856\") " Oct 14 14:04:58.990775 master-2 kubenswrapper[4762]: I1014 14:04:58.990759 4762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwv6j\" (UniqueName: \"kubernetes.io/projected/6dd6b2c7-be85-4ba1-a907-6cfdabb77856-kube-api-access-gwv6j\") pod \"6dd6b2c7-be85-4ba1-a907-6cfdabb77856\" (UID: \"6dd6b2c7-be85-4ba1-a907-6cfdabb77856\") " Oct 14 14:04:58.991328 master-2 kubenswrapper[4762]: I1014 14:04:58.990777 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6dd6b2c7-be85-4ba1-a907-6cfdabb77856-utilities" (OuterVolumeSpecName: "utilities") pod "6dd6b2c7-be85-4ba1-a907-6cfdabb77856" (UID: "6dd6b2c7-be85-4ba1-a907-6cfdabb77856"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 14:04:58.994424 master-2 kubenswrapper[4762]: I1014 14:04:58.994398 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dd6b2c7-be85-4ba1-a907-6cfdabb77856-kube-api-access-gwv6j" (OuterVolumeSpecName: "kube-api-access-gwv6j") pod "6dd6b2c7-be85-4ba1-a907-6cfdabb77856" (UID: "6dd6b2c7-be85-4ba1-a907-6cfdabb77856"). InnerVolumeSpecName "kube-api-access-gwv6j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 14 14:04:59.063412 master-2 kubenswrapper[4762]: I1014 14:04:59.063347 4762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6dd6b2c7-be85-4ba1-a907-6cfdabb77856-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6dd6b2c7-be85-4ba1-a907-6cfdabb77856" (UID: "6dd6b2c7-be85-4ba1-a907-6cfdabb77856"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 14 14:04:59.096222 master-2 kubenswrapper[4762]: I1014 14:04:59.096139 4762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwv6j\" (UniqueName: \"kubernetes.io/projected/6dd6b2c7-be85-4ba1-a907-6cfdabb77856-kube-api-access-gwv6j\") on node \"master-2\" DevicePath \"\"" Oct 14 14:04:59.096222 master-2 kubenswrapper[4762]: I1014 14:04:59.096219 4762 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dd6b2c7-be85-4ba1-a907-6cfdabb77856-utilities\") on node \"master-2\" DevicePath \"\"" Oct 14 14:04:59.096222 master-2 kubenswrapper[4762]: I1014 14:04:59.096235 4762 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dd6b2c7-be85-4ba1-a907-6cfdabb77856-catalog-content\") on node \"master-2\" DevicePath \"\"" Oct 14 14:04:59.391053 master-2 kubenswrapper[4762]: I1014 14:04:59.390975 4762 generic.go:334] "Generic (PLEG): container finished" podID="6dd6b2c7-be85-4ba1-a907-6cfdabb77856" containerID="9ba7c49d5ea2858f8f270f6ad1ef96bab2da9e21efbe13a0743b185d446ed635" exitCode=0 Oct 14 14:04:59.391882 master-2 kubenswrapper[4762]: I1014 14:04:59.391074 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hnrf8" event={"ID":"6dd6b2c7-be85-4ba1-a907-6cfdabb77856","Type":"ContainerDied","Data":"9ba7c49d5ea2858f8f270f6ad1ef96bab2da9e21efbe13a0743b185d446ed635"} Oct 14 14:04:59.391882 master-2 kubenswrapper[4762]: I1014 14:04:59.391119 4762 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hnrf8" event={"ID":"6dd6b2c7-be85-4ba1-a907-6cfdabb77856","Type":"ContainerDied","Data":"b335da92a74e3569da6163e411bf230a4e025830e0f7681fe458ec4e681c4677"} Oct 14 14:04:59.391882 master-2 kubenswrapper[4762]: I1014 14:04:59.391190 4762 scope.go:117] "RemoveContainer" containerID="9ba7c49d5ea2858f8f270f6ad1ef96bab2da9e21efbe13a0743b185d446ed635" Oct 14 14:04:59.391882 master-2 kubenswrapper[4762]: I1014 14:04:59.391485 4762 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hnrf8" Oct 14 14:04:59.420910 master-2 kubenswrapper[4762]: I1014 14:04:59.420858 4762 scope.go:117] "RemoveContainer" containerID="172321c0ac50bdceab2ec6c77595d5adc44834c2c1098b07c01ddb3b20cd6dbd" Oct 14 14:04:59.447818 master-2 kubenswrapper[4762]: I1014 14:04:59.447619 4762 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hnrf8"] Oct 14 14:04:59.450104 master-2 kubenswrapper[4762]: I1014 14:04:59.450041 4762 scope.go:117] "RemoveContainer" containerID="4ba24b879e6c64032f6475012d6fb2a3f2115353e8f5fd3ef4232a48deb40c65" Oct 14 14:04:59.456657 master-2 kubenswrapper[4762]: I1014 14:04:59.456586 4762 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hnrf8"] Oct 14 14:04:59.489795 master-2 kubenswrapper[4762]: I1014 14:04:59.489673 4762 scope.go:117] "RemoveContainer" containerID="9ba7c49d5ea2858f8f270f6ad1ef96bab2da9e21efbe13a0743b185d446ed635" Oct 14 14:04:59.491072 master-2 kubenswrapper[4762]: E1014 14:04:59.490977 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ba7c49d5ea2858f8f270f6ad1ef96bab2da9e21efbe13a0743b185d446ed635\": container with ID starting with 9ba7c49d5ea2858f8f270f6ad1ef96bab2da9e21efbe13a0743b185d446ed635 not found: ID does not exist" containerID="9ba7c49d5ea2858f8f270f6ad1ef96bab2da9e21efbe13a0743b185d446ed635" Oct 14 14:04:59.491072 master-2 kubenswrapper[4762]: I1014 14:04:59.491014 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ba7c49d5ea2858f8f270f6ad1ef96bab2da9e21efbe13a0743b185d446ed635"} err="failed to get container status \"9ba7c49d5ea2858f8f270f6ad1ef96bab2da9e21efbe13a0743b185d446ed635\": rpc error: code = NotFound desc = could not find container \"9ba7c49d5ea2858f8f270f6ad1ef96bab2da9e21efbe13a0743b185d446ed635\": container with ID starting with 9ba7c49d5ea2858f8f270f6ad1ef96bab2da9e21efbe13a0743b185d446ed635 not found: ID does not exist" Oct 14 14:04:59.491072 master-2 kubenswrapper[4762]: I1014 14:04:59.491038 4762 scope.go:117] "RemoveContainer" containerID="172321c0ac50bdceab2ec6c77595d5adc44834c2c1098b07c01ddb3b20cd6dbd" Oct 14 14:04:59.491422 master-2 kubenswrapper[4762]: E1014 14:04:59.491270 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"172321c0ac50bdceab2ec6c77595d5adc44834c2c1098b07c01ddb3b20cd6dbd\": container with ID starting with 172321c0ac50bdceab2ec6c77595d5adc44834c2c1098b07c01ddb3b20cd6dbd not found: ID does not exist" containerID="172321c0ac50bdceab2ec6c77595d5adc44834c2c1098b07c01ddb3b20cd6dbd" Oct 14 14:04:59.491422 master-2 kubenswrapper[4762]: I1014 14:04:59.491291 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"172321c0ac50bdceab2ec6c77595d5adc44834c2c1098b07c01ddb3b20cd6dbd"} err="failed to get container status \"172321c0ac50bdceab2ec6c77595d5adc44834c2c1098b07c01ddb3b20cd6dbd\": rpc error: code = NotFound desc = could not find container \"172321c0ac50bdceab2ec6c77595d5adc44834c2c1098b07c01ddb3b20cd6dbd\": container with ID starting with 172321c0ac50bdceab2ec6c77595d5adc44834c2c1098b07c01ddb3b20cd6dbd not found: ID does not exist" Oct 14 14:04:59.491422 master-2 kubenswrapper[4762]: I1014 14:04:59.491303 4762 scope.go:117] "RemoveContainer" 
containerID="4ba24b879e6c64032f6475012d6fb2a3f2115353e8f5fd3ef4232a48deb40c65" Oct 14 14:04:59.491732 master-2 kubenswrapper[4762]: E1014 14:04:59.491636 4762 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ba24b879e6c64032f6475012d6fb2a3f2115353e8f5fd3ef4232a48deb40c65\": container with ID starting with 4ba24b879e6c64032f6475012d6fb2a3f2115353e8f5fd3ef4232a48deb40c65 not found: ID does not exist" containerID="4ba24b879e6c64032f6475012d6fb2a3f2115353e8f5fd3ef4232a48deb40c65" Oct 14 14:04:59.491732 master-2 kubenswrapper[4762]: I1014 14:04:59.491668 4762 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ba24b879e6c64032f6475012d6fb2a3f2115353e8f5fd3ef4232a48deb40c65"} err="failed to get container status \"4ba24b879e6c64032f6475012d6fb2a3f2115353e8f5fd3ef4232a48deb40c65\": rpc error: code = NotFound desc = could not find container \"4ba24b879e6c64032f6475012d6fb2a3f2115353e8f5fd3ef4232a48deb40c65\": container with ID starting with 4ba24b879e6c64032f6475012d6fb2a3f2115353e8f5fd3ef4232a48deb40c65 not found: ID does not exist" Oct 14 14:04:59.562130 master-2 kubenswrapper[4762]: I1014 14:04:59.562066 4762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dd6b2c7-be85-4ba1-a907-6cfdabb77856" path="/var/lib/kubelet/pods/6dd6b2c7-be85-4ba1-a907-6cfdabb77856/volumes" Oct 14 14:04:59.621220 master-2 kubenswrapper[4762]: I1014 14:04:59.619292 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2pxml_3f514207-4fde-4312-bc50-75fda0edcdfe/frr/0.log" Oct 14 14:04:59.648289 master-2 kubenswrapper[4762]: I1014 14:04:59.647867 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2pxml_3f514207-4fde-4312-bc50-75fda0edcdfe/reloader/0.log" Oct 14 14:04:59.677196 master-2 kubenswrapper[4762]: I1014 14:04:59.676664 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2pxml_3f514207-4fde-4312-bc50-75fda0edcdfe/frr-metrics/0.log" Oct 14 14:04:59.708374 master-2 kubenswrapper[4762]: I1014 14:04:59.708087 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2pxml_3f514207-4fde-4312-bc50-75fda0edcdfe/kube-rbac-proxy/0.log" Oct 14 14:04:59.736195 master-2 kubenswrapper[4762]: I1014 14:04:59.736127 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2pxml_3f514207-4fde-4312-bc50-75fda0edcdfe/kube-rbac-proxy-frr/0.log" Oct 14 14:04:59.766006 master-2 kubenswrapper[4762]: I1014 14:04:59.765936 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2pxml_3f514207-4fde-4312-bc50-75fda0edcdfe/cp-frr-files/0.log" Oct 14 14:04:59.797037 master-2 kubenswrapper[4762]: I1014 14:04:59.796977 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2pxml_3f514207-4fde-4312-bc50-75fda0edcdfe/cp-reloader/0.log" Oct 14 14:04:59.825802 master-2 kubenswrapper[4762]: I1014 14:04:59.825749 4762 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2pxml_3f514207-4fde-4312-bc50-75fda0edcdfe/cp-metrics/0.log"